Furbies? Chortle. Can they really be the subject of any serious discussion? Yet Turkle’s discussion of the upside-down Furby experiment is intriguing. This “new ethical terrain” where “you can feel bad about yourself for how you behave with a computer program” makes me wonder: how relational and biological do machines need to be before we start facing ethical conundrums (46)?
When Turkle talks about the ethical conundrums of resetting a “deceased” Tamagotchi (33) or a broken Furby (41-44), I can’t help but think of another virtual critter, Mr. Resetti.
This rather angry mole from the Animal Crossing games scolds you for resetting your game. Encountering Mr. Resetti made me feel a bit ashamed about resetting the lives of the game’s characters.
But there was a way around this. Just reset the console’s internal clock! More control, less guilt-tripping moles. Nothing to worry about, right? It was only a game (machine).
Resetting the clock was like starting anew, but with the haunting memory that I’d trodden this path before. Like a reset Furby, I was dealing with something “between categories: a creature that seems new but is not really new” (Turkle 33). Sometimes, I still felt a bit guilty.
Was this my upside-down Furby? The game was definitely producing an “ethical response” in me (45). Was I simply projecting onto the game like a child with a ragdoll (40)? Or was the game demonstrating it was “alive enough” (26)? That I was responding without anything saying the equivalent of Furby’s “I’m scared,” and without the human shape of My Real Baby, was surely significant (47-48)?
Whatever the answers, I can’t help but think that Turkle is too dismissive of human-machine sociality. More questions: Aren’t there some very life-affirming things about being able to identify with a machine? Could the “machine moment” date back further than the Tamagotchi? Have we already accepted things like Jibo into our lives?