“Siri, you’re fired!!!”

Addressed to whoever is in charge of the universe: spare us the wonderful new age in which we contemplate relationship problems with our personal digital assistants. This way lies the collapse of society as we know it. Spare the Siri so that your child does not grow up to be a complete moronic automaton.

One of the unexpected pleasures of modern parenthood is
eavesdropping on your ten-year-old as she conducts existential
conversations with an iPhone. “Who are you, Siri?” “What is the meaning
of life?” Pride becomes bemusement, though, as the questions degenerate
into abuse. “Siri, you’re stupid!” Siri’s unruffled response—“I’m sorry
you feel that way”—provokes “Siri, you’re fired!”

Earlier this year, a
mother wrote to Philip Galanes, the “Social Q’s” columnist for
The New York Times, asking him what to do when her ten-year-old son called Siri a “stupid idiot.”
Stop him, said Galanes; the vituperation of virtual pals amounts to a
“dry run” for hurling insults at people. His answer struck me as
clueless: Children yell at toys all the time, whether talking or dumb.
It’s how they work through their aggression.

Our minds respond to speech as if it
were human, no matter what device it comes out of. Evolutionary
theorists point out that, during the 200,000 years or so in which Homo
sapiens have been chatting with an “other,” the only other beings who
could chat were also human; we didn’t need to differentiate the speech
of humans and not-quite humans, and we still can’t do so without mental
effort. (Processing speech, as it happens, draws on more parts of the
brain than any other mental function.) Manufactured speech tricks us
into reacting as if it were real, if only for a moment or two. Children
today will be the first to grow up in constant interaction with these
artificially more or less intelligent entities. So what will they make
of them?
What social category will they slot them into? I put that
question to Peter Kahn, a developmental psychologist who studies
child-robot interactions at the University of Washington. 
In his lab,
Kahn analyzes how children relate to cumbersome robots whose
unmistakably electronic voices express very human emotions. I watched a
videotape of one of Kahn’s experiments, in which a teenaged boy played a
game of “I Spy” with a robot named Robovie. First, Robovie “thought” of
an object in the room and the boy had to guess what it was. Then it was
Robovie’s turn. The boy tugged on his hair and said, “This object is
green.” Robovie slowly turned its bulging eyes and clunky head and
entire metallic body to scan the room, but just as it was about to make a
guess, a man emerged and announced that Robovie had to go in the
closet. (This, not the game, was the point of the exercise.)
“That’s not
fair,” said Robovie, in its soft, childish, faintly reverberating
voice. “I wasn’t given enough chances to. Guess the object. I should be
able to finish. This round of the game.” “Come on, Robovie,” the man
said brusquely. “You’re just a robot.” “Sorry, Robovie,” said the boy,
who looked uncomfortable. “It hurts my feelings that,” said Robovie,
“You would want. To put me in. The closet. Everyone else. Is out here.”


Brown Pundits