When I found this random funny post it occurred to me that anthropomorphism has taken on a whole new dimension in the last few years.
It started many years ago, before many of my readers were born, when telephone answering machines became popular. The machines had little tape decks in them, and callers could record messages. I remember people telling me they didn’t leave me messages because they “didn’t want to talk to a machine.” My response was always: “It’s not a machine, it’s me listening to your voice hours later. You’re not talking to anyone.”
But when humans talk, and when machines talk back, something happens in the human brain that anthropomorphizes the machine. We have all had the experience of “talking” to a machine voice when we call our credit card company. We get frustrated when it doesn’t understand that we don’t want one of its standard options.
Note how I used the pronoun it and not she, even though the voice is almost always female and seemingly always the same one.
The most famous talking machine is Siri, Apple’s trademark voice on its phones. Siri undeniably does more than talk. The image above is testimony to that. When we talk about Siri, invariably we talk about what she said, not what it – the machine – said. “Maybe you should ask Cortana for the movie times” is not something a machine would come up with, or so our brains reason, and we think of Siri as a person sitting somewhere just waiting for us to ask her questions through our smartphones.
I use Google Maps for directions, and I use its voice feature. All is well when it tells me to turn left or right and leads me to my destination. However, if I decide to turn into the local supermarket to get a bottle of water and a candy bar along the way, it freaks out. It wants me to make a U-turn as soon as possible. It suddenly directs me around the block on side roads so I can get back to the main road where I should be. The chatter becomes annoying. I wish it had a “snooze” button that I could tap indicating, “yes, I know I am off course, but I just need to do this little thing before we can be on our way again.” I am sure somebody is working on that feature. But the overriding “feeling” I have when this happens is guilt. I feel like I am failing and the Google Maps program is frustrated with me that I am not getting it.
I have also felt bad for Siri when I have asked it questions repeatedly. Say I am looking for a Starbucks and it gives me a list of destinations, and I inadvertently pick the wrong one in the list and I can’t get back to the original list. Rather than navigating back, it’s easier to just invoke Siri afresh and start over again. After doing that three or four times I have found myself feeling awkward. What must it be thinking? That I am an idiot?
I have also noticed that I have a propensity to treat Siri with respect. I have said “please” and “thank you” for its favors. I don’t like to ask the same question more than once, and I don’t want to ask questions that it might think are stupid.
The borders between machines and humans are blurring.
What do you think, R2D2?

I use Siri on my iPhone quite frequently, to send text messages, to look up the nearest Starbucks or diner when I am on the road, or to get driving directions to where I am going. Sometimes my questions are not formed precisely enough, and I end up asking for the same directions several times over. When that happens, I often catch myself feeling silly about asking for the same directions AGAIN, and I have this urge to start out saying “I am sorry to bother you again, but can you give that to me once more?” Then I remind myself that this is software I am talking to, and it’s infinitely patient. It might register that my thinking is imprecise, or that my short-term memory sucks, it might even store that somewhere, but it can’t be insulted or bored by my repetitive and definitely banal requests. Siri might have a pleasant and polite voice, but it is not a person. It’s not even a thing, like a machine. It’s just software. It has no feelings. Or does it?