There are at least two sides to this question, so let’s start with the first: Who is more likely to get attached to a robot?
1. Anthropomorphism
First and foremost, attachment to an object is usually caused by anthropomorphizing it. We ascribe human emotion to the object, and then reciprocate with feelings of our own. While this tendency applies to objects in general, it is just as relevant in the robot realm.
For the purposes of our discussion, this factor on its own only goes so far. Still, it helps explain why autistic people (especially those with Asperger’s syndrome) are particularly likely to form such attachments. Autistic people often find it difficult to connect with others because they do not express emotions in the same way. That doesn’t mean they are devoid of emotion; they can become extremely attached to specific people, and even to objects. They understand how social conventions work, but often cannot apply them in social settings. With an object, however, they are free to impose whatever characteristics they choose, and can readily react to these imagined personalities (Marty). This leads us to the second factor.
2. Loneliness
“When someone suffers from ‘social deficits’ (i.e., loneliness), he’s more likely to grow attached to possessions” (Tuttle). When we’re deprived of human companionship, we attribute human characteristics to objects, and then associate with those objects instead of actual people. The time and money invested in caring for a robot can also eventually transfer into emotional investment.
3. A Desire to be in Control
“Because material objects are not sentient beings, without consciousness and free will, such objects offer consumers relatively predictable and controllable—albeit one-sided—relationships. Therefore, those exhibiting this trait seem more susceptible to possession love” (Tuttle). Robots are usually designed to assist people, so we may think of them as subservient. As a reward for their hard work, we’ll be nice to them.
4. We Understand Humans Better Than Anything Else
Because we are only human, it’s naturally difficult to imagine how other beings might feel, especially objects. “When dealing with something unpredictable — a computer on the fritz, a sour economy — we might feel totally disconnected. ‘One way to make sense of it is to treat it like something familiar, which is the human form'” (Pappas). What sort of emotions or intelligence might a robot have? We don’t know, so we assume it would have human emotions and intelligence, if anything. And when robots behave in ways we don’t understand, we attempt to explain the behavior from a human perspective (Epley, Waytz, and Cacioppo 866).
The second side to this question is: What kind of robots are we more likely to get attached to?
1. Anthropomorphic Robots
When we don’t know how to describe the behavior of a robot, we resort to human terms to define it. But what if the robot already looks very much like a human? Or acts like a human? Or sounds like one? All the better for us, isn’t it? The more a robot is like a human, the more we treat it like a human (and the more we expect from it, too) (Nauert).
2. Robots That Seem to Have Agency
What about robots lower on the anthropomorphic scale? We’ve got love for them too! There’s a well-known study about the Roomba, the robot vacuum that moves around on its own.
A Georgia Tech researcher has found that many Roomba owners name, dress up and genuinely worry about their Roombas, as if they were living pets…Others in the study were found to rearrange their homes to be more accommodating to the robots, while others pre-clean their homes before putting the machine to work and buy new rugs that don’t tie up the Roomba on its programmed march around the house. One subject even introduced his Roomba to his parents. —Dillow
While the intensity of attachment varied from person to person, two-thirds of the participants at least named their Roomba and assigned it a gender.
Another study found that soldiers also became attached to the robots they worked with on the training field, even expressing sadness (and sometimes holding funerals) when a robot was destroyed. Researchers worry that this attachment might affect soldiers’ judgment on the battlefield: they might hesitate to send their robot on a fatal mission (Armstrong).
The interesting part of both these cases is that people expressed the most emotion when their robots were hindered or unable to function. For robots that appear to have agency, it is when they stop that we care most. The Roomba study suggested that “their reliability as robots is less important than their presence and companionship” (Dillow). (Even with the phones, laptops, and tablets we already own, we only really worry about them when they break down.) Perhaps this reaction comes from an ancient instinct to help others in the community in order to survive, because we need their presence.
Technicity and Robotics
Last week, Professor Cartwright suggested that “To understand being in the world, we should look to technology in its operative functioning with humans-nature.” What can we learn from all these characteristics? Essentially, humans have a need for social interaction, even those (especially those, in fact) who seem to be the most secluded. And when we can’t participate in human interaction, we project human characteristics onto objects in order to get the interaction we crave. The current trend is to make robots ever more anthropomorphic, which suggests that we will come to see technology less as a medium and more as an entity in its own right, a companion.