I’ve always felt that the pursuit of companionship is part of being human, and once we find it we feel whole. For many, pets fill this role: they need us, they are our confidants, they cheer us up in bad times and are there for our moments of pride. Pets are friends, extensions of family, bodyguards, house alarms and alarm clocks. They are also like perpetual children, always in need of us. We take them to the bathroom (or clean the bathroom for them), we are responsible for feeding them and for keeping them healthy. Some of us even clothe them. For those who don’t want those responsibilities, or can’t handle them, technology has brought us robotic pets.
In the last decade and change, digital pets have evolved well beyond the simple Tamagotchi. Now there are robotic pets with a “mind” of their own, like the now-discontinued Sony AIBO. Or, a bit more advanced, the Pleo, which can recognize the touch of a person “petting” it and react, and is designed to hear and heed commands. Of course, there’s also the Furby, but it’s hard to imagine it as a positive companion to anyone.
Despite these advances, sales of the AIBO and Pleo have been disappointing: the AIBO was axed as part of Sony’s push back to profitability, and Pleo’s maker went bankrupt, with the product sold to Jetta. Much of the reason is the expense of building the robots, which leaves a price tag that doesn’t look attractive to consumers who could opt for an actual pet. But could there be something more to it?
This study from the University of Washington looked at how humans respond to robotic dogs (using AIBO as the example in the study) compared to stuffed animals or live dogs. It found that when children are given a choice between a live animal and a robotic dog, they tend toward the live animal and view the robotic dog as mechanical; but when the robotic dog is their only choice (such as in places where dogs aren’t allowed, like hospitals), they tend to feel similar emotions toward the robotic dog as they would toward a live animal. That, I suppose, sounds uncontroversial. We often use what we can to fill emotional voids when it becomes necessary.
One of the interesting things the study did find is that the kids surveyed (younger and older alike) said it is “Not OK” to hit AIBO, out of concern for the robot’s psychological and physical welfare. That jibes with past accounts of human empathy for robots, like the famous example of Mark Tilden’s stick-insect robot:
At the Yuma Test Grounds in Arizona, the autonomous robot, 5 feet long and modeled on a stick-insect, strutted out for a live-fire test and worked beautifully, he says. Every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.
Finally it was down to one leg. Still, it pulled itself forward. Tilden was ecstatic. The machine was working splendidly.
The human in command of the exercise, however — an Army colonel — blew a fuse.
The colonel ordered the test stopped.
Why? asked Tilden. What’s wrong?
The colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg.
This test, he charged, was inhumane.
These feelings of empathy toward robots have enabled pet-bots to be used, with some success, as therapeutic robots for kids and the elderly. Where live pets are an endless source of love but come with the mess, robots offer the love without the mess, so it makes sense that they’ve helped both groups. As for other adults, the UW study had this to say:
The tendency to anthropomorphize artifacts is easily triggered (Nass & Moon, 2000; Reeves & Nass, 1998). While it remains unclear exactly what features of a robot maximize this tendency, Lee, Park, and Song (2005) found that adults who interacted with a version of AIBO with software such that the AIBO seemingly developed over time, and in response to human behaviors, perceived AIBO as more socially present, than did adults who interacted with a “fully developed” AIBO.
It went on to say that in future studies, the hesitancy of adults to perceive robots like AIBO as “socially present” may disappear as the robots become more autonomous and the software becomes smarter.
If we come to perceive these robots as something like living animals, that raises interesting ethical concerns as we tread down the path of stronger artificial intelligence. Since the definition of “strong A.I.” is a machine at least as smart as humans, the kind of A.I. we program into robotic pets will probably always fall short of that, if we can control it.
We often compare the intelligence of our pets against that of other animal species — many speak about how “smart” their dogs are, or how smart and cunning a cat is. Although we can say that a pet robot isn’t as smart as our cats and dogs, we aren’t far from a point where it may be — a point that will be reached much sooner than robots surpassing human intelligence.
With that in mind, we also have animal cruelty laws and pet advocacy groups. If we have machines that can “feel” and respond to human commands just as our canine and feline companions can, is that the point when we start considering safety laws for robotic pets as well?
I feel like I may be getting ahead of myself in trying to answer that question, but I can’t see any reason not to consider robotic pet rights along the lines of animal rights, if that’s the path we’re going down. On the other hand, humans and living animals have a long, complex and storied relationship that we don’t yet fully understand, and that has complicated efforts to clearly define standards of animal welfare. Taking that into account, it doesn’t seem right to try to define robotic pet welfare when we haven’t come close to settling the myriad welfare issues of living pets: shelters, anthropocentric control of animals, breed bans, cruelty laws, and so on.
Let us know what you think about this topic on Twitter @RobotCentral.