Who dusts the duster?
This week, we saw a very charming Norwegian Roomba posted to Reddit. It cleans the top of an entryway in a shopping mall. The area is immaculate except for the dusty Roomba, who cannot clean itself.
It got us thinking about one of our favorite subjects: anthropomorphization (the projection of human-like qualities onto non-human entities… anthropomo for short). In the new frontier of chatbots that easily pass the Turing Test, it's important to be aware of how much humanity we project onto the automata around us.
The patterns of personality that people project onto Roombas, for example, are widespread. The dusty Roomba in Norway inspires projections of loneliness, even despair, at the tireless yet unreciprocated labor provided by the solitary little bot. It takes so little to turn it into the protagonist of its own story! Roombas have no face, no conversational skills, no handle for interaction beyond a few buttons and directives. They exhibit none of the characteristics often listed in litanies against anthropomo: Roombas make no claim, as chatbots sometimes do, to having feelings, motivations, or memories of their own. There is no claim to sentience in the design whatsoever. Yet, with just a few objects, the viewer can spin whole worlds of loneliness, of futility, of despair.
When you have a pet, projection is a delicate balance as well. Is my cat sad that I’m not sharing my chicken? Or has it learned that mimicking human expressions of sadness increases its chance of getting food? Does it matter, and does it change my behavior? With a pet, no, it doesn’t really matter or change anything. Pets want food, water, shelter, affection, stimulation, and that’s about it. With a machine, the stakes are different. Humans are behind that machine, most likely trying to monetize your behavior.
Our favorite book on this subject is Kate Darling's 2021 The New Breed. Darling draws analogies between our relationships with machines and our relationships with animals, in the tradition of Donna Haraway's theory of the cyborg, after which our company was named. The book urges caution about allowing manipulative machines into our lives:
“As with our fears of robots disrupting labor, these are not so much issues with the technology itself as they are about a society that is more focused on corporate gain than human flourishing. As we add social robots to the tool kits of our therapists and teachers, we need to understand that we’re adding them to other tool kits as well. And emotional coercion is not the only concern.” (Darling, p. 118)
So, in general, it's good to follow the money behind the technology we use. Make sure you understand the ways in which you use the product, and the ways in which you are the product. Then you can allow yourself the small joys that our innate human social bonding mechanisms provide.