We’ve talked a lot about digital assistants in Data Literacy in the Real World: Conversations and Case Studies and at our 2017 and 2018 conferences. Devices like Amazon’s Alexa are cool and accessibly priced, but there’s so much more to unpack behind those high-tech cylinders.
Do you want to bare it all to a digital assistant? What happens when we outsource our emotional soothing to a machine? How are today’s devices being coded to reflect the culture of their users? These are some of the questions raised in “The Quantified Heart,” an essay on Aeon by Polina Aronson and Judith Duportail. Here are a few excerpts to whet your appetite for the entire essay:
[A]n increasing number of people are directing such affective statements, good and bad, to their digital helpmeets. According to Amazon, half of the conversations with the company’s smart-home device Alexa are of non-utilitarian nature – groans about life, jokes, existential questions. ‘People talk to Siri about all kinds of things, including when they’re having a stressful day or have something serious on their mind,’ an Apple job ad declared in late 2017, when the company was recruiting an engineer to help make its virtual assistant more emotionally attuned. ‘They turn to Siri in emergencies or when they want guidance on living a healthier life.’
Some people might be more comfortable disclosing their innermost feelings to an AI. A study conducted by the Institute for Creative Technologies in Los Angeles in 2014 suggests that people display their sadness more intensely, and are less scared about self-disclosure, when they believe they’re interacting with a virtual person, instead of a real one. As when we write a diary, screens can serve as a kind of shield from outside judgment.
Soon enough, we might not even need to confide our secrets to our phones. Several universities and companies are exploring how mental illness and mood swings could be diagnosed just by analysing the tone or speed of your voice … By 2022, it’s possible that ‘your personal device will know more about your emotional state than your own family,’ said Annette Zimmermann, research vice-president at the consulting company Gartner, in a company blog post …
[N]either Siri, nor Alexa, nor Google Assistant, nor Russian Alisa, are detached higher minds, untainted by human pettiness. Instead, they’re somewhat grotesque but still recognisable embodiments of certain emotional regimes – rules that regulate the ways in which we conceive of and express our feelings.
These norms of emotional self-governance vary from one society to the next … Google Assistant, developed in Mountain View, California, looks like nothing so much as a patchouli-smelling, flip-flop-wearing, talking-circle groupie. It’s a product of what the sociologist Eva Illouz calls emotional capitalism – a regime that considers feelings to be rationally manageable and subdued to the logic of marketed self-interest. Relationships are things into which we must ‘invest’; partnerships involve a ‘trade-off’ of emotional ‘needs’; and the primacy of individual happiness, a kind of affective profit, is key. Sure, Google Assistant will give you a hug, but only because its creators believe that hugging is a productive way to eliminate the ‘negativity’ preventing you from being the best version of yourself …
By contrast, Alisa [a Russian-language assistant] is a dispenser of hard truths and tough love; she encapsulates the Russian ideal: a woman who is capable of halting a galloping horse and entering a burning hut (to cite the 19th-century poet Nikolai Nekrasov). Alisa is a product of emotional socialism, a regime that, according to the sociologist Julia Lerner, accepts suffering as unavoidable, and thus better taken with a clenched jaw rather than with a soft embrace. Anchored in the 19th-century Russian literary tradition, emotional socialism doesn’t rate individual happiness terribly highly, but prizes one’s ability to live with atrocity.
Alisa’s developers understood the need to make her character fit for purpose, culturally speaking. ‘Alisa couldn’t be too sweet, too nice,’ Ilya Subbotin, the Alisa product manager at Yandex, told us. ‘We live in a country where people tick differently than in the West. They will rather appreciate a bit of irony, a bit of dark humour, nothing offensive of course, but also not too sweet’ …
Every answer from a conversational agent is a sign that algorithms are becoming a tool of soft power, a method for inculcating particular cultural values. Gadgets and algorithms give a robotic materiality to what the ancient Greeks called doxa: ‘the common opinion, commonsense repeated over and over, a Medusa that petrifies anyone who watches it,’ as the cultural theorist Roland Barthes defined the term in 1975. Unless users attend to the politics of AI, the emotional regimes that shape our lives risk ossifying into unquestioned doxa …
So what could go wrong? Despite their upsides, emotional-management devices exacerbate emotional capitalism … These apps promote the ideal of the ‘managed heart’, to use an expression from the American sociologist Arlie Russell Hochschild …
Instead of questioning the system of values that sets the bar so high, individuals become increasingly responsible for their own inability to feel better. Just as Amazon’s new virtual stylist, the ‘Echo Look’, rates the outfit you’re wearing, technology has become both the problem and the solution. It acts as both carrot and stick, creating enough self-doubt and stress to make you dislike yourself, while offering you the option of buying your way out of unpleasantness …
[I]t’s worth reflecting on what could happen once we offload these skills on to our gadgets.