Why we care for “technologies that talk” by Birgitte Aga

“Are you real?”
“Well if you can’t tell does it matter?”
(Westworld, 2016)

This year, 2020, an estimated four billion digital voice assistants (‘technologies that talk’ such as Google Home, Alexa and Siri) will be in use. The design of these systems is driven by the commercial pursuit of humanising technology for efficient integration and economic gain. By capitalising on innovations in Machine Learning (ML) algorithms, oceans of data and a deep understanding of human nature, these systems are optimised to fluidly integrate into our social worlds. Through their offer of voice- or text-based interaction, these systems intrigue us with their human-like familiarity. Talking is in itself a fundamental part of being human, and being able to use natural language to communicate with machines is a profound shift.

With the availability of technologies that talk, the computer as a tool morphs into an ‘other’ that we chat to and start to relate to. Easy to use and ever present, these voice-enabled devices infiltrate society and fluidly integrate into our lives. We communicate with ‘them’, we relate to ‘them’ and we come to care for ‘them’. With each exchange, we feel more familiar. We want to believe in their offer of care and understanding, and in exchange we start to relate to them as we naturally do with others in our life (people and animals).

Thinking about the impact of technologies that talk

Through my research as a SWCTN Automation Fellow and my wider work as a technology designer, producer and artist, I have been exploring the impact of these technologies that talk (both voice- and text-enabled interfaces) on us and on society as a whole. These systems bring great potential for democratising access to digital knowledge and services (through voice interfaces). However, they are also embedded with significant race and gender biases, and their development process pervasively lacks diversity and ethical guidelines. This is compounded by their limited transparency and ubiquitous authority. They give technology giants and market leaders, such as Google, Amazon and Apple, enormous gatekeeper power to select which information is presented. As history has demonstrated, the consolidation of control over information is rarely good for democracy.

Relating to technologies that talk

We are naturally inclined to humanise things unless we are convinced otherwise, a tendency exploited by corporations that design technologies with human-like features (face, body, gestures) and conversational abilities (voice and text) to better integrate their systems into our lives. The relational ability of these systems is amplified through the choice of technology platform (such as using voice or text) and the system’s embodied representation (for example, a humanoid robot or a virtual avatar).

Technologies that talk are becoming ever better, not only at making us feel understood on a semantic level, but also on an emotional level. They display behaviours that make us perceive and feel that we are understood, that the systems care. This act, according to MIT professor and New York Times best-selling author Sherry Turkle, makes them ‘relational things’. She explains that the ability of these artefacts to make us relate to them is not based on their intelligence or consciousness, but rather on ‘their ability to push our Darwinian buttons, by making eye contact, for example, which causes people to respond as if they were in a relationship’.

Falling in love with technologies that talk

Due to the complexity of these technologies, their rate of innovation, current patent rights, lack of regulation and the value of data mining, their influence is hidden and difficult for a user to decipher. This confluence of factors contributes to a conversational system’s ability to sway, inform and persuade a user to act or think in certain ways. With the integration of these technologies that talk, we are facing a paradigm shift in human-computer interaction.

We are now conversing with computers that simulate human-like qualities and which therefore trigger us to apply our understanding of social relationships to these interactions. In so doing, we are not only nurturing a relationship with our conversational technologies, but also entering into intimate relationships with their manufacturers, such as Google, Apple, Amazon and Microsoft. However, perhaps we already have a long-term relationship with these corporations, as uninformed participants in their technological pursuit of data-driven efficiency?

The ethical and moral challenges of becoming intimate with technologies that talk

As manufacturers of technologies that talk pursue systems capable of simulating human-like qualities, the ethical and moral challenges of applying these systems deepen. The design and distribution of these systems are veiled behind commercial patents and algorithmic complexities. This is a process the end consumers are largely left out of, rendered voiceless and powerless in the design of systems set to pervasively influence their lives.

With the ubiquitous integration of these systems across society, there is an urgency to move beyond efficient user experience design and system integration, towards a holistic understanding of the wider impact of these technologies on the thoughts, behaviour and actions of their users and society as a whole. It is vital to make these potential influences known to users, so that they are empowered to demand a say in how these systems may affect their lives. In so doing, we can activate people to challenge commercial design strategies and to re-imagine more desirable future realities of being with things that talk.

“Hey Google, do you love me?”

“Sorry I don’t understand…”.

Dr Birgitte Aga is an artist, technologist and producer who uses conversational AI (technologies that talk) as a medium for protest, challenging the pervasive lack of diversity and user participation within its development. She has designed new conversational AI artworks for Tate Modern and Ars Electronica, and continues to challenge the status quo with her collaborator and SWCTN Fellow Coral Manton through ‘Women Reclaiming AI’, a feminist AI voice assistant programmed through workshops by a growing community of self-identifying women. Birgitte is also a South West Creative Technology Fellow (UK) for automation/AI, part of the i-DAT Collective and in receipt of funding from the ACE Artists International Development fund (UK).