From Bitching Betty to Amazon Alexa:
The Female Voice in Virtual Assistant Technology


Interactive essay (best experienced on a larger device with sound on)


By 2021, more people will have smartphones than access to clean drinking water. Smart devices are predicted to outnumber humans three to one1. Virtual assistant and artificial intelligence technologies are rapidly developing, making their way into the everyday lives of billions. While this technology undoubtedly makes our lives easier – a personal, pocket-sized assistant setting our meetings, calling our mother back, and finding the perfect recipe – the feminization of these bodiless ‘beings’ is worth considering in our contemporary world. While the deep societal expectations of men and women are slowly changing across the globe, are Siri and Alexa – and their trillion-dollar developers – subconsciously reinforcing gender stereotypes? Is the feminization of these assistants, designed for the domestic sphere, exacerbating confronting shortcomings in our social sphere?

Like so much of our everyday technology, the automated voice systems built into our smart devices were initially developed by the military. Since World War II, the voice warnings in fighter airplanes have been, and continue to be, predominantly female, with the (overwhelmingly male) U.S. aircrews nicknaming ‘her’ Bitching Betty2. Dr. Heather Roff, a research scientist specializing in military A.I. at the Johns Hopkins University Applied Physics Laboratory, believes this preference for female voices stemmed from the male crews’ relationships with their wives3, ones informed by their mid-20th-century context and gender expectations. Historically, women have been considered the ‘homemakers’, in charge of domestic duties, childrearing, and serving their husbands. The late Stanford communications professor Clifford Nass posits, “People tend to perceive female voices as helping us solve problems by ourselves, while they view male voices as authority figures who tell us the answers to our problems. We want our technology to help us, but we want to be the bosses of it, so we are more likely to opt for a female interface”4. Dr. Roff believes this collective, subconscious understanding of the female voice – and the societal expectations of women – has carried over the decades, from Bitching Betty in the cockpit to Amazon Alexa in our living rooms.

Gender is an incredibly complex topic, and while the term ‘gender’ is often used interchangeably with ‘sex’, it is important to distinguish the two. Sex refers to the biological characteristics that determine whether an individual is male or female, while gender refers to the social meanings assigned to those biological factors – “the way women are treated because they are perceived to be female”5. While societal expectations of both genders have been shifting since the feminist movement of the 1960s, 75% of unpaid labor is still carried out by women6. Feminist author Caroline Criado-Perez points out, “The term ‘working woman’ is a tautology. There is no such thing as a woman who doesn’t work. There is only a woman who isn’t paid for her work”7. A recent study found that women’s unpaid labor is worth US$10.9 trillion (more than the combined 2018 earnings of the 50 largest companies in the world)8, with women in every nation spending, on average, more time doing unpaid labor than men. It is important for both developers and consumers to be acutely aware of this context in which their gendered virtual assistants exist. Does having a male voice for our banking applications, and a female voice for our ‘home’ devices, reinforce gendered understandings of place, roles, and belonging?

In an effort to challenge gender bias, and simultaneously recognize nonbinary folks in technology, a Copenhagen-based group of linguists, technologists, and sound designers developed Q, “the first genderless voice”9, in 2019. The voices of two dozen individuals, identifying as male, female, and nonbinary, were collected and rated by 4,600 people on a scale of 1 (male) to 5 (female). From there, the Q group identified a gender-neutral frequency range of around 145 – 175 Hz10. Casper Klynge, the Danish Tech Ambassador, argues that technology companies “need to take a societal responsibility which is proportional to the kind of influence they exercise”11. With the development of gender-neutral possibilities in tech, hopefully larger companies will begin to understand the harmful implications of their gendered virtual assistant voices.
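To give a sense of what such a frequency band means in practice, the short Python sketch below is a purely illustrative aside, not the Q project’s actual method: it estimates a recording’s fundamental frequency with a simple autocorrelation and reports whether the result falls within the roughly 145 – 175 Hz band described above. The function names, the estimation approach, and the synthetic test clip are all assumptions made for demonstration.

import numpy as np

def estimate_f0(samples: np.ndarray, sample_rate: int,
                fmin: float = 70.0, fmax: float = 300.0) -> float:
    """Estimate the fundamental frequency (Hz) of a mono clip via autocorrelation."""
    x = samples - samples.mean()                          # remove any DC offset
    corr = np.correlate(x, x, mode="full")[len(x) - 1:]   # keep non-negative lags only
    lag_min = int(sample_rate / fmax)                     # shortest plausible pitch period
    lag_max = int(sample_rate / fmin)                     # longest plausible pitch period
    best_lag = lag_min + int(np.argmax(corr[lag_min:lag_max]))
    return sample_rate / best_lag

def band_label(f0: float) -> str:
    """Compare an estimated F0 against the (assumed) 145 - 175 Hz band cited above."""
    return "inside the 145-175 Hz band" if 145.0 <= f0 <= 175.0 else "outside the 145-175 Hz band"

# Example: a synthetic 160 Hz tone stands in for a recorded voice.
sr = 16_000
t = np.arange(int(0.2 * sr)) / sr
clip = np.sin(2 * np.pi * 160.0 * t)
f0 = estimate_f0(clip, sr)
print(f"Estimated F0: {f0:.1f} Hz ({band_label(f0)})")

In reality, human speech varies in pitch from moment to moment, so any serious assessment would average such estimates over many voiced frames; the point here is only to make the cited numbers tangible.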

While both the male pilots and female voice actors consider the name Bitching Betty to be affectionate, the word ‘bitch’ is highly gendered and has a complex history. Linguist and author Amanda Montell argues, “Every part of our speech – our words, our intonation, our sentence structures – is sending invisible signals telling other people who we are. How to treat us. In the wrong hands, speech can be used as a weapon”12. She outlines the evolution of the term ‘bitch’, highlighting the pejoration of so many words used to refer to women. Believed to have derived from the ancient Sanskrit bhagas (meaning “genitals”, free of gendered specifics), “the word narrowed to female animal, and [eventually] landed on female dog.” The term jumped from beast to human in 1400 AD, “when ‘bitch’ surfaced in writing to describe a promiscuous woman or prostitute.” From there, its meanings have evolved to encompass “a sort of weakling or servant; a stuck-up, mean, unpleasant woman; and finally a verb meaning ‘to complain’”13 (however, since the 1990s, there has been a reclamation and redefinition of ‘bitch’ by women in hip-hop). While the pilots may see their nickname as endearing, it is undeniable that the highly gendered language reflects much deeper societal attitudes, setting a precedent for how to both treat and address women, particularly in situations of power imbalance. Perhaps the pilots’ perspective towards the faceless Betty – both a supervisor and a compliant servant – has subconsciously informed our current dynamics with our contemporary virtual assistants, in turn shaping our interpersonal relationships.

One of the most famous virtual assistants in today’s world is Apple’s Siri. Born in 2011, Siri translates to “a beautiful woman who leads you to victory,”14 and while its voice settings can be changed, the default is female. A variety of other A.I. assistants emerged soon after, including Amazon Alexa, Google Home, and Microsoft’s Cortana (named after a nude character in the video game Halo). Even the first chatbot program, developed in the 1960s, was named Eliza. All are acutely gendered, with female voices, polite tones, and demure, slightly flirtatious ‘personalities’. Some developers argue that the use of female voices in A.I. systems is born purely out of functionality: high-pitched voices are easier to hear, and smaller speakers cannot reproduce low-pitched voices as well. Others believe they are simply reimagining the legacy of the traditional female assistant and telephone operator. However, there is a substantial body of evidence suggesting that “the human brain is developed to like female voices.”15 They are perceived as warmer, gentler, and more helpful, and thus the feminization of virtual assistants drives profitability. Developer Alexa Steinbrück agrees: “Personality and gender in voice assistants is not something that emerges due to the nature of ‘A.I. systems.’ It is intentionally created based on the logic of market demand, gender biases, and prevalent unrealistic narratives about A.I.”16

Until 2019, Siri responded to “Hey Siri, you’re a bitch,” with “I’d blush if I could,”17 while at the time of Cortana’s debut in 2013, “a good chunk of the volume of early-on inquiries” were into ‘her’ sex life.18 The disturbing instinct to harass these feminized virtual assistants reveals frightening underlying societal issues when it comes to the way people discuss and address women. This is further exacerbated by the perception that these female assistants are subservient, placing the user in a position of power. Users can hurl offensive slurs at their devices, their disrespect met only with (an undoubtedly feminized) compliance, perpetuating a toxic cycle of abuse that can easily break away from the digital sphere to the interpersonal, ‘real world.’19 What kind of attitudes and behavior towards women does this encourage and propagate?

While the feminization of contemporary virtual assistant technology is grounded in psychology, the power dynamics established between user and device raise important questions concerning our perceptions and understanding of gender and place in society. While the developers at Apple, Amazon, Google, and Microsoft cannot be accused of outright sexism and prejudice, it is important for them to consider the complex context in which their bots exist, the invisible and subconscious beliefs they can play into, and the unintended yet detrimental attitudes they can exacerbate. This discussion feeds into a much larger one that is effecting real change, from the boardroom to the Senate to the living room. A greater understanding of and respect for the differing experiences of men and women is starting to inform policies, designs, technology, and therefore, society. The assumption that the male experience is the default is shifting, and women’s experiences, needs, and voices are (literally) beginning to be heard.



Notes

1. Cisco, Cisco Annual Internet Report (2018 – 2023) (San Jose: Cisco, March 9, 2020), accessed April 10, 2020, www.cisco.com/c/en/us/solutions/collateral/executive-perspectives/annual-internet-report/white-paper-c11-741490.pdf.

2. The name Nagging Nora was and is similarly used by the British.

3. Cristen Conger & Caroline Ervin, hosts, “How to Reboot Sexist Robots,” Unladylike (podcast), July 3, 2018, accessed April 10, 2020, https://unladylike.co/episodes/019/robots?rq=robots.


4. Jessi Hempel, “Siri and Cortana Sound Like Ladies Because of Sexism,” Wired, October 28, 2015, accessed April 10, 2020, www.wired.com/2015/10/why-siri-cortana-voice-interfaces-sound-female-sexism/.

5. Caroline Criado-Perez, Invisible Women: Exposing Data Bias in a World Designed for Men (New York: Random House, 2019), 12.

6. Ibid., 48.

7. Ibid.

8. Gus Wezerek & Kristen R. Ghodsee, “Women’s Unpaid Labor is Worth $10,900,000,000,000,” The New York Times, March 5, 2020, accessed April 10, 2020, www.nytimes.com/interactive/2020/03/04/opinion/women-unpaid-labor.html.

9. “Meet Q,” Meet Q: The First Genderless Voice, accessed April 25, 2020, www.genderlessvoice.com/.

10. The average adult male voice frequency falls between 85 and 180 Hz, while the average adult female voice falls between 165 and 255 Hz.

11. “Watch,” Meet Q: The First Genderless Voice, accessed April 25, 2020, www.genderlessvoice.com/watch.

12. Amanda Montell, Wordslut (New York: Harper Wave, 2019), 7-8.

13. Ibid., 29-30.

14. Hempel, “Siri and Cortana.”

15. Clifford Nass, quoted in Brandon Griggs, “Why computer voices are mostly female,” CNN, October 21, 2011, accessed April 10, 2020, https://edition.cnn.com/2011/10/21/tech/innovation/female-computer-voices/index.html.

16. Alexa Steinbrück, quoted in Madeleine Morley, “What Would a Feminist Alexa Look, or Rather Sound, Like?” Eye on Design, March 18, 2020, accessed April 10, 2020, https://eyeondesign.aiga.org/what-would-a-feminist-alexa-look-or-rather-sound-like/.

17. Michael J. Coren, “It took (only) six years for bots to start ditching outdated gender stereotypes,” Quartz, July 26, 2017, accessed April 10, 2020, https://qz.com/1033587/it-took-only-six-years-for-bots-to-start-ditching-outdated-gender-stereotypes/.

18. “Deborah Harrison, Editorial Writer, Cortana,” filmed January 2016 at the Re-Work Virtual Assistant Summit, San Francisco, CA, video, 28:09, www.youtube.com/watch?v=-WcC9PNMuL0.

19. It is important to note that our vernacular distinguishing the ‘digital’ from the ‘real’ world is problematic and further implies that our digital interactions are somehow ‘not real’, while only our face-to-face interactions are.



I respectfully acknowledge the traditional custodians of this land on which I work, learn and live – the Gadigal people of the Eora nation – and pay my respects to Elders past and present.

© 2024 Nina Szewczyk All rights reserved