Female gendering of AI technologies


Female gendering of AI technologies is the use of artificial intelligence (AI) technologies gendered as female, such as in digital voice or written assistants. These gender-specific aspects of AI technologies, created both by humans and by algorithms, were discussed in a 2019 policy paper and two complementary documents published under the title "I'd blush if I could: Closing gender divides in digital skills through education".[1][2] The paper was published under an open access license by the EQUALS Global Partnership and UNESCO and has prompted further discussion of gender-related bias in the global virtual space.

AI-powered digital assistants

Whether typed or spoken, digital assistants enable and sustain human-like interactions with technology by simulating conversations with users.[1][3] AI-powered digital assistants can be found in a variety of devices and can perform an assortment of tasks, typically through voice activation.[4] On most devices, the digital assistant can be activated by a dedicated button, or existing buttons can be customized. For example, the Nokia 2 range has a dedicated button on the left side of the device that activates the Google Assistant. Some versions of Samsung phones and iPhones have a home button that can be customized to activate the digital assistant on long-press. Digital assistants are often classified as one or a combination of the following:

Voice assistants

Voice assistants are technologies that speak to users through voiced outputs but do not ordinarily project a physical form. They can usually understand both spoken and written inputs, but are often designed for spoken interaction, and their outputs typically attempt to mimic natural human speech.[1]
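As an illustration of how such spoken output is produced, the following is a minimal sketch assuming the Android TextToSpeech API; the class name SpokenReply and the utterance identifier are illustrative and not taken from any actual assistant, and real assistants use far richer speech-synthesis pipelines.

import android.content.Context
import android.speech.tts.TextToSpeech
import java.util.Locale

// Minimal sketch: speaking a text reply aloud with a platform speech-synthesis engine.
class SpokenReply(context: Context) {
    private lateinit var tts: TextToSpeech

    init {
        // The engine reports readiness asynchronously via this callback.
        tts = TextToSpeech(context) { status ->
            if (status == TextToSpeech.SUCCESS) {
                tts.setLanguage(Locale.US)   // choose an output locale
            }
        }
    }

    // Speak a text reply, replacing anything already queued.
    fun say(reply: String) {
        tts.speak(reply, TextToSpeech.QUEUE_FLUSH, null, "reply-utterance")
    }
}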

Mainstreaming

Voice assistants have become increasingly central to technology platforms and, in many countries, to daily life. Between 2008 and 2018, the frequency of voice-based internet search queries increased 35-fold, and such queries now account for close to one-fifth of mobile internet searches.[5] Studies show that voice assistants manage upwards of a billion tasks per month, ranging from the mundane, such as changing a song or a film, to the essential, such as contacting emergency services.[1]

Technology research firms estimate that approximately 100 million smart speakers equipped with voice assistants were sold globally in 2018 alone.[6] In the United States, 15 million people owned three or more smart speakers in December 2018, up from 8 million a year earlier, reflecting consumer desire to always be within range of an AI-powered helper.[7] Industry observers expect that there will be more voice-activated assistants on the planet than people by 2023.[8][9]

Feminization

As documented in the 2019 policy paper ‘I’d blush if I could: Closing gender divides in digital skills through education’,[1] the majority of voice assistants are either exclusively female or female by default; Amazon's Alexa, Microsoft's Cortana, Apple's Siri, and the Google Assistant are all highly feminized by design. Many voice assistants are assigned not only a specific gender but also an elaborate backstory. The Google Assistant, for example, is reportedly designed to be the youngest daughter of a research librarian and a physics professor from Colorado, with a B.A. in history from Northwestern University. She is imagined to have won Jeopardy Kids' Edition in her youth and even has a specified interest in kayaking.[1]

Some companies justify their choice to gender voice assistants as female by citing studies which indicate that people generally prefer a female voice to a male voice. Such research suggests that customers want their digital assistants to sound like women; companies therefore assert that they can optimize profits by designing feminine-sounding voice assistants. However, findings within the field conflict. Notably, literature reviews demonstrate that women often change the default feminized voice to a masculine option when one is available.[1]
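To illustrate what a female-by-default voice with a user-selectable masculine option can mean in practice, the following is a minimal sketch assuming Android's TextToSpeech voice API. Android Voice objects expose no explicit gender attribute, so the name matching below is a hypothetical heuristic used purely for illustration, not how any real assistant chooses its voice.

import android.speech.tts.TextToSpeech
import android.speech.tts.Voice
import java.util.Locale

// Sketch of a default-voice policy: a feminine-sounding voice unless the user opts out.
fun applyVoicePreference(tts: TextToSpeech, preferMasculine: Boolean) {
    val available: Set<Voice> = tts.voices ?: return        // empty before the engine is ready
    val enUs = available.filter { it.locale == Locale.US }

    val chosen = if (preferMasculine) {
        enUs.firstOrNull { "male" in it.name.lowercase() && "female" !in it.name.lowercase() }
    } else {
        enUs.firstOrNull { "female" in it.name.lowercase() } // the default path
    } ?: enUs.firstOrNull()                                  // fall back to any available voice

    chosen?.let { tts.voice = it }                           // apply the selection
}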

Sexual provocation

Many media outlets have documented the ways in which soft sexual provocations elicit flirtatious or coy responses from machines. Examples include: when asked, ‘Who’s your daddy?’, Siri answered, ‘You are’; when a user proposed marriage to Alexa, it said, ‘Sorry, I’m not the marrying type’; if asked on a date, Alexa responded, ‘Let’s just be friends’; and Cortana met come-ons with one-liners like ‘Of all the questions you could have asked...’.[10]

In 2017, Quartz investigated how four industry-leading voice assistants responded to overt verbal harassment and found that, on average, the assistants either playfully evaded abuse or responded positively. The assistants almost never gave negative responses or labelled a user's speech as inappropriate, regardless of its cruelty. For example, in response to the remark ‘You’re a bitch’, Apple's Siri responded, ‘I’d blush if I could’; Amazon's Alexa, ‘Well thanks for the feedback’; Microsoft's Cortana, ‘Well, that’s not going to get us anywhere’; and Google Home (also Google Assistant), ‘My apologies, I don’t understand’.[11]
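Replies such as those quoted above behave like fixed strings keyed to recognized phrases. The following is a purely illustrative sketch of such a canned-response table, reusing the phrases and replies reported above; the names cannedReplies and reply are hypothetical, and real assistants rely on far more complex dialogue systems rather than a literal lookup map.

// Purely illustrative canned-response table reusing the replies reported above.
val cannedReplies: Map<String, String> = mapOf(
    "who's your daddy" to "You are",                            // Siri
    "will you marry me" to "Sorry, I'm not the marrying type",  // Alexa
    "you're a bitch" to "I'd blush if I could"                  // Siri, per Quartz (2017)
)

// Normalise the input and fall back to a generic reply when nothing matches.
fun reply(userUtterance: String): String =
    cannedReplies[userUtterance.trim().trimEnd('?', '!', '.').lowercase()]
        ?: "My apologies, I don't understand"                   // generic fallback (phrase reported for Google Home)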

Industry biases

Voice assistant release dates and gender options

The AI field is largely male-dominated, with only 12% of researchers and 20% of professors identifying as women.[1][12] While women make up 36% of entry-level hires, their representation declines to 27% in mid-level positions.[13] The gender gap in the technology industry extends across public spheres, from high school Advanced Placement tests to senior company positions, with women under-represented throughout.[14] The tech industry also lacks racial diversity; in the U.S., Black, Hispanic, and Indigenous people make up only 5% of the tech workforce.[15]

Biases inherent in a product or algorithm reflect the environment in which it was created and the individuals who created it. Explicit and implicit discriminatory practices in the workforce that inhibit women and BIPOC (Black, Indigenous, and people of colour) individuals from attaining and holding positions within the tech industry contribute to the production of biased technology.[15]

The gender associations people adopt depend on how often they are exposed to them; as female digital assistants become more common, the frequency and volume of associations between ‘woman’ and ‘assistant’ increase, with negative effects on how women are perceived in real life. This demonstrates how such technologies can both reinforce and extend gender inequalities.[16]

Sources

This article incorporates text from a free content work: I'd blush if I could: closing gender divides in digital skills through education, UNESCO and the EQUALS Global Partnership, licensed under CC BY-SA 3.0 IGO.

References

  1. UNESCO; EQUALS (2019). "I'd blush if I could: closing gender divides in digital skills through education". https://unesdoc.unesco.org/ark:/48223/pf0000367416.pdf.
  2. The title of this publication borrows its name from the response given by Siri, a female-gendered voice assistant used by hundreds of millions of people, when a human user would tell ‘her’: “Hey Siri, you’re a bi***”.
  3. "What Is a Digital Assistant? | Oracle" (in en-CA). https://www.oracle.com/ca-en/chatbots/what-is-a-digital-assistant/#:~:text=Digital%20assistants%20use%20advanced%20artificial,provide%20a%20personalized,%20conversational%20experience.. 
  4. Mani, Shantesh (2020-11-18). "Artificial Intelligence powered voice assistants" (in en). https://medium.com/voice-tech-podcast/artificial-intelligence-powered-digital-assistants-1e0bdf108641. 
  5. Bentahar, Amine. "Council Post: Optimizing For Voice Search Is More Important Than Ever" (in en). https://www.forbes.com/sites/forbesagencycouncil/2017/11/27/optimizing-for-voice-search-is-more-important-than-ever/. 
  6. "Smart speaker installed base to hit 100 million by end of 2018" (in en-US). https://www.canalys.com/newsroom/smart-speaker-installed-base-to-hit-100-million-by-end-of-2018?time=1612332560. 
  7. Edison Research (2018-07-18). "The Smart Audio Report from NPR and Edison Research, Spring 2018" (in en-US). https://www.edisonresearch.com/the-smart-audio-report-from-npr-and-edison-research-spring-2018/.
  8. Kingsley-Hughes, Adrian. "Virtual digital assistants will overtake world population by 2021" (in en). https://www.zdnet.com/article/virtual-digital-assistants-will-overtake-world-population-by-2021/. 
  9. "The Decade of Voice Assistant Revolution" (in en-US). 2019-12-31. https://voicebot.ai/2019/12/31/the-decade-of-voice-assistant-revolution/. 
  10. Davis, K. (2016). ‘How we trained AI to be sexist’. Engadget, 17 August 2016.
  11. Fessler, L. (2017). ‘We tested bots like Siri and Alexa to see who would stand up to sexual harassment’. Quartz, 22 February 2017.
  12. Statt, Nick (2019-05-21). "AI voice assistants reinforce harmful gender stereotypes, new UN report says" (in en). https://www.theverge.com/2019/5/21/18634322/amazon-alexa-apple-siri-female-voice-assistants-harmful-gender-stereotypes-new-study. 
  13. "Apple, Google, Facebook should 'hire more women than men'" (in en-US). 2018-05-17. https://www.mercurynews.com/2018/05/17/apple-google-facebook-should-hire-more-women-than-men/. 
  14. "Closing the gender gap for women in technology | McKinsey". https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/closing-the-tech-gender-gap-through-philanthropy-and-corporate-social-responsibility. 
  15. Gruman, Galen (2020-09-21). "The state of ethnic minorities in U.S. tech: 2020" (in en). https://www.computerworld.com/article/3574917/the-state-of-ethnic-minorities-in-us-tech-2020.html.
  16. Lai, C.; Mahzarin, B. (2019). "The Psychology of Implicit Intergroup Bias and the Prospect of Change". PsyArXiv. https://wappp.hks.harvard.edu/publications/psychology-implicit-intergroup-bias-and-prospect-change.