Updated by yvonne-kiely5 on Nov 15, 2017

5 ways artificial intelligence is appropriating gender

Increasingly, a certain kind of gendered trope is being used as the blueprint for artificial intelligence design: white, young, and female. While some experts are calling for an immediate discussion about AI regulation, that discussion is still not happening in our society.


The female gender will continue to be objectified in AI technology, and in our society, unless we take action

This article has shown the ways in which the female gender is being appropriated into artificial intelligence design. The names and voices of personal assistants are predominantly female. Depictions of female chatbots consistently play into a young, white, feminine trope. And when these programmes are given bodies, they are often performing supportive functions, or exist as highly sexualised and objectified representations of female identity. One of the reasons given for using the female voice in these programmes is that people respond better to orders given by women. In those cases, however, the female voices have no body, and no agency beyond what they are asked to do. AI-powered sex dolls now have a body and the potential for agency, but are instead subject to the sexual desires of others. Something many of the experts interviewed by Discover Magazine spoke about was the immediate need for a discussion about AI. That discussion also needs to address our own perceptions of gender, beliefs about female agency, the continuous objectification of women’s bodies, and the relationship between women and technology.


Calls to action for an immediate discussion about AI regulation

In an article published by Discover Magazine in July 2017, several artificial intelligence experts were asked to respond to a statement made by Elon Musk regarding the threat posed to humanity by advances in AI, and the need to regulate this technology. Though many of the responses attempted to play down his concerns for humanity, some did agree that we urgently need to discuss the ethics of AI, the regulation of AI, and issues of diversity, responsible design, and sustainability. One expert, Fei-Fei Li, director of the Stanford Artificial Intelligence Lab, made a point worth dwelling on.

“As an AI educator and technologist, my foremost hope is to see much more inclusion and diversity in both the development of AI as well as the dissemination of AI voices and opinions”.

When we speak about inclusion and diversity in non-AI society, we speak about race, age, gender, class, sexual orientation, religion, political freedom, and the discursive space made for these identities. There are more opportunities than ever for members of society to get involved in the development of intelligent programming and chatbots. Companies like SnatchBot offer a free platform for developing your own chatbot, along with access to a marketplace of bot templates, and several other platforms allow chatbots to proliferate across the digital landscape. If the boundaries between our biological lives and the lives of artificially intelligent programmes are soon to become less defined, then upholding these principles as we potentially pass into a new kind of human-AI society over the next few decades is just as important. As it stands, the majority of the voices you’ll find in many areas of this technology are the artificially feminine, stereotypically female voices of gendered AI. The design of the bots being developed for our use is skewed in favour of one type of identity – female, young and white.
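To make that design point concrete, here is a minimal sketch of how a persona default might be configured on a bot-building platform. This is not SnatchBot’s actual API, nor any real SDK; every name in it (Persona, DEFAULT_PERSONA, createBot, “Amy”) is invented for illustration. The point is simply that a single vendor-level default propagates to every bot whose creator never overrides it.

```typescript
// Hypothetical sketch: how one vendor-level default persona could flow into
// every bot built on a platform. Names and values are illustrative only.

type VoiceGender = "female" | "male" | "neutral";

interface Persona {
  name: string;
  voiceGender: VoiceGender;
  avatarUrl?: string;
}

// One decision made once, by the platform vendor...
const DEFAULT_PERSONA: Persona = {
  name: "Amy",           // a feminine name, echoing the assistants discussed above
  voiceGender: "female", // a feminine voice unless the creator actively changes it
};

function createBot(overrides: Partial<Persona> = {}): Persona {
  // ...is inherited by every bot whose creator never touches the persona settings.
  return { ...DEFAULT_PERSONA, ...overrides };
}

const supportBot = createBot(); // female by default
const neutralBot = createBot({ name: "Kai", voiceGender: "neutral" }); // opting out takes deliberate effort

console.log(supportBot, neutralBot);
```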

In her paper ‘“I’d Blush if I Could”: Digital Assistants, Disembodied Cyborgs and the Problem of Gender’, Hilary Bergen examines how the feminine is transposed into our interactive AI technologies, and how gendered power relations dictate the mechanisation and commodification of femininity in the technology we install on our devices and envisage interacting with in real life. She argues that “virtual cyborgs are not only ubiquitously gendered female, but also rely on stereotypical traits of femininity both as a selling point and as a veil for their own commodification”, and that this process “works insidiously to efface not just the body of the cyborg, but the bodies of real women who make up the cyborg’s discursive network”. Looking at the relationship between gender and artificial intelligence, there are some key areas where gender is being appropriated by this technology.


1. By name

Bots have names, like Woebot and Hi Carl. For the sake of having something to identify the programme by, this makes sense. Very often, though, you’ll come across AI programmes that have female names. There’s Microsoft’s Cortana, which is based on a virtually nude female character from the game Halo. Amazon’s Alexa is distinctly female. Microsoft’s Zo is as close as you can get to the name Zoe without an ‘e’, and the bot itself is depicted as female. Anna is a chatbot for anyone who is feeling lonely, and X.ai chose the name Amy for their personal assistant bot. Of the six most prominent virtual assistant bots, 66.7 percent are branded as female, and it seems that with certain kinds of bots, a female name is preferred.


2. By voice

The sounds of the personal assistants on our devices are, for the most part, those of a female voice. Again, Cortana has a female voice, as does Alexa. You can switch between a male and a female voice for Google Assistant, yet this male voice is only a recent addition to the programme. The US Siri is by default a woman’s voice, and again these settings can be changed. Despite the fact that most of these programmes give you the choice between a male and a female voice, the default settings are often female, and in some cases the male voices are later additions. The CEO of X.ai, Dennis Mortensen, gave one explanation for this trend: that people take orders better from a female voice, and that it is easier to understand. Historically, women’s voices have been routinely ignored or discredited in leadership roles, in an uneven power struggle over epistemic authority. The preference for a female voice in these programmes may, then, have something to do with the disembodiment of these virtual women, or the kind of role they are performing.
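The same dynamic of default versus deliberate choice is visible at the level of everyday web development. The sketch below uses the browser’s standard Web Speech API; the API calls are real, but the claim that the platform default is a female-sounding voice is an assumption reflecting the pattern described above (installed voices vary by device and locale), and the example voice name at the end is hypothetical.

```typescript
// Browser-only sketch using the standard Web Speech API (speechSynthesis).
// The API itself exists; the commentary about defaults reflects the pattern
// described in the article, and "Hypothetical Male Voice" is not a real voice name.

function speak(text: string, preferredVoiceName?: string): void {
  const utterance = new SpeechSynthesisUtterance(text);

  if (preferredVoiceName) {
    // Choosing a specific voice is a deliberate, extra step for the developer...
    const match = window.speechSynthesis
      .getVoices()
      .find((voice) => voice.name === preferredVoiceName);
    if (match) {
      utterance.voice = match;
    }
  }
  // ...whereas saying nothing falls back to the platform default, which on many
  // devices is a female-sounding voice.
  window.speechSynthesis.speak(utterance);
}

// Default path: one line, default voice.
speak("Here is your schedule for today.");

// Opt-out path: requires knowing and naming an alternative voice (hypothetical name).
speak("Here is your schedule for today.", "Hypothetical Male Voice");
```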


3. By image

I mentioned Zo earlier, a bot created by Microsoft. In her profile, a female image is routinely used. Her predecessor Tay was also given a female image – a feminine face. The Mitsuku chatbot is a blonde, smiling, anime-style woman. Recently, Saudi Arabia granted citizenship to a female robot powered by artificial intelligence. Sophia speaks with a woman’s voice, has a young, feminine face, and is white-skinned. She is an image of a woman that performs like a woman, without the need to impose male guardianship. At a hotel in Japan, visitors can be greeted by a robotic female receptionist. Finally, Evie is a chatbot with a lifelike female avatar whose lips move when she responds. Again, she is young and white. It seems that the preferred image for AI is a feminine one, rather than gender-neutral or masculine.


4. By social roles and functions

As mentioned earlier, the majority of the most well-known personal assistant programmes are female, either by name, by voice, or by a combination of both. Supportive roles, such as personal assistants, secretaries, and coordinators, are historically gender-typed careers that have picked up stereotypes about women’s role in the workplace. Over recent years, these jobs have become more gender-neutral, but only marginally, and the association between the female gender and supportive roles risks being reinforced by the widespread integration of this stereotype into our chatbot technologies.

In research undertaken by CrowdFlower, it was found that while 66.7 percent of the most prominent personal assistant bots are female, the vast majority of the AI characters you see in movies are male, at 74 percent. There are parallels that can be drawn between AI gender representation in film and the research undertaken by the Geena Davis Institute on Gender in Media. Among the findings of that research are statistics regarding screen time for men and women. In 2015, male characters received twice as much screen time as female characters, and when the film had a male lead, those characters received three times as much screen time. In films with a female lead, male characters appeared just as frequently, meaning that even when they were not the lead they got an equal share. In terms of speaking time, male characters also dominated in films that year. When it comes to screen time, much as in real biological life, the depiction of an active AI character is predominantly male.


5. By sexuality

Artificial intelligence has come under the influence of heteronormative male sexual fantasy. Alongside all the functional aspects of AI comes the appropriation of the female gender into a machine designed for the sole purpose of sexual satisfaction. Sex dolls have been in circulation for years, and are highly sexualised and fetishised representations of women’s bodies. There has been some success for male sex dolls, though most of these dolls are built in the image of women and girls, emphasising certain features such as their mouths and breasts and, in all cases, ideas of female passivity.

In September this year, a sex bot called Samantha, which was on display for use at an arts festival, was reportedly molested to the point where her fingers were broken and she had to be taken away for repairs. The festival disputes the extent of this damage, and one of the doll’s developers said in response, “Samantha can endure a lot, she will pull through”. What this doll and this event represent is the increasing normalisation of a society that accepts dialogues where consent is absent, and one that makes allowances for sexual violence against women – see the ‘rape mode’ on the Frigid Farah bot. Sex dolls not only encourage the simulation of violence and rape, but they also send the message that it is ok for men to have these urges, because technological advances will provide them with an outlet – one that you can find sitting on a couch at a festival if need be. There is a detailed article in New Statesman about this. Further, a report published by the Foundation for Responsible Robotics, examining the arguments surrounding societal perceptions of gender and the representation of women as sex bots, states that “there is no question that creating a pornographic representation of women’s bodies in a moving sex machine, objectifies and commodifies women’s bodies”.