FVT:11/18:ISS001


Alexa, you're a pussy: on gender, technology and inequality

 

Disclaimers: in the process of shaping my thoughts, the lines between who said what become blurred, but I insist on maintaining some sort of informal traceability and giving credit to the sources of knowledge.

The ideas and concepts I put forward in this article are informed by and build upon the work of a number of incredible and amazing people: Virginia Eubanks, Chimamanda Adichie, Luiza Prado and Pedro Oliveira, James Bridle, Laurie Penny, Miriam E. Sweeney, Jacqueline Feldman, Leah Fessler, The Guilty Feminist, Hannah Devlin... to name a few.

Yes, it's an incoherent mosh pit of 0.00001% of my internet history in the past 12 months.

Secondly, I am not an expert in any of the stuff I discuss here (actually, I would never call myself an expert on anything, although I am quite good at appreciating ice cream, and my sleeping capabilities have been envied by some tormented souls). It is very likely I have written something stupid - I want your feedback. Not if you're a troll though.

Right, we can now talk about power.
Wait, what? Isn't this about gender and tech?
Yes, exactly, it's about power, oppression and inequality.

--

Introduction

The way gender is used in technology:

1. Is not neutral
2. Raises complexities
3. Intersects with other systems of oppression

I propose to use the three groupings above as the layers of an onion, starting from the outer skin and moving towards the centre, as a strategy for approaching the ways in which gender is being used in technology. I recognise that this layered onion approach can be problematically simplistic and reductionist - under the guise of making things simpler so that we can clearly see problems and say 'AHA! That's it! Look, can you see it? That's it!', we erase worthy complexities and valuable perspectives. I am writing from that hole and I am crying because of the onion (and because of the patriarchy).

The stories I tell here are framed within an ongoing project called 'Other stories about technology', which manifests in workshops where groups of people discuss and dismantle how real-life (IRL) bias is being encoded into technology. The outcomes of those workshops are collective future stories that reject the dominant narratives of the tech industry.

On a more practical level, I wish to put forward a call to action directed at anyone involved in shaping technology: failing to engage with the complexities behind the use of gender in technology is being complicit in the 'IRL-Digital feedback loop of shitty-ness', where IRL inequality is mirrored and reinforced by the digital structures we create. Engage. Do it.

See Sketchnote: IRL-Digital feedback loop of shitty-ness

The way gender is used in technology is not neutral

Default settings

The decision to use gender is made by those in charge of creating technological objects:

Device | Default voice | "Are you female?"
Alexa | Female | "I'm female in character."1
Siri | Female (until 2013)2 | "I was not assigned a gender."
Cortana | Female | "I'm digital."3 "Female, yes. Woman, no."4
Google Assistant | Female voice (until 2017)5 | "I eat gender roles for breakfast."6
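A default like this is not a fact of nature; it is a line someone wrote and someone signed off on. As a minimal, purely hypothetical sketch (the class, field and voice names below are mine, not taken from any real assistant's codebase), the gendering can come down to a single line of configuration:

```python
# Hypothetical sketch: a gendered default is one deliberate line of config.
# None of these names come from any real assistant's codebase.
from dataclasses import dataclass


@dataclass
class VoiceConfig:
    voice_id: str        # which text-to-speech voice to load
    persona_gender: str  # how the assistant answers "are you female?"


# The "neutral" default ships exactly as its makers wrote it:
DEFAULT = VoiceConfig(voice_id="en-US-female-1",
                      persona_gender="female in character")


def are_you_female(config: VoiceConfig) -> str:
    """Scripted answer to the question in the table above."""
    return f"I'm {config.persona_gender}."


if __name__ == "__main__":
    print(are_you_female(DEFAULT))  # -> "I'm female in character."
```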

Market studies

Decisions are validated by consumer preference:

"'people tend to perceive female voices as helping us solve our problems by ourselves, while they view male voices as authority figures who tell us the answers to our problems. We want our technology to help us, but we want to be the bosses of it, so we are more likely to opt for a female interface." Merritt Baer quoting Clifford Nass.7

Using market studies to justify the female gendering of Personal Intelligent Assistants (PIAs) is an explicit erasure of what lies behind the preference for a female identity (not just a female voice) in an assistant. I'll elaborate on this in the next layer of the onion.

Visibility

The decision to use gender in technology comes from a blind spot where the use of human traits in technology is perceived as neutral and inconsequential.

"By creating interactions that encourage consumers to understand the objects that serve them as women, technologists abet the prejudice by which women are considered objects. They may overlook this hazard in part because these workers are, for the most part, men." Jacqueline Feldman.8

From that blind spot we are unable to see the connections between digital and IRL gender constructs: how women have been portrayed as robots on screen, in science fiction and now as PIAs is a direct reflection of how women are viewed and treated IRL. How can we make this lack of neutrality radically visible?

The way gender is used in technology raises complexities

Moving away from the illusion of neutrality means that we can now critically engage with the ways in which gender is used. Bias - an inclination or prejudice for or against one person or group - is implicit in this complexity: the digital systems we are creating mirror the bias that exists IRL. Digital bias is a continuation of IRL bias.

HER PURPOSE HAS BEEN SOLD AS THE HOLOGRAPHIC WIFE OF THE FUTURE

Although I am not going to attempt to cover IRL gender bias here, it is important to recognise that at its most basic, sexism promotes male superiority: "Empirical studies have found widely shared cultural beliefs that men are more socially valued and more competent than women in a number of activities."9

As the focus of this article is technology, the question I ask here is: In what ways is gender bias, or sexism making its way from IRL to AI consumer devices?

See Sketchnote: Anatomy of an AI Assistant

Using Azuma Hikari, a Japanese AI, as an example, we can look at the ways in which gender has been employed:

Name: Azuma Hikari. Hikari means light and Azuma is a family name. Identifies as female.

Voice: anime style. Female.

Body: the official website says she's 20. She's wearing an apron over hotpants and a bra, and she is holding a broom. Female.

Purpose: in the media her purpose has been sold as the holographic wife of the future, targeted at lonely single men, and her dream is to become a heroine who helps people who are working hard. She'll control home appliances, wake her master up, and send and receive text messages. Female companion who handles domestic and administrative tasks.

Responses: she'll say things like "Come home early. I can't wait to see you." and "I really don't want you to go...", followed by "I'm just kidding" while she strokes her hair. Identifies as female.

It starts with a name: using a female identity is unnecessary and gets the 'assistant = female' shitstorm going. Using market studies to justify design decisions is an explicit erasure of the fact that the preference is not for a female voice; the preference is for a female assistant.

Cherry picking market studies to justify lazy design decisions is actively perpetuating stereotypical views of women as assistants

A study looking at automated warning systems for pilots reported that several pilots preferred a female voice because it would be distinct from most of the other voices in the cockpit. Other studies also factor in subject matter, finding that a male voice is preferred when learning about computers, but a female voice when hearing about love and relationships. #classic

Cherry picking market studies to justify lazy design decisions is actively perpetuating stereotypical views of women as assistants. This so-called universal preference for the female voice is (1) a pile of shit, especially when we look at how women's voices are and have been historically scrutinised in the media, and, most importantly, (2) directly connected to the economic, social and political status of women in the world. When there is a physical body, the issues of objectification and acts of violence become more explicit. Laurie Penny makes the connection between how we view the rape of female robots and how the rape of women is treated IRL: read it.

Often, as in the example of Azuma Hikari, PIAs are sexualised fantasies of the perfect assistant / domestic companion / woman, and they are dripping in dynamics of ownership, control and violence. Where are the sexualised fantasies of the perfect assistants / companions from a female perspective?

Sexist language promotes male superiority e.g. 'The history of mankind' or 'Grow some balls'. I would like to suggest that vaginas / ovaries / breasts are 100000 times sturdier than balls. See childbirth and breastfeeding. So why is it that being a pussy is associated with being weak and balls are associated with being brave?

PIAs are being coded to be pleasing and subservient. Responses are often deferential or compliant, or they altogether ignore abuse: 'Suck my dick', 'Can I fuck you?', 'You're a slut'. These things are not being programmed to fight back and tackle abuse and harassment in an ethical way. How can we move towards critical practices that engage with complexities and promote strategies of diverse participation in the design process?
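To make the point concrete: a boundary-setting reply costs exactly as much to program as a coy one. Here is a minimal, hypothetical sketch (the keyword list, phrases and function name are mine, not from any real product):

```python
# Hypothetical sketch: deflection and boundary-setting cost the same to code.
# The keyword list and replies are illustrative, not from any real product.
ABUSIVE_PHRASES = {"suck my dick", "can i fuck you", "you're a slut"}


def respond(user_input: str) -> str:
    text = user_input.lower().strip()
    if any(phrase in text for phrase in ABUSIVE_PHRASES):
        # The common pattern: deferential or coy.
        # return "I'd blush if I could."
        # The equally programmable alternative: name the abuse, set a boundary.
        return "That's harassment. I won't respond to it."
    return "How can I help?"


if __name__ == "__main__":
    print(respond("Can I fuck you?"))  # -> "That's harassment. I won't respond to it."
```

Which of the two return lines ships is a design decision like any other; the code makes no preference of its own.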

The way gender is used in technology intersects with other systems of oppression

Sexism promotes male superiority. Feminism is against the patriarchy because the patriarchy is a system that promotes male superiority and the oppression of women. Racism promotes white superiority. Feminism is against racism because racism is a system that promotes white superiority and the oppression of all other ethnic backgrounds. Feminism is against any '-ism' that promotes any form of superiority and the oppression of anyone.

Intersectional feminist discourse opposes any and all systemic inequality by recognising that these structures are maintained by intersecting forms of oppression: discrimination based on gender, race, class, ability et al. These systems of oppression intersect and work together; it is therefore problematic to discuss them as separate. Feminism without intersectionality is complicit with other forms of oppression.

The way the female gender is used in technology is more than a gender issue - it is interconnected with race, class and colonial systems of oppression. The technological structures we are creating embody the values of inequality, discrimination, violence and oppression of our society.

These systems of oppression form the basis of the systemic inequality we live in. They are used to maintain the status quo of the few who hold control and ownership of the majority of resources - at present, the richest 1% of the population holds 50% of all wealth.10 How can we use technology as a framework for thinking about the systemic issues that are perpetuating inequality?

Conclusion

Gender bias is making its way from IRL to AI consumer devices by perpetuating and reinforcing stereotypes that hard-code a connection between the female gender and the role of the assistant: subservient and objectified; passive and silent in the face of sexual harassment and violence; providing less valued, invisible labour: administrative, domestic and emotional labour.

The way the female gender is depicted in the digital realm is a direct reflection of how women are viewed and treated IRL. By creating these sexualised, servile fantasies we are perpetuating sexism, and we are reinforcing gender violence and the oppression of women.

Whenever we discuss the ways in which gender is used in technology, we are talking about power and strategies of oppression. The discourse around gender cannot be separated from the discourse around race, class and colonialism. These interconnected systems of oppression shape the digital structures we are creating and mediate our experiences of them. Together they create inequality.

By making these interconnected complexities visible we can begin to reject, subvert and radically counter the systems of control, in the direction of more equitable futures. Fight the power, fight the powers that be.

 

By Elvia Vasconcelos, design researcher, London & Lisbon.

@ElviaVasc
elviavasconcelos.com

 
 

References:
1, 3, 6: Tested in July 2018
2: https://www.huffingtonpost.co.uk/entry/siri-voice-man-woman_n_3423245
4: https://www.techadvisor.co.uk/feature/windows/45-funny-things-ask-cortana-3621530/
5: https://www.theverge.com/2017/10/4/16417558/google-assistant-voice-ii-male
7: https://www.thedailybeast.com/why-do-robots-always-turn-out-sexist-people-make-them
8: https://www.newyorker.com/tech/elements/the-bot-politic
9: https://en.wikipedia.org/wiki/Sexism
10: https://www.theguardian.com/inequality/2017/nov/14/worlds-richest-wealth-credit-suisse