Matthew Reidsma

Work Notes

Updates from the GVSU Libraries’ Web Team.

The Gendered Chatbot

This weekend, GVSU released myBlueLaker, a virtual assistant app created by the start-up n-powered. It lets users type or speak questions about GVSU to get information about their University presence (grades, registration information, etc.) as well as logistics (building hours, calendar, etc.) all in one place. I downloaded it this morning to make sure all of the library-related information was accurate.

Screenshot of myBlueLaker app

Wait - let’s take a closer look at something:

Screenshot of female assistant in the new myBlueLaker app

Yes, there it is. Another virtual assistant that is gendered as a woman. (And no, you can’t change it to a man or a gender fluid avatar or even Bender from Futurama.)

It’s almost like they updated the icon for the sexist web app that Don Draper might have created, “I Want Sandy,” which shut down in 2008.

Here I’m simply going to address the fact that they chose an avatar gendered as a woman for a tool that helps you with mundane tasks. I’m not even going to get into what they are communicating with the little finger pose the avatar is making against her face, or the fact that “she” is a young white woman; I leave that to someone else.

A lot of scholarship has been written about how female-gendered AI assistants reinforce negative gender stereotypes, but this isn’t a new topic in the popular press, either. (I assume my IT department is not reading scholarship on technology and ethics, even if they should. They are understaffed and too busy, especially these days.) Last year, the U.N. even released a report declaring the gendering of assistants like Siri, Alexa, and Cortana a real problem, but the big companies didn’t say much in response. (They were too busy swimming in piles of gold, like Scrooge McDuck.)

The gender we assign to a chatbot, an email account, or any non-gendered thing affects how we interact with it. Recently on Twitter, the writer Bess Kalb explained how this gender bias plays out in daily life by sharing a story about her friend: “A friend’s male assistant is a fake email account she runs because people called her ‘difficult’ and ‘impossible’ for having small windows of availability until ‘he’ started running interference and then people just accepted she was … busy.”

In looking at virtual assistants, Loideain and Adams succinctly apply Mireille Hildebrandt’s critique of technology to gendered AI assistants, noting “that the technologies we use not only reflect and embed our presumptions and social biases, but also reproduce them in new ways that have material effects on us”1. Writing in PC Mag, Chandra Steele emphasizes that this reflecting of our own biases is why our virtual assistants are women: “Though they lack bodies, they embody what we think of when we picture a personal assistant: a competent, efficient, and reliable woman. She gets you to meetings on time with reminders and directions, serves up reading material for the commute, and delivers relevant information on the way, like weather and traffic. Nevertheless, she is not in charge”2. GVSU’s female avatar for its myBlueLaker assistant continues to reinforce this stereotype: the app is specifically designed to “help” you with routine questions, like where a certain building is or what the library hours are. The problem, Steele reminds us, is that “when we can only see a woman, even an artificial one, in that position [of assistant], we enforce a harmful culture”.

Many of these companies hide behind UX decisions to justify these gendered assistants: “the female voice or persona tested best” is the standard response from Cupertino to Redmond. But that’s a weak form of UX, one that tosses ethics aside to give users “what they want,” even when what they supposedly want is colored by their own explicit or implicit biases.

Screenshot of n-powered app for Northeastern

The strange part is that n-powered, the company behind the app, clearly does not require a gendered avatar. It’s possible that this was a default avatar no one replaced, but I’m not so sure. The sample screenshot (above) on the n-powered website shows a Husky avatar, the mascot for Northeastern University in Boston, where the app started. That means a decision was made at some point in this process to make the GVSU app’s avatar a woman. (Or, at the very least, to keep it a woman and change the background to Laker Blue, Pantone 301.)

Why not just use the Circle G Logomark? Or Louie the Laker?

I have sent my concerns to the GVSU IT department, along with a brief list of suggested popular and scholarly readings. (Nothing sways IT departments more than suggestions to read ethics articles, right?) I will post updates as I have them.


  1. Ni Loideain, Nora, and Rachel Adams. “Female Servitude by Default and Social Harm: AI Virtual Personal Assistants, the FTC, and Unfair Commercial Practices.” June 11, 2019.
  2. Steele, Chandra. “The Real Reason Voice Assistants Are Female (and Why It Matters).” *PC Mag*, January 4, 2018.