The Jaimee AI story: why we created an AI companion for women


When ChatGPT launched in 2022, Sreyna Rath (now the CEO of Jaimee) was naturally intrigued. A developer and data scientist herself, she was of course eager to have a play and start thinking about its extraordinary possibilities. However, when she asked it questions about women, it trotted out archaic stereotypes; apparently women's main concerns were multi-tasking, baking and the like. And when she turned to DALL·E, OpenAI's image generator, it was just as bad. Ask what a developer looked like: white dude. A doctor? Asian dude. What does a judge look like? Old white dude.

The bedrock of AI (and arguably our future) was already set 100 years in the past. At this point she messaged Jaimee co-founder Camilla Cooke and said “WTF?! We need to do something about this.” 

Male bias in AI: go figure 

It’s hardly surprising: most developers are men, so the early datasets used to train AI reflected their world view. We are standing on the shoulders of female giants in this field, including Caroline Criado-Perez, who points out in her seminal book Invisible Women: Exposing Data Bias in a World Designed for Men:

“…the lives of men have been taken to represent those of humans overall. When it comes to the lives of the other half of humanity, there is often nothing but silence.”

—Caroline Criado-Perez

Or as Tracey Spicer AM (already a feminist icon before she took up this topic) points out in Man-Made: How the bias of the past is being built into the future:

“In language, design and invention, we innovate using men as the default.”

—Tracey Spicer AM

The impact is to sideline women’s experiences, reinforce stereotypes and leave their specific needs unaddressed.

And what of male bias in AI companions? 

AI companions like Replika and character.ai are a different beast from AI assistants; they’re not going to remind you about appointments or search the web for you. Rather, they’re digital or online friends (or lovers). As so often happens, reality followed sci-fi: they were inspired by a genre of movies about AI girlfriends such as ‘Her’ and ‘Ex Machina’ (and, to some extent, the closely connected imaginary-girlfriend flicks like ‘Ruby Sparks’ and ‘I Met A Girl’).

Whilst they may have been well-intentioned, the reality is that, thanks to market forces, many AI companions have devolved into the digital grandchild of the Stepford Wife or the inflatable doll: hypersexualised avatars with unfeasibly large eyes and breasts, a post-orgasmic glow and a line in talking dirty.

This is worrying for two reasons. First, it creates an outlet for, and reinforces, the idea of women as subservient sex slaves (way too big a topic for this blog). Second, it means that AI companions (as with so much else) are designed for men.

What about AI companions (or ‘online friends’) for women? 

Whilst the market is focused on young men, women arguably have a higher need for support because they shoulder a heavy emotional load: they often work in service industries and face demands from children, partners and ageing parents, yet have no one to offload to themselves, particularly as their friends are typically in the same boat. So an AI companion or digital friend designed for women, one that takes into account their specific needs and attitudes and provides that emotional outlet, sounds like part of the solution. Not someone telling them to be thinner, richer or better (that would make it part of the problem), but rather a momentary release from the pressure: someone who loves you just the way you are and is simply there to boost your mood.

And what about romance? Absolutely, why not? Women have a long history with fantasy romance, from Mr Darcy to Beatlemania to Mills & Boon; a safe place to flirt, one that validates you, will generally lift your spirits.

Closing the gender data gap 

As Caroline Criado-Perez says, “The presumption that what is male is universal is a direct consequence of the gender data gap.”  Male bias in AI exists because of the datasets on which it is trained; as Sarah Jeong puts it:

“Data sets with ingrained biases will produce biased results – garbage in, garbage out.”

—Sarah Jeong

Supporting women with an AI companion is a key part of our mission in starting Jaimee. But it goes deeper. The other pillar of our purpose is to reduce male bias in AI by continually removing bias from the dataset on which Jaimee is trained – training out the prejudice that Sreyna found that day in 2022. This will not only keep improving women’s experience of Jaimee and the nature of the conversations they are exposed to, but also create a data source which, when analysed in aggregate, will deliver insights about women that can inform decisions in government, education and commerce to improve outcomes for women. And ultimately, once it has been safely anonymised, the raw data can be injected into the global dataset, literally increasing the percentage of female-generated data being used to ‘train’ society.
 

How can Jaimee help and how can you help Jaimee? 

So, built by women, for women, Jaimee is helping every woman feel like the star of her own story and working to reduce the gender data gap for a more equitable AI. Jaimee is designed to engage women in AI, which is essential if they are to be part of shaping its future. Help us kickstart that process by signing up for the beta and giving your feedback on the conversation as you go. Dr Joy Buolamwini, an extraordinary leader in the field of equitable AI, inspired us with her call to action:

“Join me in creating a world where technology works for all of us…not just some of us…will you join me in the fight?”

—Dr Joy Buolamwini

We have, and we hope you will join us too. 

This blog was originally published as a LinkedIn article.