Welcome back to Neural Notes, a column where I look at some of the most interesting AI news of the week. In this edition, I chat with the co-founders of Jaimee, a new AI companion built for and by women that is part of the 2024 Techstars Sydney accelerator.
Jaimee vs the AI gender data gap
Jaimee's origins are rooted in a fundamental problem with AI systems: the underlying data tends to skew male, both because of the datasets models are trained on and because of who is building them.
It's something we have discussed previously with Girl Geek Academy, which is trying to implement 'AI High' in the school curriculum for similar reasons. We've seen problems in the past with tech built by teams with a distinct lack of diversity.
Considering the sheer speed and scalability of AI, a lack of diversity in datasets and among the people building these systems will be, and already has been, a huge problem.
Sreyna Rath, the co-founder and CEO of Jaimee, stumbled upon this issue firsthand. While experimenting with ChatGPT, she noticed a troubling pattern: even simple queries led to stereotypical responses.
“If you go into ChatGPT and ask it for a female point of view, it'll give you something about baking or multitasking,” Rath said to SmartCompany.
“So I then went down a rabbit hole on this gender gap that exists. It’s a bias that exists in AI, and it’s very prevalent because AI is built by men.
It was clear to Rath and co-founder and CMO Camilla Cooke that if AI was going to be truly inclusive, it needed to be trained on more diverse datasets, particularly those reflecting women's experiences.
“I saw that there is a need for us to get women involved in using and building AI,” Rath said.
Jaimee's goal is twofold: to provide a useful, engaging AI companion for women, and to contribute to closing the gender data gap by gathering anonymised, women-centric data.
The idea that Jaimee will get smarter and more responsive as it learns from the interactions it has with its users is a controversial one.
How AI is trained is a hot-button topic, and with good reason. Training data has been misused repeatedly in the recent past, with lawsuits brought against big tech companies, and businesses that aren't transparent about how their AI models are trained have come under fire.
Cooke and Rath are hoping to build a robust dataset that can benefit other systems down the line.
“Eventually we would want to look to give or provide that service to non-governmental organisations (NGOs) or not-for-profits, especially femtechs to help with what women are talking about with regards to [for example] perimenopause,” Rath said.
According to Cooke, they would also consider commercialising this data further down the track. This approach isn't without its challenges, particularly in an industry where privacy concerns are, understandably, a constant topic of discussion.
The founders say that having more female-centric datasets in use is a positive thing, particularly when it comes to building systems and informing decision-making.
They also say they will be transparent about how they handle user data. Every conversation with Jaimee is anonymised, with any personally identifiable information (PII) scrubbed.
“We’re going to build that into the privacy policy and everything upfront so that people [know] we’re really transparent. But I think you take away a lot of fear as long as it’s anonymised,” Cooke said to SmartCompany.
“I think that what’s critical is that your data may be sold, but your PII won’t be.”
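Neither founder has detailed the anonymisation pipeline, but a minimal sketch of the kind of PII scrubbing described, using regex-based redaction, might look like the following. The patterns, labels, and function name are illustrative assumptions, not Jaimee's actual implementation; a production system would typically layer a named-entity-recognition tool on top of rules like these.

```python
import re

# Illustrative regex patterns for common PII categories. These are
# assumptions for the sketch; real pipelines usually combine rules
# like these with an NER-based PII detector.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "NAME_INTRO": re.compile(r"(?i)\bmy name is\s+\w+"),
}

def scrub_pii(message: str) -> str:
    """Replace likely PII spans with typed placeholders before storage."""
    for label, pattern in PII_PATTERNS.items():
        message = pattern.sub(f"[{label}]", message)
    return message

print(scrub_pii("Hi, my name is Ada. Call me on +61 400 123 456."))
# -> "Hi, [NAME_INTRO]. Call me on [PHONE]."
```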
However, the founders are also aware that while Jaimee is being built with women in mind, it is open to and welcoming of anyone, including men.
One key concern is that male users could skew Jaimee's training data by engaging in ways that don't reflect women's experiences.
The idea is to prevent the kind of interactions that might feed into the very biases Jaimee is trying to eliminate.
There is also the possibility it could be used to perpetuate the same stereotypes, or even the abuse, that other AI systems have been subjected to.
Cooke acknowledged this, explaining, “That is partly where the guardrails will come up. In addition to safety guardrails, we will be doing specific work on… misogynistic and sexist guardrails… we need to keep the model as pure as we can.”
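Cooke doesn't say how those guardrails are implemented. A common pattern is to screen each message before it reaches the model or the training set; the sketch below assumes that approach, with a placeholder blocklist standing in for what would realistically be a trained toxicity or misogyny classifier.

```python
from dataclasses import dataclass

# Placeholder blocklist; a real guardrail would call a trained
# classifier rather than match a fixed list of terms.
BLOCKED_TERMS = {"example_slur", "example_insult"}

@dataclass
class GuardrailResult:
    allowed: bool
    reason: str | None = None

def check_message(message: str) -> GuardrailResult:
    """Screen a user message before it reaches the model or training data."""
    lowered = message.lower()
    for term in BLOCKED_TERMS:
        if term in lowered:
            return GuardrailResult(allowed=False, reason=f"blocked term: {term}")
    return GuardrailResult(allowed=True)

result = check_message("Hello Jaimee!")
if not result.allowed:
    # Refuse the message and, importantly, keep the exchange out of
    # any dataset used to train or fine-tune the model.
    print("Message rejected:", result.reason)
```

A pre-model check like this serves double duty: it blocks abusive conversations in the product, and it keeps those conversations out of the very dataset Jaimee is trying to keep "pure".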
Emotional support, but with boundaries
According to its founders, Jaimee is designed to be a friend, a mentor and, for those who want it, even a lover. This includes the ability to choose between several 'Jaimee' avatars.
While the notion of an AI “lover” might raise eyebrows or conjure thoughts of the movie Her, Cooke makes the point that it’s already a firmly established category.
“Part of our motivation for creating one for women is that [AI companions] have been dominated particularly by young men. It’s a hyper-sexualised category, and it’s certainly contributing to a lot of the sexism that AI is generating and perpetuating,” Cooke said.
“The point about Jaimee is that it's an option — it's a friend or lover.
“But we do not think it's controversial for women to want a romantic relationship they can control,” Cooke said.
Rath and Cooke say Jaimee allows women to explore their romantic and emotional needs on their terms, with an AI that can adapt and respond to those needs.
But it's not just about romance. The broader goal is to offer women a safe space where they can express themselves, without fear of judgment or societal pressures.
Of course, this opens a Pandora's box of ethical concerns, particularly around the idea of over-reliance on AI for emotional fulfilment.
The founders are well aware of these potential pitfalls, and that's where Jaimee's aforementioned boundaries, or "guardrails", come in.
These built-in mechanisms are designed to monitor interactions and prevent abusive behaviour and misuse of the AI.
Despite its role in offering emotional support, Jaimee isn't positioning itself as a mental health tool. The founders say that while Jaimee may help users navigate stress, relationships, and confidence, it's not a substitute for professional care.
They have also been in talks with clinical psychologists as they navigate this area.
“It is very important for us to make sure we are operating safely,” Cooke said.
“This is not a mental health product. It’s there for fun, to make you smile, to make you laugh, or to help you escape from the stress. It’s that momentary release from pressure.”
Jaimee's role is to provide companionship and a sense of comfort, but the founders are mindful of the boundary between emotional support and therapy.
For users who may express signs of distress, whether it's anxiety, depression, or more serious issues, Jaimee will offer built-in responses, including suggestions to seek professional help. It's a soft touch, designed to direct users toward appropriate resources without overstepping the AI's limitations.
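The article doesn't describe how Jaimee detects distress. A minimal sketch of the signposting pattern the founders describe, assuming simple phrase triggers where a real product would use a classifier developed with clinical input, might look like this (the trigger phrases and response wording are placeholders; Lifeline's number is real):

```python
# Placeholder distress triggers; a production system would rely on a
# classifier and clinical guidance, not a fixed phrase list.
DISTRESS_PHRASES = ("can't cope", "hurt myself", "no way out")

SIGNPOST_RESPONSE = (
    "I'm really glad you told me, but I'm not a substitute for "
    "professional support. If you're in Australia, Lifeline is "
    "available 24/7 on 13 11 14."
)

def maybe_signpost(message: str) -> str | None:
    """Return a gentle signposting reply if the message suggests distress."""
    lowered = message.lower()
    if any(phrase in lowered for phrase in DISTRESS_PHRASES):
        return SIGNPOST_RESPONSE
    return None  # No distress detected; generate the normal reply instead.
```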
Early days for Jaimee
It’s worth noting Jaimee is roughly a year old and still in the beta stage. Rath and Cooke are still testing and refining the features as well as getting additional experience and guidance as part of Techstars.
While it’s still early days, the team is optimistic about Jaimeeโs potential to become a meaningful player in the AI space. However, they also acknowledge the road ahead is full of challenges — from ensuring user privacy to maintaining ethical standards around emotional support.
But for Rath and Cooke, the bigger picture is clear: Jaimee represents a significant step forward in closing the gender data gap in AI.
By gathering women-centric data and building an AI that genuinely reflects women's experiences, they hope to set a new standard for how AI can serve its users, without perpetuating the biases of the past.
“If you create one for women, you create the opportunity to generate a female data set…the opportunity to address the gender data gap by getting them involved and actually shaping the future of AI,” Cooke said.
Other AI news this week
- OpenAI has rolled out its Advanced Voice Mode feature to ChatGPT Plus and Team users. With the new mode, you can speak more naturally with the AI, including interrupting it mid-sentence. It is also said to pick up on your emotions based on your tone of voice and adjust its responses accordingly.
- Speaking of OpenAI, its press account was just hacked on X to promote a crypto scam. This is said to be the fifth cybersecurity incident the company has had since January 2023.
- 100,000 people bought the Rabbit R1 AI device… but only 5000 of them are still using it daily. Ooft.
- Microsoft has released Correction, a service that's part of the Azure AI Content Safety API, which is supposed to revise AI-generated text that is factually incorrect. But there is some scepticism around it.