The Risks and Rewards of Using AI To Improve Employee Mental Health


Businesses are increasingly using AI-powered tools to support their staff’s emotional wellbeing, with the technology promising affordable and scalable ways to prevent burnout. But the approach isn’t without its risks.
Employers are beginning to learn the hard way about the importance of looking after their employees’ mental health. The World Health Organization (WHO) has estimated that depression and anxiety cost the global economy more than $1tn a year in lost productivity.
In the same way that employers have long offered healthcare support to keep staff fit and healthy, more and more are now offering mental health services. And, while human-to-human therapy remains a big part of how companies are approaching this need, many are also beginning to find value in services that make use of the latest advances in AI.
Companies working in the mental health industry (a market valued at $375bn in 2022 that’s expected to reach $533bn by 2030) say technology has the potential to help managers spot problems with their staff’s mental wellbeing early and to maximize the impact of more traditional techniques like counseling. Early research is even showing that speaking to generative AI conversational agents could help with conditions like depression, promising a radically cheaper alternative to existing therapy options.
It’s urgent work in a world where mental health issues are on the rise. A WHO survey found that cases of depression had increased by 25 percent globally between 2020 and 2021, when the COVID pandemic was at its peak. Likewise, a 2024 Ipsos Health Service Report found that mental health is the world’s number one health concern.
However, while AI could make mental health care more affordable for individuals and organizations, caution is needed.
Generative AI conversational tools – a relatively new technology – have already started hitting the headlines for the wrong reasons, with some arguing that unreliable chatbots shouldn’t be put anywhere near sensitive emotional issues.
In worst-case scenarios, AI-based tools have been blamed for missing vital warning signs. The family of U.S. teenager Sewell Setzer III last year sued a company, alleging its AI tool failed to properly respond to his cries for help before his suicide.
Employers face a delicate balancing act: making the most of AI to support their staff’s wellbeing without over-relying on the technology.
Can we improve prevention?
The first link in the value chain for businesses looking to use AI to support employees is its potential to prevent workplace-related mental health issues. London-based startup Unmind – which was founded in 2016 and has raised more than $47m from investors including EQT Ventures, Sapphire Ventures and Felix Capital – offers a tool called “Team Compass” to help managers identify when their staff are at risk of burnout.
It does this by offering an app to employees that helps them assess their own wellbeing, by asking them questions about things like their levels of stress and fulfillment at work. Unmind’s platform then gives individuals advice on how they can address any issues, and also securely gathers and analyzes the data, creating reports for managers showing which teams are most at risk.
Unmind has partnered with big organizations including British Airways, Samsung and Uber, and says that employees report up to a 16 percent increase in productivity after using the platform.
Other companies helping employers look after their employees’ mental health include London-based Oliva, which offers a platform that lets employees access therapy, coaching and managerial support; Silicon Valley-based BetterHelp, which uses AI to match users and therapists; and New York-based Spring Health, which uses the technology to offer more personalized mental healthcare treatments.
Oliva co-founder and CEO Javier Suarez says AI tools provide anonymized insights into how Oliva is used within an organization, helping employers proactively identify patterns and address challenges before they become more serious.
“This ties directly to prevention, enabling companies to improve their culture and prevent issues before they escalate,” Suarez says.
Next-gen care
Suarez adds that Oliva is also using AI tools to enhance human-to-human therapy, by “capturing key insights” from sessions to help therapists and users track their progress.
“With improvements in natural language understanding, AI could eventually match human therapists’ effectiveness in many areas,” says Suarez.
Another startup that’s developing AI-led therapy is Berlin-based clare&me, which has built a tool called Clare that users can talk to on the phone or WhatsApp, powered by a large language model (LLM) that’s trained on healthcare-specific data.
Co-founder Emilia Theye tells ThinQ that AI and human therapy each have their pros and cons, and that the company’s tool is particularly helpful in improving round-the-clock access to mental health support.
“Where AI has strengths, like being available 24/7 on a large scale, humans have the benefit of having ‘humanness’, empathy and the emotional flexibility that comes with that,” she says.
Currently, clare&me runs a direct-to-consumer model, though Theye hopes the tool will be accessible via health insurance plans in the future.
Mental wellbeing is likely to be increasingly addressed by services like these, according to London-based company Wellhub, which offers a corporate wellness platform that gives employees access to a wide range of services.
“We're seeing AI being used to predict mood changes based on data from wearables and journals,” says Wellhub’s EVP of partnerships, Pietro Carmignani. “These AI-driven features offer immense value by tailoring content and support to individual needs, providing immediate assistance anytime, anywhere, and keeping users motivated with interactive experiences and personalized feedback.”
Weighing the risks
There’s a growing body of evidence to show that the introduction of AI mental health support could be a good thing. A 2023 study on the efficacy of AI conversational agents to support mental wellbeing found that generative AI can be useful in treating depression.
Han Li, a postdoctoral fellow at the National University of Singapore who led the research, says that, while AI shows promise for mental health support, it’s vital that employers understand the potential risks of using AI in sensitive emotional contexts before introducing tools to staff.
“The most concerning risk that generative AI poses is harmful and inappropriate content,” she says, citing the case of Sewell Setzer III, who took his own life after talking with an AI companion designed by US startup Character.AI.
His mother filed a lawsuit against the company alleging that it was responsible for her son’s death, partly based on screenshots showing that Setzer spoke to the AI about suicidal thoughts.
“There were no suicide pop-up boxes that said, ‘If you need help, please call the suicide crisis hotline.’ None of that,” she told CNN.
Character.AI issued a statement saying it does not comment on ongoing litigation but that it is “heartbroken by the tragic loss of one of our users” and that it has recently implemented new safety measures, such as pop-ups that direct users to the National Suicide Prevention Lifeline.
Li says that employers looking to use AI mental health support need to teach AI literacy to their staff, so that they understand the technology’s limitations and know when human-to-human support is more appropriate.
These kinds of highly scalable tools seem likely to play a bigger part in employee wellbeing in the future, whether provided directly by companies like Unmind and Oliva or as part of more general health insurance packages. For now, business leaders need to remain mindful of the limitations of AI support for mental health, and understand that more serious issues require a human touch.