
AI in mental healthcare can be useful but risky
Using artificial intelligence in healthcare and mental wellbeing can be positive, but greater transparency is needed. AI can translate complex medical information into patient-friendly language, while AI companions used in mental healthcare may offer temporary relief but could deepen feelings of isolation if they replace human-to-human interactions, according to a report published by the Leverhulme Centre for the Future of Intelligence (CFI) and co-authored by CFI researchers Dr Tomasz Hollanek and Dr Aisha Sobey.
The report is based on results from a workshop held at Jesus College, University of Cambridge.
With AI-driven chatbots, mental health bots and AI-powered companionship devices increasingly used as solutions for loneliness, grief, and patient support, the report emphasises the urgent need for ethical guidelines and regulation.
Key findings:
- AI in healthcare: Social AI could improve patient communication, provide 24/7 monitoring, and reduce the strain on overburdened healthcare systems. However, risks include AI misinformation, biased decision-making, and the displacement of human healthcare roles.
- AI and mental wellbeing: AI companions might offer immediate support for loneliness and grief, yet concerns arise over user dependency, the replacement of human connections, and the ethical implications of digital replicas of deceased loved ones.
- Policy recommendations: The report calls for greater transparency in AI design and stronger consent frameworks, as well as better accountability mechanisms, including systems for lodging and processing user complaints.
“AI companions are often marketed as solutions to deep-seated societal issues like loneliness and limited healthcare access. While they hold promise, we must critically assess their limitations; otherwise, we risk exacerbating the very problems they claim to solve,” says Dr Tomasz Hollanek.
Dr Aisha Sobey stresses the need for careful regulation: “Without clear guidelines, AI companions could blur ethical boundaries in patient care, data privacy, and emotional well-being. Our findings highlight the urgent need for safeguards to ensure these systems support, rather than manipulate, users, particularly vulnerable groups.”
The report says that ‘Social AI’ shows promise in addressing issues like overburdened healthcare systems and limited access to mental health care.
“While promising, Social AI also raises significant ethical and practical concerns, including the potential for user manipulation or the unintended development of dependencies, potentially exacerbating the issues these systems aim to solve.”
“AI systems may provide incorrect or harmful advice, such as inaccurate dosages or prognoses, due to technical issues like hallucinations or biased algorithms.
“Poorly designed AI interactions could cause psychological harm, frustration, and mistrust in medical systems. AI may also unintentionally hinder access to human care by acting as a barrier rather than a facilitator.
“Data bias and misuse are significant concerns. Additionally, Social AI poses unique risks, as the trust placed in medical professionals could be exploited if sensitive health-related data collected by these systems is mishandled or used for surveillance purposes.”
Among the positive aspects of AI in healthcare, the report mentions that chatbots could provide round-the-clock reassurance and support, answering repetitive questions without burdening healthcare staff.
“Systems could tailor communication styles to individual patient preferences, improving engagement and comfort.”
“AI systems could monitor patient conditions over time, providing timely escalation or support while waiting for in-person consultations.”
“AI systems could alleviate demand pressures on healthcare systems by handling informational tasks and could complement healthcare professionals, freeing up time for doctors to focus on personalised patient care.”
Moonshot News is an independent European news website for all IT, Media and Advertising professionals, powered by women and with a focus on driving the narrative for diversity, inclusion and gender equality in the industry.