From “try yoga” to “start journaling,” most mental health advice piles on extra tasks. Rarely does it tell you to stop doing something harmful.
New research from the University of Bath and the University of Hong Kong shows that this “additive advice bias” appears everywhere: in conversations between people, in posts on social media, and even in recommendations from AI chatbots. The result? Well-intentioned tips that may leave people feeling more overwhelmed than helped.
With mental health problems rising worldwide and services under strain, friends, family, online communities, and AI are often the first port of call. Understanding how we advise each other could be key to making that support more effective.
A collection of eight studies involving hundreds of participants, published in Communications Psychology, analyzed experimental data and real-world advice posted on Reddit, and tested ChatGPT’s responses. Participants advised strangers, friends, and themselves on scenarios involving harmful habits, such as gambling, and missed beneficial activities, such as exercise.
Key findings:
- Additive advice dominates: Across every context, people suggested adding activities far more often than removing harmful ones.
- Feasibility and benefit: Doing more was seen as easier and more beneficial than cutting harmful things out.
- Advice varies by relationship: Cutting harmful things out was viewed as easier for close friends than for ourselves.
- AI mirrors human bias: ChatGPT gave predominantly additive advice, reflecting the patterns found on social media.
Senior author Dr. Tom Barry, from the Department of Psychology at the University of Bath, England, said, “In theory, good advice should balance doing more with doing less. But we found a consistent tilt towards piling more onto people’s plates, and even AI has learned to do it. While well-meaning, it can unintentionally make mental health feel like an endless list of chores.”
Co-author Dr. Nadia Adelina, from the Department of Psychology at the University of Hong Kong, said, “As AI chatbots become a major source of mental health guidance, they risk amplifying this bias. Building in prompts to explore what people might remove from their lives could make advice more balanced and less overwhelming.”
More information: Tom J. Barry et al, "People overlook subtractive solutions to mental health problems," Communications Psychology (2025). DOI: 10.1038/s44271-025-00312-8
Provided by the University of Bath
Citation: "Why mental health advice often adds to your to-do list" (2025, August 20), retrieved 21 August 2025 from https://medicalxpress.com/news/2025-08-mental-health-advice.html