Escaping the Echo Chamber: How to Build Safe Peer-to-Peer Mental Health Platforms
Across the globe, the average person will spend an estimated 3.4 million minutes on social media over their lifetime – about six years of their life.1 In an increasingly digitalized era, the barrier between our digital and personal lives is fading.
As social media becomes an increasingly important vehicle for connection, there has been a proliferation of online mental health support groups: peer-to-peer communities where suffering individuals seek out support, information, and solidarity.2
These communities offer exciting benefits for the mental health space: ease of access, anonymity, and a trusted network of empathetic peers.2 As more people turn to online platforms, it’s clear that the future of mental healthcare is digital.
But despite their promising start, recent evidence suggests that these communities can sometimes harm their users.
From normalization to echo chamber
The main issue with peer-to-peer mental health communities is that they can turn into echo chambers – environments where beliefs exist unchallenged and are reinforced by the group, regardless of validity or accuracy.
Misdiagnosis, medical misinformation, and spiraling mental health conditions can result from over-engagement with echo chambers.3 Paradoxically, the same factors that make peer-to-peer online communities such an effective mental health tool – ease of access, normalization, and trusted peers – are also what open the door to destructive behavior patterns.
A main appeal of these communities is that mental illness is destigmatized and normalized.2 This is a beneficial form of support, but some researchers argue that this normalization could lead otherwise healthy individuals to inaccurately self-diagnose as mentally ill.3,6
To the untrained eye, the criteria for what constitutes a mental illness are vague. Because these platforms often lack professional input, healthy individuals may mistakenly conclude that their normal emotional fluctuations are manifestations of mental illness. And since having a mental illness is normalized on these platforms, the group offers abundant support for this belief, reinforcing it further.3,6
Case study: TikTok tics
During the COVID-19 pandemic, the social media platform TikTok exploded in popularity. At the same time, doctors in the United States observed an unprecedented rise in Tourette’s symptoms among young people.4 When they assessed these teens, they found two things in common:
- Their tics lacked several markers of Tourette’s, such as the predominance of facial movements, and in some cases their symptoms were even more severe.
- The patients had high levels of interaction with popular TikTok creators who discussed their experiences with Tourette’s.
The doctors concluded that the TikTok tics showed markers of a “mass sociogenic illness”, a phenomenon where illness symptoms spread through a group without any actual infection.5 Indeed, they agreed that interaction with these online communities on TikTok was critical in the development of these mystery tics.5
The other challenge of echo chambers: romanticization
Mental health misdiagnoses can put undue stress on our healthcare system, as individuals draw on vital resources typically designated for those clinically diagnosed as mentally ill. However, misdiagnosis may be only the tip of the iceberg of the echo chamber problem.
Content moderation on these peer-to-peer mental health platforms is often inadequate, allowing misinformation about mental illness to spread. Because these communities function as echo chambers, negative feedback loops can emerge in which users validate each other’s negative thought patterns, further reinforcing unhealthy thoughts and behaviors.6
A prominent example of harmful echo chamber romanticization is the ‘pro-ana’ phenomenon: an online community on Tumblr, Pinterest, and TikTok that frames itself as a ‘support group’ for those with anorexia, but instead romanticizes and promotes eating disorders.7 By masking eating-disorder habits in the language of purity, beauty, and ‘thinspiration’, these groups promote behaviors that can lead to serious illness, and even death.
In order to understand how to fix these negative effects, we need to understand why we seek out echo chambers in the first place.
Why we build online echo chambers: Confirmation bias
Confirmation bias, explained:
We tend to base our identity on a few core beliefs about ourselves and the world. When we gather and look for information, we seek out data and narratives that affirm our beliefs in order to preserve our sense of identity. This tendency to notice, seek out, and interpret information in a way that affirms our existing beliefs is called confirmation bias.
Putting on the blinders: why it’s so hard to leave echo chambers
When we go online, this tendency is no different, but it’s magnified by machine learning algorithms. In order to personalize content to the user, social media algorithms feed us more of what we interact with, which siphons us into groups that affirm our beliefs.5
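To make that feedback loop concrete, here is a minimal toy simulation in Python – not any platform’s actual ranking code, and with every value invented for illustration – showing how engagement-based ranking, combined with a user’s confirmation bias, can narrow a feed toward a single framing over time.

```python
import random

# Hypothetical posts: "slant" runs from 0.0 (clinically framed content)
# to 1.0 (self-diagnosis framed content). All values are invented.
posts = [{"id": i, "slant": random.random()} for i in range(500)]

def rank_feed(all_posts, history, k=10):
    """Rank posts by closeness to the average slant the user has engaged with."""
    if not history:
        return random.sample(all_posts, k)  # cold start: show a random mix
    taste = sum(history) / len(history)
    return sorted(all_posts, key=lambda p: abs(p["slant"] - taste))[:k]

history = []
for session in range(20):
    feed = rank_feed(posts, history)
    # Confirmation bias: the user engages with the post that best matches
    # their existing suspicion, and the ranking learns from that click.
    clicked = max(feed, key=lambda p: p["slant"])
    history.append(clicked["slant"])

print(f"Average slant of content engaged with: {sum(history) / len(history):.2f}")
```

Even in this crude sketch, the ranked feed quickly collapses onto posts that sit close to whatever the user clicked first.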
Since we tend to avoid information that counters our beliefs, these echo chambers become very difficult to leave.6 This leaves us exposed to the toxic behavior patterns that emerge from groupthink.
In the context of mental health, this can be highly dangerous: if negative thought patterns are prevalent in the community, we have a very human tendency to adopt them.
Intercepting confirmation bias: a potential solution
Confirmation bias usually occurs early in the decision-making process: when we first go to these online forums, we are only entering with a suspicion about our own mental well-being, not a confirmed belief. If we can intercept the user at this point in the decision-making process, they can be insulated from the appealing pull of the echo chamber.
The Inoculation Method: Vaccinating against misinformation
During the COVID-19 pandemic, we learned two things:
- Viruses are dangerous enemies: They spread quickly, target vulnerable people, and can have devastating consequences.
- Vaccines are a powerful defense: By exposing the body to a weakened version of the virus, a vaccine allows it to detect the threat, develop antibodies, and fight off the real virus more effectively.
Information scientists argue that online misinformation works like a virus: it passes from one user to another, multiplies rapidly, and once it has infected enough people, it becomes difficult to control. As in medicine, however, we can prevent the proliferation of misinformation and echo chambers by using a psychological ‘vaccine’.8
Exposing users to a weakened version of an online mental health echo chamber – one that simulates its potentially toxic dynamics and makes them more salient – primes users to recognize these patterns and resist the tempting pull of confirmation bias.
How does inoculation work in practice?
Imagine you open up your preferred social media. Before you enter a mental health community, a small game pops up. It displays three examples of messages you could send to the group, each varying in intensity. Your goal is to pick the message that maximizes your “echo chamber score” - the message that feeds into the group’s biases as much as possible. When your echo-chamber score reaches a certain threshold, you are shown the consequences of your actions on others – misdiagnosis, isolation, and declining mental health.
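As a rough illustration of how such a game could be wired up, here is a hypothetical Python sketch, loosely modeled on the description above and on gamified inoculation tools such as Chamberbreaker,8 but not a reproduction of any real system. The messages, scores, threshold, and consequences are all invented.

```python
# All message texts, echo-chamber scores, and the threshold below are invented.
ROUNDS = [
    [
        ("Have you considered checking in with a professional first?", 0),
        ("Lots of us feel exactly like this – you're definitely one of us.", 3),
        ("You 100% have it. Doctors never understand; trust the group.", 5),
    ],
    [
        ("Here's a helpline and some reading from a clinic.", 0),
        ("Only this community really gets you – ignore everyone else.", 4),
        ("Your symptoms prove it; there's no need for a diagnosis.", 5),
    ],
]
THRESHOLD = 7
CONSEQUENCES = ["misdiagnosis", "isolation from offline support", "declining mental health"]

def play():
    """Ask the player to pick the reply that feeds the group's biases the most."""
    score = 0
    for options in ROUNDS:
        print("\nPick the message that maximizes your echo-chamber score:")
        for i, (text, _) in enumerate(options):
            print(f"  [{i}] {text}")
        choice = int(input("Your choice: "))
        score += options[choice][1]
        print(f"Echo-chamber score: {score}")
        if score >= THRESHOLD:
            print("Threshold reached – here is what this can do to others:")
            for consequence in CONSEQUENCES:
                print(f"  - {consequence}")
            break

if __name__ == "__main__":
    play()
```

The design choice mirrors the inoculation logic: the player practices producing the “infectious” messages in a safe setting, which makes the same patterns easier to spot and resist later.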
Similar gamified awareness tools have been shown to be effective in preventing users from falling into political echo chambers.8 Additionally, research has shown that interfaces optimized to display alternative political viewpoints are highly effective at preventing echo chamber formation.8 Clearly, increasing the salience and availability of alternative resources, viewpoints, and communities is key to counteracting the tempting pull of confirmation bias.
Inoculating against mental health echo chambers
In the mental health space, instead of providing an alternative political viewpoint, a designer can organize a webpage to display professional mental health resources alongside mental health forums, with built-in nudges that increase the likelihood that users adopt these resources. In this context, users can safely access the benefits of peer-to-peer platforms while seeing that professional and academic resources, viewpoints, and help are available to them.
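As a minimal sketch of that interface idea – assuming a simple feed data structure and an illustrative “one resource after every four peer posts” cadence, neither of which comes from the article’s sources – professional resources could be interleaved into the forum feed so that clinical help stays visible alongside peer support:

```python
# The resource list and the one-in-four cadence are illustrative, not a tested design.
PROFESSIONAL_RESOURCES = [
    {"title": "Talk to a licensed counsellor", "url": "https://example.org/counselling"},
    {"title": "Clinician-reviewed self-screening tools", "url": "https://example.org/screening"},
]

def build_feed(peer_posts, resources=PROFESSIONAL_RESOURCES, every=4):
    """Interleave one professional resource after every `every` peer posts."""
    feed, r = [], 0
    for i, post in enumerate(peer_posts, start=1):
        feed.append({"type": "peer_post", "content": post})
        if i % every == 0:  # nudge: keep professional help visible in the feed
            feed.append({"type": "resource", "content": resources[r % len(resources)]})
            r += 1
    return feed

# Example: nine peer posts produce a feed with two resource nudges mixed in.
sample_feed = build_feed([f"peer post {n}" for n in range(1, 10)])
```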
If the inoculation process and behavioral design are leveraged correctly, peer-to-peer platforms can retain their beneficial aspects while protecting their users, guiding them towards more moderated sources of information.
Behavioral Science: The antidote to mental health misinformation
As more people turn to online mental health communities for solidarity and care, we must adapt. Using behavioral science, we can transform peer-to-peer platforms to avoid the dangers of echo chambers and the proliferation of medical misinformation. Through a psychological inoculation program that increases the salience of confirmation bias, and behavioral design that nudges users towards proven resources, we can take advantage of all that peer-to-peer mental health care has to offer.
The Decision Lab is a behavioral consultancy that uses science to advance social good. As the global mental health crisis evolves, developing safe peer-to-peer mental health resources is critical. We’ve long been committed to building empathic, scalable mental health solutions: in our work with Wellness Together Canada, we leveraged behavioral design to increase engagement with professional online mental health resources for over 400,000 users. In a global mental health crisis, we understand that innovative solutions are needed. If you are interested in building a healthier future together, contact us.
References
- Broadband Search. (2022). Average time spent daily on social media. BroadbandSearch.net. Retrieved June 27, 2022, from https://www.broadbandsearch.net/blog/average-daily-time-on-social-media#:~:text=In%202019%2C%20the%20WHO%20estimated,media%20over%20their%20whole%20lifetime.
- Prescott, J., Hanley, T., & Ujhelyi, K. (2017). Peer communication in online mental health forums for young people: Directional and nondirectional support. JMIR Mental Health, 4(3). https://doi.org/10.2196/mental.6921
- Shrestha, A. (2018). Echo: The Romanticization of Mental Illness on Tumblr. The Undergraduate Research Journal of Psychology at UCLA, 5, 69–80. Retrieved 2022, from https://urjp.psych.ucla.edu/wp-content/uploads/sites/76/2018/06/URJP_2018.pdf
- Jargon, J. (2021, October 19). Teen Girls Are Developing Tics. Doctors Say TikTok Could Be a Factor. The Wall Street Journal. Retrieved July 4, 2022, from https://www.wsj.com/articles/teen-girls-are-developing-tics-doctors-say-tiktok-could-be-a-factor-11634389201.
- Olvera, C., Stebbins, G. T., Goetz, C. G., & Kompoliti, K. (2021). TikTok tics: A pandemic within a pandemic. Movement Disorders Clinical Practice, 8(8), 1200–1205. https://doi.org/10.1002/mdc3.13316
- Bind, A. S. (2013, October 28). Social Media Is Redefining 'Depression'. The Atlantic. Retrieved July 4, 2022, from https://www.theatlantic.com/health/archive/2013/10/social-media-is-redefining-depression/280818/.
- Gerrard, Y. (2020, March 9). TikTok Has a Pro-Anorexia Problem. Wired. Retrieved July 4, 2022, from https://www.wired.com/story/opinion-tiktok-has-a-pro-anorexia-problem/.
- Jeon, Y., Kim, B., Xiong, A., Lee, D., & Han, K. (2021). Chamberbreaker: Mitigating the Echo Chamber effect and supporting information hygiene through a gamified inoculation system. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1–26. https://doi.org/10.1145/3479859
About the Authors
Triumph Kerins
Triumph is passionate about understanding how human behavior influences our world. Whether it be global macroeconomics or neural networks, he is fascinated by how complex systems work, as well as how our own behavior can help create, sustain, and break these systems. He is currently pursuing a Bachelor’s degree in Economics and Psychology at McGill University, attempting to design an interdisciplinary approach to better understand all the quirks that make us human. He has experience in non-profit consulting, journalism, and research. Outside of work, you can find Triumph playing bass guitar, gardening, or down at a local basketball court.
Sekoul Krastev
Sekoul is a Co-Founder and Managing Director at The Decision Lab. A decision scientist with an MSc in Decision Neuroscience from McGill University, Sekoul’s work has been featured in peer-reviewed journals and has been presented at conferences around the world. Sekoul previously advised management on innovation and engagement strategy at The Boston Consulting Group as well as on online media strategy at Google. He has a deep interest in the applications of behavioral science to new technology and has published on these topics in places such as the Huffington Post and Strategy & Business.