The internet is rife with social media communities on many popular sites, including Twitter, Instagram, and Tumblr. #Ana, #Sue, #Cat… Translated, these tags mean anorexia, suicidal, and self-harm.
By searching these tags, online users are exposed to a slew of posts from people experiencing the mental illnesses or issues tagged. On Instagram you can find pictures of users with fresh cuts on their arms accompanied by the hashtag #Cat, indicating self-harm.
A study by Janis Whitlock, Jane Powers, and John Eckenrode at Cornell University found that the self-injury-related message boards studied were frequented mostly by females between the ages of 14 and 20.
With so many young people accessing social media, membership in these online communities can be concerning. In an interview with Vice, Frank Köhnlein, a youth psychiatrist at the University Clinic in Basel, discussed how young members are particularly vulnerable:
“Young people who are already fragile and perhaps already have experience with self-harm could be massively stimulated by this sort of thing and encouraged to self-harm again. When self-harm is glorified or—as in this case—put into an almost religious context, so that it is evaluated positively, the risk is particularly high.”
The Whitlock study showed that online interactions can reduce social isolation in adolescents since the exchanges allow teenagers to connect with others easily. In essence, social media can serve as a virtual support group where users gain instant help.
But the negative implications may outweigh the positive. Whitlock also found that participating in online self-harm message boards can normalize and encourage self-harm and can teach vulnerable individuals tactics for concealing self-injurious behaviours. Users within these virtual sub-communities often exchange techniques for self-harming as well.
A study by Carla Zdanow and Bianca Wright at the Department of Journalism, Media and Philosophy at the Nelson Mandela Metropolitan University looked at user statements in two Emo Facebook groups. They found that cutting was discussed often, with teenagers openly expressing affirmative opinions of these behaviours.
Sites like Instagram are aware of the precarious environment their users have created. When searching for tags like #Sue, a “Content Advisory” warning comes up reading, “Please be advised: These posts may contain graphic content. For information and support with suicide or self-harm please tap on Learn More.” The notice displays a link to Befrienders Worldwide, a site that provides emotional support to prevent suicide. Users can then choose to view posts or navigate away from their original search.
Instagram outlines the types of photos and videos that are “appropriate” for posting in its Community Guidelines, specifically singling out content promoting eating disorders and self-injury as not welcome in the community.
Megan Moreno at the Seattle Children’s Research Institute and colleagues conducted a study in which they identified 10 hashtags on Instagram related to non-suicidal self-injury (NSSI). A popular image outlining the code words for mental illnesses, titled #MySecretFamily, had over 1.5 million search results. Only one-third of the NSSI-related hashtags generated content advisory warnings, which means that the majority of this NSSI content is easily accessible to all Instagram users, regardless of age or mental state.
In an interview with A Stark Reality, a 15-year-old girl from Denmark, whose name was changed to protect her identity, described her relationship with this online community:
“The community means that I can express myself, and talk to people all over the world that feel the same way. Sometimes we cheer each other up, other times we drag each other down in that big, black hole called sadness.”
For better or worse, people reach out to online communities. Sites like Instagram and Twitter need to play a greater role in providing their users with mental health resources and accurate information.
– Abbiramy Sharvendiran, Contributing Writer