Instagram, owned by Facebook, is currently one of the most popular social media platforms. It has proven to be an effective tool for users, especially teens, to stay in touch and share their social lives with friends.
However, over the last couple of years, the platform has come under intense scrutiny due to cases of suicide and self-harm linked to it. The first suicide case that alerted the world to the potential threat the platform poses was that of a 14-year-old British teenager, Molly Russell.
In 2017, Russell took her life after viewing graphic content on Instagram.
In another similar case, a 16-year-old girl in Malaysia reportedly killed herself after she posted a poll on her Instagram account asking followers whether she should die. About 69% of the respondents voted that she should, notes a report from The Guardian.
A primary reason for such cases is that teens treat metrics such as likes and the number of comments a post gets as indicators of their popularity in real life. Those who fall short on these metrics often consider themselves failures, which can push them toward drastic actions, including suicide.
Following such cases, Instagram has taken a slew of measures to tackle the issue and make the platform safer. In February this year, the Facebook-owned company announced it would roll out “sensitivity screens” to block images of self-harm.
Moreover, the platform also aims to restrict such content by not recommending it in searches. This is the kind of imagery Molly Russell is believed to have viewed on Instagram before taking her life in 2017.
Instagram is also taking measures to reduce the negative effects of the social media metrics discussed above. The platform is running tests in a few countries in which like counts on public posts are hidden, meaning other people can no longer see how many likes a post has received.
Last month, the platform also expanded its ban on graphic self-harm imagery.
The ban now covers more suicide-related content, such as drawings of suicide methods, illustrations of self-harm (including cartoons and memes), and more, Adam Mosseri, Head of Instagram, said in a blog post last month.
“We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods,” Mosseri wrote.
Instagram is also in talks with academics and mental health organizations, such as the National Suicide Prevention Lifeline in the US and Samaritans in the UK, to come up with suitable measures.
Instagram, on its end, seems to be doing what it can. Proper implementation of these measures, however, is the key. Mosseri also acknowledges that it will take some time to fully implement the new measures, while assuring that more are in the pipeline.
Despite such measures, it isn’t clear how much time it will take to purge the platform of such content completely. To expedite these measures and ensure their effectiveness, we can contribute to the cause as well by keeping an eye out for suspicious posts and flagging them.
One prime example of this sort of vigilance is 22-year-old Norwegian Ingebjørg Blindheim. Nicknamed “the lifeguard,” she tracks ‘dark’ Instagram accounts to help suicidal users.
She has no formal training in such matters, nor does Instagram pay her for her services. Yet she constantly scours the platform to help users in distress, according to the BBC.