Instagram has announced restrictions on content related to self-harm and self-injury posted to its platform.
As reported by ABC, the decision followed the case of Molly Russell, a British teenager who died by suicide in 2017. Molly's parents blamed social media, particularly Instagram, after discovering that she had followed accounts featuring depressive content and material about suicide and self-harm.
Adam Mosseri, the head of Instagram, said the changes to Instagram's rules on self-injury content were made after discussions with mental health experts and academics who study adolescents and suicide.
"Over the past few months we have realized that we have not done enough to address self-harm and suicide content, and we want to provide more protection for Instagram users," Mosseri said in an online post, as quoted by phys.org.
One way Instagram filters such content is by removing non-graphic self-harm content from search results, hashtags, the Explore tab, and recommendations.
Mosseri added that Instagram will not remove all such content, because it does not want to stigmatize or isolate people who genuinely need help. Instead, Instagram will try to offer counseling or other sources of assistance to accounts that view or upload self-harm content.
"Throughout the discussions, experts, including mental health organizations such as save.org, stressed the importance of creating a safe space where people can share their experiences."