Facebook Is Getting Stricter to Protect the Mental Health of Its Users

Timed to coincide with World Suicide Prevention Day, Facebook has announced major changes intended to make its platforms more mental-health friendly. According to Antigone Davis, Global Head of Safety, Facebook plans to redesign Instagram and the main Facebook platform so that searches no longer surface harmful content related to mental health issues. The changes do not target mental health awareness content; rather, they target graphic images that can push vulnerable people toward harm. The issues singled out include suicide, eating disorders, self-harm, and any content that could encourage people to do something damaging to themselves.

The decision comes at a time when many people blame social media platforms for mental health problems. Critics argue that a life lived in public on social media, with constant pressure to seek validation from others, turns everyday experience into a source of stress. That pressure, they contend, is the root cause of the problem, yet it is one the company has not directly addressed.

Instead, Facebook has decided to restrict graphic images that depict mental health disorders. Suicide is the typical example: even in search results, Facebook will reduce the availability of images that connect users to suicide-related content. Images depicting self-harm will likewise be removed, even when the scars shown are healing or have healed. Similarly, images of emaciated bodies that glamorize starvation will be restricted, including pictures emphasizing protruding ribs, spines, concave stomachs, collarbones, or any other imagery suggesting that starving yourself makes you look pretty.


The main issue is that even after restricting such images, Facebook has done little about the advertisements and diet pills that promote dieting and feed into the root causes of depression and suicide. The only additional resource offered to people displaying such symptoms is the #chatsafe guidelines. To sustain the effort in the long run, Facebook has also hired a safety policy manager to advise on these sensitive matters. Here is the list of help centers around the world if you need to find a suicide helpline for yourself or a friend.

Facebook is Tightening its Policies and Expanding Resources to Prevent Suicide and Self-Harm
