Facebook’s Instagram records show a failure to deal with harmful posts
Instagram removed almost 80% fewer posts containing graphic content about self-harm or suicide between April and June this year than it did in the previous quarter.

Facebook, Instagram’s parent company, has released records showing quarterly statistics about Instagram’s social media posts for the year so far.

The records show that the removal of harmful posts returned to pre-Covid levels in the quarter between July and September.

Facebook has stated that most of its content moderators were sent home during the coronavirus pandemic, and that it prioritised the removal of the “most harmful content” during this time.

The children’s charity the NSPCC, however, has said that the drop in the removal of harmful content meant that “young people who needed protection from damaging content were let down”.

The statistics come amid a wider discussion in recent years about the responsibility of social media firms to tackle posts that could be harmful to their users.

Several high-profile cases of self-harm and suicide by young people influenced by social media have been brought to the public’s attention recently, and Facebook’s chief executive, Mark Zuckerberg, has given evidence before the US Congress about how his social media firms monitor content.

Maurice Hawthorne, Medical Director of Iodem, commented: “As a parent and a clinician, I am acutely aware of the challenges that social media presents for the mental health of certain groups – such as young people.”

In January 2020, the Royal College of Psychiatrists (RCP) published a report titled ‘Technology use and the mental health of children and young people’.

The report called for research to be carried out into how social media content impacts the mental health of young people.

Mr Hawthorne continued: “Despite concerns being highlighted by the RCP, and a call for much-needed research to be conducted, social media companies are not responding.

“When self-regulation fails, it is time for legislative action to protect young people and the NHS, which is trying to support them.”