Mind welcomes Online Safety Bill measures, but warns protection must be given to legitimate support groups
Mind welcomes details shared on the Online Safety Bill, which confirm the UK Government's intention to tackle content that encourages suicide or self-harm on social media and other major internet platforms. However, Mind also warns that tech companies must ensure any action to remove this content does not infringe on legitimate and valuable peer support, an important source of help for many people experiencing a mental health problem.
Paul Spencer, Head of Health, Policy and Campaigns at Mind, said:
“Mind is pleased to see the details shared today on provisions within the Online Safety Bill, which lay out a specific requirement for major online platforms to prevent harmful content on self-harm and suicide from being shared. In December 2021, two-thirds of people with mental health problems told Mind that they wanted the UK Government to set stricter rules to stop harmful content on the internet, which can often encourage or even enable people to hurt themselves, with tragic consequences.
"However, the use of social media can also be an important support mechanism for those with experience of self-harm and suicide. Discussion groups and support networks provide a vital resource for people to share experiences in a way that allows them to receive and give support in turn, and can help people in their recovery. Mind urges the UK Government to work with social media platforms such as Meta and Twitter to make sure any mechanisms to stop harmful content are sophisticated enough to allow effective peer support while still tackling the sinister and harmful content that can encourage self-harm or suicide.
"Mind would also urge the UK Government to examine ways to address some of the most harmful content that often appears on smaller websites, outside the larger social media platforms."