Report Harmful Content Sees 31% Rise in Reports Over 2023

Report Harmful Content has published its latest data from 2023, offering new insight into how the service supported people across the United Kingdom in reporting harmful content online and revealing the extent of the harmful content individuals encountered between January and December 2023.

Reports to the service increased by 31% during 2023, totalling 5,101 reports compared with 3,884 in 2022. Alongside this, the latest findings reveal the breadth of the service’s support and the importance of a national alternative dispute resolution service. The analysis shows that of the in-remit cases Report Harmful Content escalated to social media platforms, 89% of the reported content was successfully removed.

View the Report Harmful Content 2023 Report

Key Trends

Animal abuse, image abuse and allegations of abuse were among the top trends in cases during 2023. Report Harmful Content revealed that 36% of cases reported to the service between November and December 2023 involved animal abuse, and the service successfully removed 84% of that content.

A concerning trend of harassment targeting women from diverse cultural backgrounds was also revealed, in which individuals were maliciously impersonated on social media by ex-partners, friends, or family members. Last year, Report Harmful Content successfully removed 253 profiles designed to harass and degrade these individuals across social media platforms.

Harmful Content on Online Platforms

Whilst Report Harmful Content acts as a trusted flagger, mediating between users and 25 different industry platforms, the latest findings show that reports were made against a wide range of online services, with major social media platforms accounting for only a minority of the total.

In 2023, nearly half of the violent, pornographic and self-harm or suicide-related content reported to the service was found on smaller, independent platforms. The data highlights the need for a more inclusive regulatory framework and an independent single point of contact, rather than assuming that signposting to major platforms is all that is required.

Shaping the Online Safety Act

2023 was a significant year for Report Harmful Content and the UK Safer Internet Centre, with the Don’t Steal My Right to Appeal campaign highlighting the importance of an alternative dispute resolution process within the Act.

As a result of the campaign, it was decided that Ofcom would report on its ongoing enforcement against harmful online content and would review after two years whether an alternative dispute resolution process is needed. Find out more about the Don't Steal My Right to Appeal campaign.

Report Harmful Content

Report Harmful Content is a national alternative dispute resolution service provided by the UK Safer Internet Centre and operated by SWGfL, empowering anyone over the age of 13 and living in the UK to report harmful content online.

To find out how to make a report to social media platforms, or to access further advice and support, anyone over the age of 13 and living in the UK can contact Report Harmful Content. The service is staffed Monday to Friday, 10 am to 4 pm, although reports can be submitted 24/7. A helpline practitioner will review and respond to each report within 72 hours.

Visit Report Harmful Content
