Ofcom Calls for Social Media Algorithm Adjustments to Combat Misinformation

In the aftermath of the tragic Southport murders and subsequent riots that shook the nation this summer, Ofcom chief executive Melanie Dawes is urging social media companies to make critical adjustments to their algorithms to prevent the spread of misinformation. The call to action comes as investigations reveal the significant role online platforms played in amplifying divisive narratives and false claims about the attacker’s identity in the days following the incident.

Viral Misinformation Fuels Unrest

Despite efforts by tech firms to combat harmful content, posts containing unverified allegations and hateful rhetoric about the Southport attacker spread rapidly across social media, reaching millions of users. Some high-profile accounts falsely claimed the perpetrator was a Muslim asylum seeker, inciting racial and religious hatred.

Posts about the Southport incident and subsequent events from high-profile accounts reached millions of users, demonstrating the role that virality and algorithmic recommendations can play in driving divisive narratives in a crisis period.

– Melanie Dawes, Ofcom Chief Executive

The viral misinformation had real-world consequences, with some online groups directing demonstrations, and in some cases violence, at local mosques and asylum accommodation. According to Dawes, “There was a clear connection between online activity and violent disorder seen on UK streets.”

Uneven Platform Responses

While some social media platforms took swift action to remove harmful posts and suspend accounts spreading misinformation, Dawes noted that responses were “uneven” across the industry. False claims about the attacker’s identity continued to circulate on certain platforms for days, even as evidence mounted of their inaccuracy and potential to stir up hatred.

The varying effectiveness of tech companies’ moderation efforts highlights the need for more consistent and robust processes to identify and remove illegal content, especially during high-stakes events like the Southport incident. Platforms must also be prepared to handle the sheer volume of problematic posts that can emerge in a crisis, with some platforms reporting tens of thousands of violations.

Algorithmic Adjustments and Increased Accountability

To address these challenges, Dawes is calling on social media firms to make fundamental changes to their algorithms, particularly for content served to children and young people. Under Ofcom’s proposed online safety codes, platforms would be required to adjust their recommender systems to downrank illegal or harmful content, reducing its visibility and virality (a simplified sketch of how downranking works follows the list below).

  • Algorithms must prioritize safe, factual content over engagement metrics
  • Faster, more comprehensive removal of posts violating content policies
  • Robust crisis response protocols for swift action during high-risk events
  • Increased transparency and accountability measures for content moderation decisions
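
To make the “downrank” idea concrete, here is a minimal Python sketch of an engagement-based feed that applies a visibility penalty once a harm classifier flags a post. Everything in it (the Post structure, harm_score, HARM_THRESHOLD, and DOWNRANK_FACTOR) is an illustrative assumption for exposition, not any platform’s or Ofcom’s actual specification.

    from dataclasses import dataclass

    @dataclass
    class Post:
        post_id: str
        engagement_score: float  # proxy for likes, shares, watch time
        harm_score: float        # hypothetical classifier output: 0.0 benign .. 1.0 harmful

    # Illustrative tuning constants, not real platform values.
    HARM_THRESHOLD = 0.7    # classifier confidence above which the penalty applies
    DOWNRANK_FACTOR = 0.1   # flagged posts keep only 10% of their ranking score

    def feed_score(post: Post) -> float:
        """Rank by engagement, but sharply reduce visibility of flagged content."""
        if post.harm_score >= HARM_THRESHOLD:
            return post.engagement_score * DOWNRANK_FACTOR
        return post.engagement_score

    def rank_feed(candidates: list[Post]) -> list[Post]:
        """Order candidate posts for a user's feed, highest adjusted score first."""
        return sorted(candidates, key=feed_score, reverse=True)

    if __name__ == "__main__":
        posts = [
            Post("viral-rumour", engagement_score=95.0, harm_score=0.9),
            Post("local-report", engagement_score=40.0, harm_score=0.1),
            Post("eyewitness", engagement_score=60.0, harm_score=0.2),
        ]
        for p in rank_feed(posts):
            print(p.post_id, round(feed_score(p), 1))
        # A pure engagement ranking would put "viral-rumour" first;
        # with the penalty it drops to the bottom (score 9.5).

In a real system the harm score would come from moderation classifiers or human review and would interact with many other ranking signals; the point of the sketch is only that downranking changes what goes viral without deleting anything.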

While the Online Safety Act’s duties have yet to come into force, Dawes emphasizes that tech companies should not wait to implement these critical changes. The Southport tragedy serves as a stark reminder of the real-world harm that can result from unchecked online misinformation and the urgent need for platforms to take proactive steps to safeguard their users.

Building Media Literacy and Public Awareness

Beyond algorithmic adjustments, Dawes also highlights the importance of promoting media literacy to help the public navigate the complexities of the online information landscape. Equipping individuals with the skills to critically evaluate sources, spot potential misinformation, and protect themselves and others from harmful content is a crucial component of fostering a safer digital environment.

These events have clearly highlighted questions tech firms will need to address as the duties come into force. While some told us they took action to limit the spread of illegal content, we have seen evidence that it nonetheless proliferated.

– Melanie Dawes, Ofcom Chief Executive

As the Online Safety Act moves closer to full implementation, Ofcom is committed to working with social media companies, policymakers, and the public to establish clear expectations and enforcement mechanisms. The goal is to create an online environment that fosters open dialogue and free expression while minimizing the risk of harm from misinformation and illegal content.

The Southport incident serves as a sobering case study of the challenges that lie ahead, but also an opportunity for meaningful change. By adjusting algorithms, strengthening content moderation, and empowering users with media literacy skills, platforms and regulators can work towards a future where the immense potential of social media is not overshadowed by its worst abuses.