
Online platforms must improve CSAM protections, UK online safety regulator says

By Patricia Figueiredo (March 18, 2026, 17:16 GMT | Insight) -- Mainstream social media, search engine and pornography platforms must strengthen protections against child sexual abuse material or face enforcement action, a senior Ofcom official warned, as new regulator-commissioned research found their use is widespread amongst CSAM perpetrators. The findings show exposure often begins at a young age and can occur without active searching. The report also found emerging risks from AI-generated content.

Mainstream social media services, search engines and pornography platforms are widely used by perpetrators to access child sexual abuse material, and they must strengthen their measures against this kind of illegal content or face regulatory action, a senior official at the UK online safety watchdog said Wednesday....
