
Expand definition of CSAM to cover AI-generated deepfakes, Unicef urges

(February 4, 2026, 15:04 GMT | Official Statement) -- MLex Summary: Governments should expand legal definitions of child sexual abuse material, also known as CSAM, to include AI-generated content, Unicef has urged in a statement. The UN agency for children called on governments, developers and tech companies to act against sexualized deepfakes. It said a new study across 11 countries found that 1.2 million children had their images turned into sexually explicit deepfakes in the past year. Statement follows....
