
OpenAI’s 'hallucination' warnings helped sink radio host’s US defamation claims

By Emma Whitford (May 20, 2025, 19:24 GMT | Insight) -- False information about a conservative Georgia radio host "hallucinated" by ChatGPT could not reasonably be accepted as fact, a state judge ruled yesterday, saving OpenAI from Mark Walters' 2023 defamation claims. OpenAI's disclaimers about the possibility of inaccurate outputs helped its case, according to the order on summary judgment.
