
Age-check tools shouldn't collect reams of minors' data, Irish watchdog says

By Matthew Newman

April 24, 2025, 20:32 GMT | Insight
Age verification technologies can help companies ensure that minors stay off their platforms, but these tools shouldn’t collect vast amounts of children’s data to verify their ages, a commissioner at Ireland’s privacy watchdog said today.

“It's not a sustainable position for companies to have to do their own age verification by gathering extreme amounts of data to verify ages,” Irish Data Protection Commissioner Dale Sunderland said at a conference* today. For example, it wouldn’t be “appropriate” for social media platforms to ask users for their passport data to verify age, he said.

He said that the EU’s effort to develop technologies, such as an interim age-verification app, could help companies comply with the EU’s General Data Protection Regulation and requirements in the Digital Services Act to keep harmful content from minors.

The European Commission is working with the bloc’s member states and some third-party companies on an app for age verification to be presented late this year (see here). This temporary solution should pave the way for the EU digital wallet, which is expected to be in place by the end of 2026.

The Irish DPC and the European Data Protection Board, the umbrella group of EU data protection authorities, are working with the commission to develop the app, he said.

“Where we want to get to is a place where there's less data intrusive, more privacy, enhanced solutions that can work for everybody,” Sunderland said.

“I think there are solutions which provide for appropriate age assurance and age verification using the minimum amount or very little personal data,” he said.

Age assurance methods — which include age verification, age estimation or a combination of both — are currently under scrutiny by regulators in the EU and in the UK.

In the EU, the question of responsibility is yet to be determined: Meta Platforms, the owner of Instagram, Facebook and WhatsApp, wants age verification on the operating system level, while Apple and Google, which operate the world’s most accessed app stores, argue for verification to take place at each platform or app (see here).

Sunderland said the discussion over technology is important, but the fundamental issue is ensuring that "data protection works for children, facilitates innovation, but in a way that provides all the protections for children online," which is the "ultimate objective."

Sunderland said he is concerned that AI is being used as a "shortcut" in place of a "proper assessment" of whether a platform is preventing minors from accessing its services.

“Commercial pressures mean that things will be done fast without the thoughtful consideration being given to the risks and harms that might be exposed to children,” he said.

Sunderland’s comments come as US authorities step up their efforts to protect children online. Children’s online data will be harder to sell to third parties beginning this summer under the US Federal Trade Commission’s new Children's Online Privacy Protection Act rule, which takes effect in June (see here).

Stacy Feuer, senior vice president at the Entertainment Software Rating Board, said companies face the challenge of ensuring that the tools they use for kids' privacy actually work.

“We've seen in some of the cases in the kids' space, but also elsewhere, that regulators are really concerned about tools that don't work,” she said.

Sunderland said the Irish watchdog has an enforcement record on children’s privacy. In 2023, TikTok was fined 345 million euros ($370 million) after a two-year investigation into its privacy settings, age verification process and transparency measures (see here).

In 2022, Instagram was ordered to pay 405 million euros (around $461 million) for a GDPR violation. The case concerned Instagram’s former policy of setting children’s profiles to public by default and requiring children operating business accounts to provide public contact details (see here).

* IAPP Global Privacy Summit 2025, Washington, DC, April 22-24, 2025.

Please email editors@mlex.com to contact the editorial staff regarding this story, or to submit the names of lawyers and advisers.
