Meta complies with child-protection rule in Indonesia, YouTube not yet
By Roffie Kuniawan (April 10, 2026, 06:23 GMT | Insight) -- Meta Platforms has fulfilled its child-protection obligations under Indonesian government rules, the Ministry of Communication and Digital Affairs announced, following an examination conducted early this week. Google's video-sharing platform YouTube, however, has yet to comply and has been issued a warning letter.

"Today, we commend Meta, which houses Instagram, Facebook and Threads, for aligning its features and services with Indonesian law," Minister of Communication and Digital Affairs Meutya Hafid said Thursday.

She made the statement after the digital ministry examined the two platforms on Tuesday to assess whether they have complied with the child-protection regulation.

Major global social-media platforms operating in Indonesia have begun adjusting their services, with some still modifying their platforms in consultation with the government, after new child-safety rules for digital platforms took effect on March 28.

Ministerial Regulation No. 9 of 2026 (see here), issued by the Ministry of Communication and Digital Affairs last month, sets out detailed technical and reporting obligations for electronic-system providers. It operationalizes Indonesia's overarching child online-protection framework, Government Regulation No. 17 of 2025 (see here), known as PP Tunas, which was issued last year to establish the legal basis for age-based access restrictions and platform-compliance obligations.

The regulation establishes new minimum-age requirements for online platforms, including a minimum age of 16 for high-risk platforms, unless a risk-based self-assessment process determines the platform is low-risk.

Under the regulation, digital platforms must conduct structured self-assessments to determine whether their products, services and features present a high or low risk to children. They must then implement age-verification mechanisms and parental-control tools aligned with their assigned risk profile.

Meta announced Thursday that it had set a minimum age of 16 across its platforms and adjusted its community policies.

"We have verified this compliance. This demonstrates that the adjustment is not a technical issue, but rather a matter of the platform's commitment to protecting children and respecting national law," Meutya said.

Although Meta has complied, full oversight remains in place. The digital ministry stressed that implementation of the child-protection rules is subject to regular evaluations.

Meta said that "keeping teens safe online is our top priority" and expressed support for the Indonesian government's new regulation....