Government social media age restrictions, such as the one coming into force this week in Australia, or broad content-moderation laws such as those in the EU and the UK, are impossible in the US because they are in fatal conflict with First Amendment rights to free speech. There are moves to edge around this constitutional obstacle, but one effect is that platforms are choosing to block access to all minors — or even to everyone in certain states.
Social-media platform Bluesky made a painful regulatory call this year: Rather than comply with a new children’s online safety law in the US state of Mississippi by blocking access to minors, it would instead block everyone in the state. The cost of verifying who is a child and who is an adult to satisfy the law’s requirement that platforms confirm parental consent was just too much for its small compliance team, Bluesky said.
“We believe effective child safety policies should be carefully tailored to address real harms without creating huge obstacles for smaller providers and resulting in negative consequences for free expression,” the platform explained.
Increasingly, governments around the world are moving toward stricter age limits. Rather than relying on traditional content-moderation regulation, which requires platforms to remove or block violent or sexual content, they bar children under a certain age from social media entirely.
The EU is moving toward establishing a “digital age of majority” that would set a legal age limit for access to social media across Europe (see here). A social media ban in Australia for under-16s takes effect Dec. 10 (see here). Indonesia this year adopted a ban for under-16s, although it won’t be enforced until 2027 (see here).
Snap told investors in November that it’s rushing to roll out an age-verification system across its entire Snapchat platform to identify users under 16 and under 13, to comply with age-related bans such as Australia’s (see here). More countries are likely to follow with similar rules, Snap said.
In the United States, many state lawmakers are also trying to raise a legal wall between minors and online content deemed unfit for them to see. Whether those laws can survive constitutional scrutiny, however, remains uncertain.
— First Amendment limits —
The US is unique in the world in making the protection of free expression from government control pre-eminent over all other rights. The First Amendment in the Constitution’s Bill of Rights directs that “Congress shall make no law … abridging the freedom of speech or of the press … .”
But with the US and other countries facing a youth mental-health crisis that the US Surgeon General in 2024 said is associated with social media use, legislative and regulatory efforts to protect children online are butting up against the First Amendment.
Bespoke age restrictions imposed by governments, such as in Australia, or broad content-moderation laws such as the EU’s Digital Services Act and the UK’s Online Safety Act, are impossible in the US because they are in fatal conflict with the First Amendment.
So instead of trying to regulate content, the US — and increasingly other jurisdictions around the world — are blocking minors’ access to social media altogether.
During the first three decades of the commercial Internet, the US Supreme Court and lower courts consistently found federal and state laws unconstitutional if they — beyond the most narrowly tailored exceptions — limited Americans’ access to words and images online. If a government attempt to protect the safety of minors had the secondary effect of burdening the free-speech rights of adults, the law was generally ruled unconstitutional.
This year, however, that has started to change. A more conservative Supreme Court departed from its practice over the past 30 years to decisively endorse a lower standard of First Amendment scrutiny, allowing states and the federal government to limit what content Internet users under 18 can see.
One result is that the US could be moving toward an online landscape where entire platforms such as Bluesky or Nextdoor are barred to anyone under 18 or anyone living in certain states. The result could be a more limited, checkerboard version of Australia’s blanket ban. Complicating matters, though, bans in the US are being self-imposed by platforms that fear regulatory sanctions.
Some constitutional scholars say the US Supreme Court’s June decision in Free Speech Coalition v. Paxton, upholding the constitutionality of a Texas law that requires pornographic websites to use age-gating technology to block anyone under age 18, could make it easier for states to defend laws such as Mississippi’s that are prompting platforms such as Bluesky — a general-interest social media platform with content legal for minors to view — to wall off portions of the Internet from adults and minors alike.
The Free Speech Coalition decision “serves as an invitation for states to push the limits and see what they can get away with,” Ari Cohn, lead counsel for tech policy with the Foundation for Individual Rights and Expression, told MLex.
States are already citing the Free Speech Coalition decision as they defend online age-gating laws against First Amendment challenges, but that strategy is unlikely to succeed, believes Paul Taske, a lead lawyer for NetChoice, the Internet industry organization challenging many of the state age-gating laws.
Justice Clarence Thomas, who wrote the decision, was clear that the Supreme Court was “grappling with a unique circumstance” in dealing with content that was First Amendment-protected for adults, but not for minors, Taske said: “I really don't see this sort of contagion effect going anywhere.”
— Real effects —
Perhaps not, but a raft of new state laws intended to protect teens from the mental-health impacts of social media content increasingly require not just porn websites, but all social media, to segregate the experience of users under the age of 18 from that of adults. In states such as Mississippi, Tennessee and California, those laws apply to social-media platforms where content is lawful for minors to see, not just adult sites where content is illegal for minors to view.
While some constitutional scholars believe the Supreme Court will ultimately strike those laws as unconstitutional, no one knows for sure. One long-term effect of regulatory risk could be that Americans of different ages, living in different states, or using different social media platforms may increasingly experience a different Internet.
That is already happening. Nextdoor, a social network where people typically post about neighborhood garage sales, is now blocking under-18s from creating accounts in Mississippi and Tennessee, two states with strict age-verification laws, just as Nextdoor does in the UK in response to the Online Safety Act.
Bluesky and another smaller platform, Dreamwidth (see here), have gone even further. They are blocking access to all users — adults and children — from Mississippi after the Supreme Court in August allowed the state’s Walker Montgomery Protecting Children Online Act (HB 1126) to go into effect, despite a statement by Justice Brett Kavanaugh that the law is likely unconstitutional.
The smaller platforms say they cannot shoulder the compliance costs of having to create systems to verify the age of every user — no matter how old — and the regulatory risk of the potentially massive fines that could result if they were found to violate the Mississippi law, which carries a penalty of up to $10,000 per violation.
Regulatory risk of that amount “is an existential threat” for a smaller platform, Denise Paolucci, one of the owners of Dreamwidth, wrote in a post explaining its decision to block all users with Mississippi IP addresses.
The constitutionality of the Mississippi law, which requires platforms to “make commercially reasonable efforts to verify the age of the person creating an account,” and requires parental permission for minors to open a social media account, is being challenged by NetChoice in the US Court of Appeals for the Fifth Circuit.
“This is going to be probably the first major test of whether these attempts to broaden [the Supreme Court] Free Speech Coalition [decision] are going to hold,” Cohn told MLex.
The American Civil Liberties Union and other digital rights groups have filed briefs warning that the Mississippi law prevents both minors and adults from accessing speech that they have a constitutional right to access, blocks adults “when they cannot verify their age or are unwilling to do so,” and increases the risks of privacy invasions and data breaches.
In the Free Speech Coalition case, the Supreme Court’s conservative supermajority backed the Texas anti-porn law by a 6-3 margin. But Justice Elena Kagan, writing the dissent for the three liberal justices, said the majority erred by not applying the “strict scrutiny” standard to a law that limits content.
Under that standard, a court can find a law constitutional only if it is as narrowly tailored as possible to achieve its public purpose — in this case, preventing children from viewing pornography.
— Departure from ‘strict scrutiny’ —
In the earliest days of the commercial Internet, the US Congress passed the Communications Decency Act, or CDA, which prohibited websites from transmitting “indecent” or “patently offensive” material to anyone under 18. Aiming not to violate the free-speech rights of adults, the CDA included an age-verification system through which adults could use a credit card number or a personal identification number to verify their age.
It didn’t work. By a decisive 7-2 margin in a 1997 decision called Reno v. ACLU, the Supreme Court struck down most of the CDA as violating the First Amendment.
“Although the government has an interest in protecting children from potentially harmful materials, the CDA pursues that interest by suppressing a large amount of speech that adults have a constitutional right to send and receive,” Justice John Paul Stevens wrote for the majority.
The only two justices who believed in 1997 that parts of the CDA could stand were Sandra Day O’Connor and Chief Justice William Rehnquist. Three decades on, the court’s stance has shifted: Thomas, who joined the Reno majority, now writes for a majority far more willing to uphold online age-verification laws.
With the Texas Free Speech Coalition case, the court confronted a gray area, because while the ability of children to view pornographic images is not protected by the First Amendment, the right of adults to view such content is.
Thomas, who wrote the majority opinion, acknowledged that while an age-verification law impacts the free-speech rights of both adults and minors, the impact on adults is “only incidental.” The Texas law could be analyzed by a less restrictive, “intermediate scrutiny” standard, Thomas concluded.
Adults “have no First Amendment right to avoid age verification, and the statute can readily be understood as an effort to restrict minors’ access,” Thomas wrote.
If the Fifth Circuit concludes that the burden on the free-speech rights of adults from the Mississippi law is “only incidental,” in line with the Supreme Court’s finding over the Texas pornography law, it could follow Thomas’ precedent of analyzing the law under the intermediate scrutiny standard.
Whether the high court will uphold other laws that have an incidental impact on the free-speech rights of both adults and minors, as in Mississippi, remains to be seen.
With reporting by Sara Brandstätter in Brussels, Patricia Figueiredo in London, and Maria Dinzeo in San Francisco.
Please email editors@mlex.com to contact the editorial staff regarding this story, or to submit the names of lawyers and advisers.