Facial filters are a key part of California's high-stakes addictive-design trial against Meta Platforms in Los Angeles, but a new ruling by a federal judge in Illinois, holding that the state's biometric privacy law applies to Meta's capture of facial geometries to create filters for its Messenger and Messenger Kids apps, shows that such filters can also expose companies to privacy liability.

Interactive filters that alter a person's appearance, whether to make users look more attractive or younger, to transform them into a cat or a dragon, or to convert them to a different gender, have been a fixture of social media for more than a decade, since they debuted on Snapchat in 2015.
In ongoing trials in Los Angeles and in Santa Fe, New Mexico, addiction experts testified that filters can cause body dysmorphia and other mental health problems in younger users. But a suit filed against Meta in 2023 made a different legal claim. Filters, in this case embedded in Meta’s Messenger and Messenger Kids apps, allegedly violated an Illinois biometric privacy law by capturing the facial geometries of users without first getting their consent (see here).
The case filed in federal court in East St. Louis, Ill., involves the OG of state privacy laws: the 2008 Illinois Biometric Information Privacy Act. BIPA has forced a long list of tech companies, including Meta (see here), Google (see here), TikTok (see here) and Clearview AI (see here), to pay large privacy settlements in class-action cases.
A ruling Friday by US District Judge Nancy Rosenstengel (see here) denied Meta's motion for summary judgment, rejecting its argument that the California Consumer Privacy Act, not BIPA, should govern the case. The decision puts Meta at risk of another costly biometric privacy outcome in a class-action case.
Rosenstengel said that because BIPA provided superior privacy protections to CCPA, particularly for Illinois residents such as named plaintiffs Rebecca Hartman and Joseph Turner and Hartman’s minor children “R.H.” and “E.T.”, the claims against Meta could not be decided under the newer California privacy law.
While Rosenstengel agreed with Meta that “California has an interest in having its laws applied to companies that call California home,” she concluded California’s interest was outweighed by Illinois’ interest in protecting its residents’ privacy through BIPA, even though the CCPA includes heightened protections for biometric data.
“Illinois has a materially greater interest in this litigation than California because the application of California law would result in the evisceration of one of the state’s critical pieces of privacy legislation. And this is so, even though Meta is domiciled [in] California and Plaintiffs occasionally used Meta’s products outside of Illinois,” Rosenstengel wrote. “In sum, the Court declines to apply California law in this case because doing so would violate a fundamental public policy of Illinois and Illinois has a materially greater interest in this litigation than California. Accordingly, Illinois law will govern the case going forward.”
Notably, Rosenstengel observed, Illinois lawmakers included a private right of action in BIPA, while California lawmakers added only a very limited private right of action in CCPA — one that applies only to data breaches, and that leaves most enforcement to the California attorney general or the California Privacy Protection Agency, not to individual consumers.
Because the plaintiffs’ claims were entirely based on BIPA, a finding that the CCPA applied would have decided the case in favor of Meta, “because the application of California’s substantive law necessarily forecloses claims under BIPA,” the judge noted in a footnote.
Rosenstengel previously denied Meta’s motion to dismiss the claims in 2024, finding that BIPA was not preempted by the US Children’s Online Privacy Protection Act (see here). But she deferred ruling on Meta’s challenge to the choice of Illinois law over California law, allowing some discovery to take place that formed the basis for Friday’s summary judgment decision.
In the ongoing social media trials in Los Angeles and Santa Fe that began in early February, testimony about the facial filters used on Instagram (see here) focused on claims about how they contributed to the addiction of minors to the platforms, not on the privacy harm caused by the collection of biometric data to build the filters.
In the Southern District of Illinois case, however, the focus is on the alleged illegal collection and use of the plaintiffs’ biometric data, not on how the use of filters may have affected mental health. The plaintiffs said in their summary judgment briefing (see here) that Meta disabled Messenger’s augmented reality filters in 2022 in Illinois and Texas, two states with biometric privacy laws.
Prior to that, Meta “used face geometries to model users’ faces and track the users' expressions in real time, which is often referred to as 'intelligent recognition'," the 2023 complaint by Hartman and Turner said.
Meta's Messenger and Messenger Kids apps, the plaintiffs said, both “used scans of face geometry to identify individuals' location, expressions, and movements, thereby collecting and possessing biometric information locally on the operating device as well as collecting and possessing biometric information on Defendant's servers.”
As the case moves forward in East St. Louis, Meta faces a deadline this Friday to file its second amended answer and affirmative defenses.
Please email editors@mlex.com to contact the editorial staff regarding this story, or to submit the names of lawyers and advisers.