Facebook spent a combined $15 billion on stock buybacks in 2017 and 2018 while Chief Executive Mark Zuckerberg deliberately chose to underfund teams tasked with trust and safety work on Instagram, the US Federal Trade Commission said during antitrust trial proceedings today in Washington, DC, federal court.
Roughly two hours into trial testimony from Meta Platforms Chief Information Security Officer Guy Rosen, the federal judge overseeing the FTC’s monopolization litigation nudged agency attorney Susan Musser to move things along.
The FTC had “spent a lot of time” and delved into an “awful lot” of details to try to drive home the point that Meta did not devote sufficient resources to support Instagram, and thus that the photo-sharing app would have been better off absent Facebook’s $1 billion acquisition in 2012, US District Judge James Boasberg said.
Those details included an internal 2019 risk assessment documenting that Instagram’s algorithms were recommending that millions of minors follow accounts identified as “groomers” and that “27% of all follow recommendations to groomers were minors.”
“I don’t think Mark understands the urgency of working on integrity related issues at IG,” Instagram co-founder Kevin Systrom said in an internal 2017 e-mail referenced by the FTC during today’s proceedings.
Systrom asked colleagues for “anecdotes and links” to supplement his lobbying effort. “I’m assuming the child killing himself on [Instagram Live] is an important one, but I think there were others (a suicide?),” he wrote.
Musser wrapped up her direct examination of Rosen shortly after Boasberg’s chiding, but not before contrasting Facebook’s stock buybacks in 2017 and 2018 with 2019 internal documents in which Rosen said Zuckerberg had made a “deliberate” decision to deny additional resources to Instagram’s integrity team because the CEO “thought IG has another year or two to catch up.”
“In 2017 and 2018 combined, Meta bought $15 billion in stock instead of reinvesting that money into integrity. Is that right?” Musser asked.
Rosen, who took the helm of Meta’s centralized integrity team in 2017, countered that Meta has invested approximately $30 billion over the last decade in formulating and enforcing rules that prohibit certain content and behavior on its platforms.
“I think we grew almost as fast as we could throughout these years,” Rosen testified.
— An ancillary point —
Meta currently has about 40,000 people working on safety and security issues across Facebook, Instagram, WhatsApp and Facebook Messenger globally, Rosen said during cross-examination.
As Meta’s counsel delved into the operations of in-house content moderation tools and platform integrity systems bearing names like Sentry, Omega, Black Hole and Karma, Boasberg interjected again.
Meta seemed to be “falling into Ms. Musser’s trap” by dwelling on a “fairly basic” point “that can be supported with certain data,” the judge said.
Attorney Leslie Pope pushed back, citing a need to respond to FTC arguments about Meta’s integrity program while also alluding to scheduled testimony from an expert later in the afternoon that would touch on similar topics.
“I think it’s an ancillary point to begin with,” Boasberg added, requesting that the parties “accelerate” their arguments on the topic.
In April, the judge denied a motion in limine from Meta to exclude “inflammatory material relating to the FTC’s legally irrelevant assertions about user mental health and how some users might misuse Meta’s services.”
“The FTC explains that it seeks to admit such material to rebut Meta’s assertion that its acquisition of Instagram was procompetitive because its ‘tools and expertise helped Instagram attack its integrity problems at scale,’” Boasberg wrote, calling evidence tending to address a central element of a party’s defense “plainly relevant.”
— Expert testimony —
Damon McCoy, a computer science professor at NYU, testified for the FTC as an expert witness on the integrity systems that online platforms use to address harmful and objectionable content and behavior.
The FTC tasked McCoy with assessing whether Meta’s acquisitions were necessary to achieve purported integrity benefits claimed by the company.
Along with the procompetitive justification cited in Boasberg’s order on Meta’s motion in limine, the company also claims that its integrity systems and know-how provided WhatsApp with “significant and unique benefits.”
McCoy also looked at whether the claimed integrity benefits could have been achieved by WhatsApp or Instagram without Meta’s acquisition.
At one point, Boasberg interjected to ask McCoy whether a document in his presentation stood for the “opposite point” he was trying to make.
Meta’s integrity techniques and know-how “were and are not unique,” according to McCoy, who said he had to rely on a qualitative analysis because quantitative integrity metrics were lacking: WhatsApp had none, and Instagram did not begin keeping them until 2021.
“Facebook is an industry leader in this area, and we are able to leverage their review expertise” and industry partnerships, one Instagram employee said in a May 2014 e-mail discussion with colleagues about an integrity tool known as PhotoDNA.
PhotoDNA is technology designed to prevent the online spread of child sexual abuse material, or CSAM, by creating digital fingerprints of images that are then compared to a database of known CSAM images.
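PhotoDNA’s hashing algorithm itself is proprietary to Microsoft, so the sketch below is only an illustration of the general fingerprint-and-match approach described above, not the actual tool: it substitutes a simple average hash for PhotoDNA’s robust hash, and the database contents, distance threshold and file names are hypothetical.

```python
# Illustrative sketch only -- not PhotoDNA. Shows the general idea of
# fingerprinting an image and comparing it against a database of known hashes.
from PIL import Image  # Pillow, assumed available


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to a small grayscale thumbnail and encode each pixel
    as one bit: 1 if brighter than the mean, 0 otherwise."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def matches_known(fingerprint: int, known_hashes: set[int], max_distance: int = 5) -> bool:
    """Return True if the fingerprint is within a small Hamming distance of any
    hash in the known database, tolerating minor resizing or recompression."""
    return any(bin(fingerprint ^ known).count("1") <= max_distance
               for known in known_hashes)


# Hypothetical usage: 'known_db' would be populated from a vetted hash list.
# known_db = {...}
# if matches_known(average_hash("upload.jpg"), known_db):
#     flag_for_review()
```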
“What it says there seems to contradict what you’re saying,” Boasberg told McCoy.
Pre-acquisition, Facebook had been employing PhotoDNA and Instagram had not, McCoy acknowledged. But Instagram could have obtained the tool for free on its own, he said.
“Based on my experience ... it wasn’t necessary for Instagram to have PhotoDNA,” he said.
McCoy also got a rough ride during Meta’s cross-examination, at times appearing flustered and struggling to locate documents in the evidence binders in front of him.
The NYU professor and co-director of a cybersecurity center at the school said he had never previously testified as an expert witness at trial.
No industry standard for integrity currently exists, McCoy testified. After being presented with an excerpt from his expert rebuttal report, he also acknowledged that he had never claimed Meta does not conform to industry best practices with regard to integrity.
During opening statements, Meta attorney Mark Hansen claimed the quality of Meta apps has improved “by every objective measure,” and said that only when a firm “exceeds benchmarks and earns money doing so, is there monopoly power” (see here).
Pope, the Meta attorney, also presented McCoy with multiple laudatory statements from Systrom, Instagram co-founder Mike Krieger and other employees about Facebook’s integrity systems and spam-fighting tools.
Meta also used a citation in McCoy’s expert report against him. McCoy referenced an October 2014 conference presentation by Louis Brandy, who at that time had spent several years building out the company’s integrity infrastructure.
Pope played a video clip of Brandy at the same conference describing Facebook’s acquisition of Instagram in 2012 as a “supremely incredible test” for his team. That team, he said, had been working on making the company’s anti-spam and anti-abuse tools “generalizable” for new products that needed to ship quickly.
“It turned out, by the way, great,” Brandy said.
By 2014 and “without much effort,” Brandy said, Facebook’s system was classifying and addressing spam and abuse on both apps and “slashing spam” like a “machete through the jungle.”
McCoy’s testimony is scheduled to continue tomorrow.
Please e-mail editors@mlex.com to contact the editorial staff regarding this story, or to submit the names of lawyers and advisers.