On Wednesday afternoon, The New York Times published a blockbuster report—with five bylines, 50 sources, and 5,000 words—on the failures of Facebook’s management team during the past three years. It begins with Sheryl Sandberg yelling at one of her employees; it ends with her notes to self, captured by a photographer, as she sat before the Senate: “Slow, Pause, Determined.” The story, in other words, is not very flattering (and you should definitely read it). We did, and we have six follow-up questions that merit more investigation.
1) What is Sandberg’s future at Facebook?
For most of the social media company’s history, Sandberg has avoided criticism. During the past year, most of the anger at Facebook has been directed at Zuckerberg. That has started to change recently. The Wall Street Journal reported, for example, on a “swat team” that Sandberg runs, tasked with identifying and preventing future catastrophes. The Times story, though, is the first to cast her as the central antagonist.
It was Sandberg, the story says, who seethed after security executive Alex Stamos (who later left the company) disclosed to a Facebook board committee that the extent of Russian interference was still unknown and unchecked. It was Sandberg who chastised Stamos for devoting time and effort to look into the Russian campaign without company approval. It was Sandberg who sided with Joel Kaplan, vice president for public policy, about leaving the Russians out of its white paper on election interference, and it was Sandberg who encouraged Stamos to be less specific in his initial posting about Russia’s propaganda campaign. Sandberg appealed to Senator Amy Klobuchar, the Democrat from Minnesota, to dial down her attacks on Facebook. And Sandberg was the one who came out in support of the Stop Enabling Sex Trafficking Act, a decision the Times asserts was motivated in part to make other tech giants like Google look bad.
In a post responding to the newspaper, Facebook rejected that assertion. “Sheryl championed this legislation because she believed it was the right thing to do, and that tech companies need to be more open to content regulation where it can prevent real world harm,” the company wrote.
The question now is whether the woman charged with solving Facebook’s hardest problems has caused a few too many of her own.
2) What other tech company was paying an opposition research firm to smear Apple?
One of the more extraordinary parts in the report involves an opposition research group called Definers Public Affairs, run by Matt Rhoades, a former campaign manager for Mitt Romney. The firm also employs Tim Miller, a former spokesman for Jeb Bush and a contributor to Crooked Media, the company that runs Pod Save America. Facebook hired Definers to look into the funding of the company’s critics.
During this period, a conservative news website called NTK Network, which the Times says is affiliated with Definers, published a number of stories critical of Apple. But, in the Times report, Miller also says that Definers’ Apple work is funded by a third technology company. In other words, Facebook paid Definers; Facebook was fighting Apple; Definers wrote stories critical of Apple; but another technology company was paying for those stories.
Facebook ended its contract with Definers on Wednesday evening, shortly after the Times story was published. However, the company defended its work with the research firm. “The New York Times is wrong to suggest that we ever asked Definers to pay for or write articles on Facebook’s behalf—or to spread misinformation,” the company wrote in its response. “Our relationship with Definers was well known by the media—not least because they have on several occasions sent out invitations to hundreds of journalists about important press calls on our behalf.” And yet, despite the public nature of the relationship, a Facebook spokesperson couldn’t say for sure whether Zuckerberg and Sandberg were aware of Definers’ involvement.
This is not the first time Facebook has engaged in these sorts of tactics. In 2011, Facebook hired a public relations firm to plant unflattering stories about Google’s user privacy practices.
3) What will the relationship be between prominent Democrats and Facebook in the coming year?
The leadership of the Democratic Party in general has supported Facebook over the years. But as public opinion turns against the company, prominent Democrats have started to turn too. At one point in the story, the Times reporters describe Senate Minority Leader Chuck Schumer confronting Senator Mark Warner, a fellow Democrat. “In July, as Facebook’s troubles threatened to cost the company billions of dollars in market value, Mr. Schumer confronted Mr. Warner, by then Facebook’s most insistent inquisitor in Congress. Back off, he told Mr. Warner, according to a Facebook employee briefed on Mr. Schumer’s intervention. Mr. Warner should be looking for ways to work with Facebook, Mr. Schumer advised, not harm it.”
Last night, Schumer’s office offered an oddly implausible denial in a statement to CBS, saying: “Senator Schumer was worried that Facebook would bow to pressure from the right wing, who opposed Facebook’s purging of fake accounts and bots, and so he encouraged Senator Warner to make sure the Intelligence Committee prioritized focusing on the company’s issues related to disinformation and future election interference.”
It’s bizarre to suggest that Warner, a Virginian who has been the most aggressive senator in dealing with disinformation, would be soft on disinformation. This nondenial denial from Schumer suggests the Times account was probably spot on. Just this week, Schumer was re-elected as Senate minority leader, meaning he’ll continue to play a key role in crafting Democrats’ response to Facebook and other tech giants. In the wake of this report, that work will undoubtedly be scrutinized. Warner’s office declined to comment on the issue.
4) What exactly was Facebook allegedly doing with George Soros?
One of the darkest parts of the piece describes the way Facebook and Definers dealt with anti-Semitism. It says Facebook worked to paint its critics as anti-Semitic (both Sandberg and Zuckerberg are Jewish), while simultaneously working to spread the idea that billionaire George Soros was supporting its critics—a classic tactic of anti-Semitic conspiracy theorists and other extremists. Russian propagandists also have sought to tie Soros to their opponents.
In July, during a meeting of the House Judiciary Committee, members of the group Freedom from Facebook sat in the audience, holding up signs depicting Zuckerberg and Sandberg as a two-headed octopus with its tentacles wrapped around the globe. The Times reports that a Facebook executive alerted the Anti-Defamation League, which condemned the anti-Facebook group for being anti-Semitic.
But even as they were defending against supposedly anti-Semitic attacks, the Times alleges Facebook was also helping spread rumors that critics call anti-Semitic in turn. Definers consistently tried to push research (including to WIRED) about the financial ties behind groups like Freedom from Facebook and the Open Markets Institute, which are funded in part by Soros.
On Wednesday, Patrick Gaspard, president of the Open Society Foundation, which was founded by Soros, sent a letter to Sandberg saying: “As you know, there is a concerted right-wing effort the world over to demonize Mr. Soros and his foundations, which I lead—an effort which has contributed to death threats and the delivery of a pipe bomb to Mr. Soros’ home. You are no doubt also aware that much of this hateful and blatantly false and anti-Semitic information is spread via Facebook.”
Facebook hasn’t denied engaging in these tactics, but the company defended its work against Freedom from Facebook in its blog post, writing: “The intention was to demonstrate that it was not simply a spontaneous grassroots campaign, as it claimed, but supported by a well-known critic of our company. To suggest that this was an anti-Semitic attack is reprehensible and untrue.” A Facebook spokesperson told WIRED that neither Zuckerberg nor Sandberg had any idea about “the Soros stuff.”
5) Did Facebook lie about what the company knew about Russian operations on the platform during the 2016 election?
This is a complicated, tangled question. Facebook executives have testified before Congress, saying that during the summer of 2016 they knew about Russian hacking attempts and the actions of groups like Fancy Bear, or APT 28, to use Facebook to spread information stolen from politicians. But they’ve always maintained that they were unaware of the Russian propaganda campaigns run by the Internet Research Agency until the summer of 2017. As late as mid-July of 2017, Facebook told WIRED it had no evidence of Russian entities buying political ads in the US.
The Times story provides lots of damning evidence that Facebook’s top managers were less interested than they should have been in learning the full extent of Russian operations on the platform. It describes Stamos, the company’s former security chief, as running almost a rogue campaign to uncover the truth: “Acting on his own, [Stamos] directed a team to scrutinize the extent of Russian activity on Facebook.” When he told Sandberg about the work, she blew up. “Looking into the Russian activity without approval, she said, had left the company exposed legally. Other executives asked Mr. Stamos why they had not been told sooner.”
Stamos, for his part, tweeted Thursday, saying he was “never told by Mark, Sheryl or any other executives not to investigate.” Whether he was reprimanded for doing so of his own accord, as the Times reports, is unclear.
But being insufficiently interested in something is different from lying about it. That’s what makes the timeline here so important. It appears from the Times’ reporting that Facebook’s management had three different fights over how to deal with Russian operations. The first occurred after the election, when Stamos disclosed his rogue operation to Zuckerberg and Sandberg. The second occurred in the winter of 2016, when Facebook was deciding whether to specifically name Russia in a report that Stamos published in April 2017. The third occurred in September 2017, over how to respond to the fake pages created by Russian internet operations.
The Times report gives great texture to the fights. But one element is confusing. The authors write that in January 2017, as Stamos was pushing to publish a paper on the company’s findings, Kaplan, who headed policy for Facebook, objected. According to the Times, Kaplan argued that by implicating Russia, Facebook ran the risk of appearing to side with Democrats—at a time when US intelligence agencies were already saying that Russia’s president had ordered a campaign to get Donald Trump elected. “And if Facebook pulled down the Russians’ fake pages, regular Facebook users might also react with outrage at having been deceived,” the Times writes. “His own mother-in-law, Mr. Kaplan said, had followed a Facebook page created by Russian trolls.”
The problem with that account is that Facebook says it didn’t even know about those Russian pages in January 2017. If it did, Facebook has lied repeatedly about what it knew and when. Facebook insists, though, that the Times confused something in the chronology. Through a spokesperson, Kaplan said his mother-in-law followed an inauthentic page originating in Macedonia, not Russia. The spread of Macedonian fake news pages was known to Facebook during the election. “They got the timing wrong on that, period,” Facebook spokesperson Andy Stone said of the Times. One of the reporters for the Times, meanwhile, says they stand by their story.
6) Has the company fundamentally changed in the past two years?
In a lot of ways, Facebook has changed dramatically. Since the fall of 2017, when Facebook first publicly acknowledged Russian interference, the company has assumed a position of contrition. Zuckerberg, the company’s leader, has been on a seemingly nonstop apology tour, telling members of the press and Congress that Facebook didn’t take a wide enough view of its responsibilities. It’s hired thousands more people on its safety and security team and is investing in automated tools to spot toxic content on the platform. Facebook is now proactively finding and suspending coordinated networks of accounts and pages aiming to spread propaganda, and telling the world about it when it does. The company has enlisted fact-checkers to help prevent fake news from spreading as broadly as it once did. Some of the changes have come at a financial cost to Facebook, like requiring political advertisers to go through a lengthy authorization process before they can pay Facebook to run ads. On Thursday afternoon, Facebook announced even more efforts to disincentivize sensationalist content and increase transparency about content removal.
The company’s central problem leading up to 2016, WIRED has argued, was a failure to recognize how the platform could be used for ill. Now, at least, executives seem to have realized this. But the real question is whether any of their fixes are enough to address what seems to be a serious problem. Even as Facebook has rolled out technical and staffing changes, the Times report shows that its ruthless efforts to protect the company’s reputation at all costs remain unchanged. That’s an issue that can’t be solved with better algorithms.