Facebook Morale, Hurt by Share Drop, Suffers Another Hit

On Thursday, Facebook Inc. employees returned to work in the aftermath of yet another corporate scandal.
The night before, The New York Times had reported that Sheryl Sandberg, Facebook’s chief operating officer, worked behind the scenes to prevent the company’s board and the public from understanding the full extent of Russia’s misinformation campaign on the social network.
The employees were used to the public microscope. But this time was different, they said: the story brought readers into the boardrooms and halls of Congress where their top executives were making questionable decisions. At lunch, workers shuffled through the staff cafeterias at the company’s Menlo Park, California, headquarters in unusually quiet contemplation. But their phones were lighting up.
Most discussion at Facebook happens on the company’s workplace version of the social network, in various company groups. But when the news is about Facebook’s leadership, some employees have found it easier to talk when they’re unnamed. They used Blind, the anonymous employee chat app, to raise their concerns, according to screenshots obtained by Bloomberg. On Thursday, the conversations were full of outrage. How could Sandberg – and chief executive officer Mark Zuckerberg – have failed to see the threat to the company? And how could they have managed all of this so poorly?
“Why does our company suck at having a moral compass?” one employee asked, in a message linked to the New York Times story.
“Zuckerberg defers too much to others on issues where he needs to make a call,” another said, also anonymously, in the same thread.
“I’m f-ing exhausted of cleaning up after the sloppy and careless mistakes that made so many of the people responsible for them so, so rich,” said a third.
Getting rich has been a sore subject inside Facebook recently, especially for those, like the third poster, who joined long after the 2012 initial public offering. The company’s stock has been in decline since Facebook reported in July that it expected growth to slow in future years. Thursday marked a regularly scheduled vesting event for employee stock, according to people familiar with the matter, but the shares are worth 34 percent less than at their July peak.
There’s frustration, according to some employees, that the steps the company must take to fix its public-perception problems, such as adding people to review harmful content and limiting data collection, will further depress growth, and therefore the stock price. In an October employee survey, 52 percent said they were optimistic about Facebook’s future, down from 84 percent the year before, according to numbers first reported by The Wall Street Journal this week. Zuckerberg has told employees at internal meetings that they’re doing the right thing for the long term.
"It has been a difficult period, but every day we see people pulling together to learn the lessons of the past year and build a stronger company,’’ Facebook said in a statement. “Everyone at Facebook has a stake in our future and we are heads down shipping great products and protecting the people who use them.”
The Times story suggested Sandberg and Zuckerberg weren’t as involved with the serious issues facing the company as they should have been, and instead were more concerned with defending Facebook’s reputation. Sandberg in particular, the story said, worked to prevent the board and the public from knowing too much about the Russia campaign, then led an aggressive lobbying effort to fend off critics. Zuckerberg answered questions about his decision-making on a call with the media Thursday.
“To suggest that we weren’t interested in knowing the truth, or that we were trying to hide what we knew, or that we tried to prevent investigations is simply untrue,” Zuckerberg said.
The article focused more squarely on Sandberg, and on how angry she was that Alex Stamos, then the company’s chief security officer, had investigated Russia’s campaign without approval. Sandberg didn’t appear on the call, though Zuckerberg defended her leadership. “Sheryl is doing great work for the company. She’s been a very important partner to me and continues to be, and will continue to be,” he said. Sandberg later said the idea that she personally stood in the way of disclosures on Russia was “just plain wrong.”
Stamos, who left the company earlier this year, said on Twitter that “there were a lot of heated discussions about what to publish and when,” but that the failure to understand the Russia threat was unrelated to his “getting chewed out in a meeting.”
After deciding to disclose the Russia campaign and submit to congressional questioning, Sandberg worked behind the scenes to improve relationships with lawmakers. The company hired a Republican firm, Definers Public Affairs, which worked to discredit critics. At one point the firm drew links between critics of the company and the billionaire financier George Soros, who has been a target of anti-Semitic attacks. Zuckerberg said he and Sandberg were unaware of the messaging linking critics with Soros, and that Facebook has cut ties with the firm.
Despite the controversy, some employees on Blind defended the tactics.
“I think the reality of running a global corporation as large as FB is that you have to dirty yourself in the trenches,” one person said. “These all seem like pragmatic albeit ruthless and morally sketchy business decisions.”
“Honestly, I respect Sheryl a lot more after reading this,” another said.

Separately, Facebook said it’s making progress on detecting hate speech, graphic violence and other violations of its rules, even before users see and report them.
During the April-to-September period, the company said, it doubled the amount of hate speech it detected proactively compared with the previous six months.
The findings were spelled out Thursday in Facebook’s second semiannual report on enforcing its community standards. The reports come as Facebook grapples with challenge after challenge, from fake news to its role in election interference, hate speech and incitement to violence in the U.S., Myanmar, India and elsewhere.
The company also said it disabled more than 1.5 billion fake accounts in the latest six-month period, compared with 1.3 billion during the previous six months. Facebook said most of the fake accounts it found were financially motivated, rather than aimed at misinformation. The company has nearly 2.3 billion users.
Facebook’s report comes a day after The New York Times published an extensive account of how Facebook has dealt with crisis after crisis over the past two years. The Times described Facebook’s strategy as “delay, deny and deflect.”
Facebook said Thursday it has cut ties with a Washington public relations firm, Definers, which the Times said Facebook hired to discredit opponents. Facebook CEO Mark Zuckerberg said during a call with reporters that he learned about the company’s relationship with Definers only when he read the Times report.
On community standards, Facebook also released metrics on issues such as child nudity and sexual exploitation, terrorist propaganda, bullying and spam. While it is disclosing how many violations it is catching, the company said it can’t always reliably measure how prevalent these violations are on Facebook overall. For instance, while Facebook took action on 2 million instances of bullying in the July-September period, that doesn’t mean there were only 2 million instances of bullying during this time.
Clifford Lampe, a professor of information at the University of Michigan, said it’s difficult for people to agree on what constitutes bullying or hate speech — so that makes it difficult, if not impossible, to teach artificial intelligence systems how to detect them.
Overall, though, Lampe said Facebook is making progress on rooting out hate, fake accounts and other objectionable content, but added that it could be doing more.
“Some of this is tempered by (the fact that) they are a publicly traded company,” he said. “Their primary mission isn’t to be good for society. It’s to make money. There are business concerns.”
Facebook also plans to set up an independent body by next year for people to appeal decisions to remove — or leave up — posts that may violate its rules. Appeals are currently handled internally.
Facebook employs thousands of people to review posts, photos, comments and videos for violations. Some content is also detected without humans, using artificial intelligence. Zuckerberg said creating an independent appeals body will prevent the concentration of “too much decision-making” within Facebook.
Facebook has faced accusations of bias against conservatives — something it denies — as well as criticism that it does not go far enough in removing hateful content.