THE NEW YORK TIMES

No more apologies: Inside Facebook’s push to defend its image

Mark Zuckerberg, Facebook’s chief executive, signed off last month on a new initiative code-named Project Amplify.

The effort, which was hatched at an internal meeting in January, had a specific purpose: to use Facebook’s News Feed, the site’s most important digital real estate, to show people positive stories about the social network.

The idea was that pushing pro-Facebook news items – some of them written by the company – would improve its image in the eyes of its users, three people with knowledge of the effort said. But the move was sensitive because Facebook had not previously positioned the News Feed as a place where it burnished its own reputation. Several executives at the meeting were shocked by the proposal, one attendee said.

Project Amplify punctuated a series of decisions that Facebook has made this year to aggressively reshape its image. Since that January meeting, the company has begun a multipronged effort to change its narrative by distancing Zuckerberg from scandals, reducing outsiders’ access to internal data, burying a potentially negative report about its content and increasing its own advertising to showcase its brand.

The moves amount to a broad shift in strategy. For years, Facebook confronted crisis after crisis over privacy, misinformation and hate speech on its platform by publicly apologizing. Zuckerberg personally took responsibility for Russian interference on the site during the 2016 presidential election and has loudly stood up for free speech online. Facebook also promised transparency into the way that it operated.

But the drumbeat of criticism on issues as varied as racist speech and vaccine misinformation has not relented. Disgruntled Facebook employees have added to the furor by speaking out against their employer and leaking internal documents. Last week, The Wall Street Journal published articles based on such documents that showed Facebook knew about many of the harms it was causing.

So Facebook executives, concluding that their methods had done little to quell criticism or win supporters, decided early this year to go on the offensive, said six current and former employees, who declined to be identified for fear of reprisal.

“They’re realizing that no one else is going to come to their defense, so they need to do it and say it themselves,” said Katie Harbath, a former Facebook public policy director.

The changes have involved Facebook executives from its marketing, communications, policy and integrity teams. Alex Schultz, a 14-year company veteran who was named chief marketing officer last year, has also been influential in the image reshaping effort, said five people who worked with him. But at least one of the decisions was driven by Zuckerberg, and all were approved by him, three of the people said.

Joe Osborne, a Facebook spokesperson, denied that the company had changed its approach.

“People deserve to know the steps we’re taking to address the different issues facing our company – and we’re going to share those steps widely,” he said in a statement.

For years, Facebook executives have chafed at how their company appeared to receive more scrutiny than Google and Twitter, said current and former employees. They attributed that scrutiny to Facebook leaving itself more exposed by apologizing publicly and by providing access to internal data, the people said.

So in January, executives held a virtual meeting and broached the idea of a more aggressive defense, one attendee said. The group discussed using the News Feed to promote positive news about the company, as well as running ads that linked to favorable articles about Facebook. They also debated how to define a pro-Facebook story, two participants said.

That same month, the communications team discussed ways for executives to be less conciliatory when responding to crises and decided there would be less apologizing, said two people with knowledge of the plan.

Zuckerberg, who had become intertwined with policy issues including the 2020 election, also wanted to recast himself as an innovator, the people said. In January, the communications team circulated a document with a strategy for distancing Zuckerberg from scandals, partly by focusing his Facebook posts and media appearances on new products, they said.

The Information, a tech news site, previously reported on the document.

The impact was immediate. On January 11, Sheryl Sandberg, Facebook’s chief operating officer – and not Zuckerberg – told Reuters that the storming of the US Capitol days earlier had little to do with Facebook. In July, when President Joe Biden said the social network was “killing people” by spreading Covid-19 misinformation, Guy Rosen, Facebook’s vice president for integrity, disputed the characterization in a blog post and pointed out that the White House had missed its coronavirus vaccination goals.

“Facebook is not the reason this goal was missed,” Rosen wrote.

Zuckerberg’s personal Facebook and Instagram accounts soon changed. Rather than addressing corporate controversies, Zuckerberg’s posts have recently featured a video of himself riding across a lake carrying an American flag, with messages about new virtual reality and hardware devices. (After this article, which described Zuckerberg as riding an electric surfboard, was published, he wrote on Facebook that it was actually “a hydrofoil that I’m pumping with my own legs.”)

Facebook also started cutting back the availability of data that allowed academics and journalists to study how the platform worked. In April, the company told the team behind CrowdTangle, a tool that provides data on the engagement and popularity of Facebook posts, that the team was being broken up. While the tool still exists, the people who worked on it were moved to other teams.

Part of the impetus came from Schultz, who had grown frustrated with news coverage that used CrowdTangle data to show that Facebook was spreading misinformation, said two people involved in the discussions.

For academics who relied on CrowdTangle, it was a blow. Cameron Hickey, a misinformation researcher at the National Conference on Citizenship, a nonprofit focused on civic engagement, said he was “particularly angry” because he felt the CrowdTangle team was being punished for giving an unfiltered view of engagement on Facebook.

Schultz argued that Facebook should publish its own information about the site’s most popular content rather than supply access to tools like CrowdTangle, two people said. So in June, the company compiled a report on Facebook’s most-viewed posts for the first three months of 2021.

But Facebook did not release the report. After the policy communications team discovered that the top-viewed link for the period was a news story with a headline that suggested a doctor had died after receiving the Covid-19 vaccine, they feared the company would be chastised for contributing to vaccine hesitancy, according to internal emails reviewed by The New York Times.

A day before the report was supposed to be published, Schultz was part of a group that voted to shelve the document, according to the emails. He later posted an internal message about his role at Facebook, which was reviewed by The Times, saying, “I do care about protecting the company’s reputation, but I also care deeply about rigor and transparency.”

Facebook also worked to stamp out employee leaks. In July, the communications team shuttered comments on an internal forum that was used for companywide announcements. “OUR ONE REQUEST: PLEASE DON’T LEAK,” read a post about the change.

At the same time, Facebook ramped up its marketing. During the Olympics this summer, the company paid for television spots with the tagline “We change the game when we find each other,” to promote how it fostered communities. In the first half of this year, Facebook spent a record $6.1 billion on marketing and sales, up more than 8% from a year earlier, according to a recent earnings report.

Weeks later, the company further reduced academics’ ability to study the platform when it disabled the Facebook accounts and pages of a group of New York University researchers. The researchers had created a browser extension, which 16,000 people had consented to use, that allowed them to see those users’ Facebook activity. The resulting data had led to studies showing that misleading political ads had thrived on Facebook during the 2020 election and that users engaged more with right-wing misinformation than with many other types of content.

In a blog post, Facebook said the NYU researchers had violated rules around collecting user data, citing a privacy agreement it had originally struck with the Federal Trade Commission in 2012. The FTC later admonished Facebook for invoking its agreement, saying it allowed for good-faith research in the public interest.

Laura Edelson, the lead NYU researcher, said Facebook cut her off because of the negative attention her work brought. “Some people at Facebook look at the effect of these transparency efforts and all they see is bad PR,” she said.

The episode was compounded this month when Facebook told misinformation researchers that it had mistakenly provided them with incomplete data on user interactions and engagement for two years.

“It is inconceivable that most of modern life, as it exists on Facebook, isn’t analyzable by researchers,” said Nathaniel Persily, a Stanford University law professor, who is working on federal legislation to force the company to share data with academics.

In August, after Zuckerberg approved Project Amplify, the company tested the change in three US cities, two people with knowledge of the effort said. While the company had previously used the News Feed to promote its own products and social causes, it had not used the feed to openly push positive press about itself, they said.

Once the tests began, Facebook used a system known as Quick Promotes to place stories about people and organizations that used the social network into users’ News Feeds, they said. People essentially see posts with a Facebook logo that link to stories and websites published by the company, as well as to third-party local news sites. One story pushed “Facebook’s Latest Innovations for 2021” and discussed how it was achieving “100 percent renewable energy for our global operations.”

“This is a test for an informational unit clearly marked as coming from Facebook,” Osborne said, adding that Project Amplify was “similar to corporate responsibility initiatives people see in other technology and consumer products.”

Facebook’s defiance of unflattering revelations has also not let up, even without Zuckerberg. On Saturday, Nick Clegg, the company’s vice president for global affairs, wrote a blog post denouncing the premise of The Journal’s investigation. He said the idea that Facebook executives had repeatedly ignored warnings about problems was “just plain false.”

“These stories have contained deliberate mischaracterizations of what we are trying to do,” Clegg said. He did not detail what the mischaracterizations were.

[This article originally appeared in The New York Times.]
