Ahead of Whistle-Blower’s Interview, Facebook Memo Rejects Claims It Contributed to Jan. 6 Riot

“We will continue to face scrutiny – some of it fair and some of it unfair,” he said in the memo. “But we should also continue to hold our heads up high.”

Here is Mr. Clegg’s full memo:

Our position on polarization and elections

You will have seen the recent series of articles about us published in the Wall Street Journal and the public interest they have provoked. This Sunday night, the former employee who leaked internal company material to the Journal will appear in a segment on 60 Minutes on CBS. We understand the segment is likely to assert that we contribute to polarization in the United States, and that the extraordinary steps we took for the 2020 election were relaxed too soon and contributed to the horrific events of January 6 at the Capitol.

I know some of you – especially those of you in the US – are going to get questions from friends and family about these things, so I wanted to take a moment as we head into the weekend to provide what I hope is some useful context on our work in these crucial areas.

Facebook and polarization

People are understandably anxious about the divisions in society and are looking for answers and ways to fix the problems. Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out. So it is natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization is not supported by the facts – as Chris and Pratiti set out in their note on the issue earlier this year.

The rise of polarization has been the subject of a great deal of serious academic research in recent years. In truth, there is not much consensus. But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.

The increase in political polarization in America pre-dates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see it rising wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use at the same time that it has risen in the United States.

In particular, we expect the reporting to suggest that changes to Facebook’s News Feed ranking algorithm are responsible for elevating polarizing content on the platform. In January 2018, we made ranking changes to promote Meaningful Social Interactions (MSI) – so that you would see more content in your News Feed from friends, family and groups you are part of. This change was driven by extensive internal and external research showing that meaningful engagement with friends and family on our platform was better for people’s well-being, and we have refined and improved it over time, as we do with all ranking metrics. Of course, everyone has a rogue uncle or an old school classmate who holds strong or extreme views you don’t agree with – that’s life – and the change meant you were more likely to come across their posts too. Even so, we have developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%.

But the simple fact remains that changes to the algorithmic ranking systems on one social media platform cannot explain wider societal polarization. Indeed, polarizing content and misinformation are also present on platforms with no algorithmic ranking at all, including private messaging apps like iMessage and WhatsApp.

Elections and democracy

There is perhaps no other topic we have been more vocal about as a company than our work to dramatically change our approach to elections. Starting in 2017, we began building new defenses, bringing in new expertise and strengthening our policies to prevent interference. Today, we have more than 40,000 people across the company working on safety and security.

Since 2017, we have disrupted and removed more than 150 covert influence operations, including ahead of major democratic elections. In 2020 alone, we removed more than 5 billion fake accounts – identifying almost all of them before anyone flagged them to us. And, from March through Election Day, we removed more than 265,000 pieces of Facebook and Instagram content in the United States for violating our voter interference policies.

Given the extraordinary circumstances of holding a contentious election in a pandemic, we implemented so-called “break glass” measures – and spoke openly about them – before and after Election Day, to respond to specific and unusual signals we were seeing on our platform and to keep potentially violating content from spreading before our reviewers could assess it against our policies.

These measures were not without trade-offs – they are blunt instruments designed to deal with specific crisis scenarios. Using them is like shutting down an entire town’s roads and highways in response to a temporary threat that may be lurking somewhere in a particular neighborhood. In implementing them, we knew we would impact significant amounts of content that did not violate our rules, in order to prioritize people’s safety during a period of extreme uncertainty. For example, we limited the distribution of live videos that our systems predicted might relate to the election. That was an extreme step that helped prevent potentially violating content from going viral, but it also impacted plenty of entirely normal and reasonable content, including some that had nothing to do with the election. We wouldn’t take this kind of crude, catch-all measure in normal circumstances, but these weren’t normal circumstances.

We rolled back these emergency measures, based on careful data-driven analysis, only when we saw a return to more normal conditions. We left some of them in place for a longer period, through February of this year, and others, such as not recommending civic, political or new groups, we have decided to retain permanently.

Fighting hate groups and other dangerous organizations

I want to be absolutely clear: we work to limit, not expand, hate speech, and we have clear policies prohibiting content that incites violence. We do not profit from polarization; in fact, quite the opposite. We do not allow dangerous organizations, including militarized social movements or violence-inducing conspiracy networks, to organize on our platforms. And we remove content that praises or supports hate groups, terrorist organizations and criminal groups.

We have been more aggressive than any other internet company in combating harmful content, including content that sought to delegitimize the election. And our work to crack down on these hateful groups was years in the making. We took down tens of thousands of QAnon pages, groups and accounts from our apps, removed the original #StopTheSteal group, and removed references to Stop the Steal in the run-up to the inauguration. In 2020 alone, we removed more than 30 million pieces of content that violated our policies on terrorism and more than 19 million pieces of content that violated our policies around organized hate. We designated the Proud Boys as a hate organization in 2018, and we continue to remove praise, support and representation of them. Between August of last year and January 12 of this year, we identified nearly 900 militia organizations under our Dangerous Organizations and Individuals policy and removed thousands of pages, groups, events, Facebook profiles and Instagram accounts associated with these groups.

This work will never be complete. There will always be new threats and new problems to address, in the United States and around the world. That’s why we remain vigilant and alert – and always will have to.

That is also why the suggestion sometimes made that the violent insurrection on January 6 would not have happened without social media is so misleading. To be blunt, the responsibility for those events rests squarely with the perpetrators of the violence and with those, in politics and elsewhere, who actively encouraged them. Mature democracies in which social media is widely used hold elections all the time – for instance, last week’s election in Germany – without the disfiguring presence of violence. We actively share with law enforcement material that we can find on our services related to these traumatic events. But reducing the complex causes of polarization in America – or the insurrection specifically – to a technological explanation is woefully simplistic.

We will continue to face scrutiny – some of it fair and some of it unfair. We will continue to be asked difficult questions. And many people will continue to be skeptical of our motives. That is what comes with being part of a company that has a significant impact in the world. We need to be humble enough to accept criticism when it is fair, and to make changes where they are justified. We aren’t perfect and we don’t have all the answers. That’s why we do the sort of research that has been the subject of these stories in the first place. And we will keep looking for ways to respond to the feedback we hear from our users, including testing ways to make sure political content doesn’t take over their News Feeds.

But we should also continue to hold our heads up high. You and your teams do incredible work. Our tools and products have a hugely positive impact on the world and in people’s lives. And you have every reason to be proud of that work.
