Facebook on Tuesday said it’s tightening requirements for publishers on its platform that are affiliated with political entities – defined as organizations, companies or other groups whose primary purpose is to influence politics, public policy or elections.
Affiliated publishers will no longer be exempt from Facebook’s authorization and disclaimer process when running ads about social issues, elections or politics. They won’t be eligible for inclusion in Facebook News or have access to news messaging on the Messenger Business Platform or the WhatsApp Business API.
The announcement came the same day Democratic presidential candidate Joe Biden named California Sen. Kamala Harris as his running mate, ahead of one of the most momentous elections in recent history, held amid a pandemic and a bitterly divided nation.
“As we head into election season in the U.S., we recognize that there are a growing number of news publications that are connected with different types of political entities, including political parties, PACs, politicians, and other organizations that can primarily engage in the influence of public policy or elections,” Facebook said.
“Identifying politically connected publishers is a new process for us, and we will learn and adapt as needed, while continuing to make ads on Facebook more transparent and protect the integrity of elections.”
Separately, but relatedly given the current vitriol in politics, Facebook announced it's expanding its hate speech policy to cover "content depicting blackface, or stereotypes about Jewish people controlling the world or global organizations." That news came as the company published its most recent tally of how it responded to bad actors in the latest quarter.
The social media giant, at the center of a perpetual public policy storm, said it removed 35.7 million pieces of content from Facebook and Instagram from April to June, down from nearly 40 million the previous quarter, as COVID-19 sent workers home and forced greater reliance on technology to flag offenders. The dip, reported in the latest Community Standards Enforcement Report, reinforced that the company needs both people and machines to monitor its platform for hate speech, election interference and terrorism. Facebook said the balance has been restored as workers were set up to review content from home or returned to offices.
Facebook founder and CEO Mark Zuckerberg himself has said that the company was slow to acknowledge meddling by Russia and others during the 2016 elections. More than 100 websites in one town in Macedonia alone produced fake news on Facebook in the final weeks of the campaign, mostly favoring Republican candidate Donald Trump.
In the Community Standards Enforcement Report, Facebook said the largest category was hate speech, where actions taken surged to 22.5 million pieces of content from 9.6 million in the prior quarter. The figure is global, not broken out for the U.S., and Facebook attributed the jump to the addition of new languages and detection tools.
It removed over 7 million pieces of harmful COVID-19 misinformation from Facebook and Instagram, including posts pushing fake prevention measures or exaggerated cures.
It slapped warning labels on 98 million pieces of content.
It removed 8.7 million posts connected to extremist organizations, compared with 6.3 million in the prior period.
Facebook, which has 2.7 billion users, said it’s looking to hire an outside auditor to vet these quarterly reports starting next year.
Facebook and other tech executives were grilled by a congressional antitrust committee several weeks ago. Zuckerberg has been attacked by both the left and the right. The CEO has often said he prefers to err on the side of free speech, and progressives in particular don't think the company has acted quickly enough to crack down on misinformation or hate speech.
A highly publicized advertising boycott this summer drew lots of attention to the company's policies but had little impact on its soaring ad revenue.