Why Conservatives Don’t Trust Facebook

Author: Jon Kyl

Facebook asked me to conduct a survey to hear from conservatives directly. Following substantial public interest in the project and in light of policy changes Facebook has recently made, we have decided to share our findings at this time. We found conservatives’ concerns generally fall within the following six buckets:

  1. Content distribution and algorithms. Conservatives have expressed concern that bias against their viewpoints may be “baked in” to Facebook’s algorithms. In addition, interviewees argued that Facebook shouldn’t be in the business of separating fact from fiction in the news.
  2. Content policies. Facebook’s community standards prohibit hate speech, graphic violence, adult nudity, sexual activity, and cruel and insensitive content. Several interviewees pointed to the highly subjective, ever-evolving nature of some of these standards, particularly the term “hate speech.”
  3. Content enforcement. Interviewees were concerned that the biases of Facebook employees who enforce the rules may result in disproportionate censoring of conservatives. Some midsize and grass-roots organizations also believe their appeals are not taken as seriously as those of larger organizations.
  4. Ad policies. In the wake of strong evidence from the US intelligence community that Russia attempted to interfere in the 2016 presidential election with fake social-media accounts and inflammatory content, Facebook required advertisers to register as “political” organizations in order to post ads with a political or policy focus. Some conservative interviewees said this new rule jeopardized their status as nonprofits under Section 501(c)(3) of the tax code.
  5. Ad enforcement. As a result of Facebook’s new, more stringent ad policies, interviewees said the ad-approval process has slowed significantly. Some fear that the new process may be designed to disadvantage conservative ads in the wake of the Trump campaign’s successful use of social media in 2016.
  6. Workforce viewpoint diversity. Several interviewees noted the overall lack of viewpoint diversity throughout Facebook’s workforce and senior management.

Facebook has made several changes that are responsive to our findings, and we understand more are being considered. For now, changes include:

  • Oversight board. Facebook announced plans in July for an oversight board to hear appeals of some of its more difficult content-removal decisions. If structured to accurately reflect the diverse ideological and religious views of Facebook’s user base, the board may help ensure content decisions are made thoughtfully and free of inappropriate bias.
  • Explanations of news-feed rankings. To foster user trust in the algorithms that influence content placement, Facebook has launched transparency tools that explain to users why they see certain content on their news feeds.
  • Page transparency. Facebook has enabled page managers to see when their content has been removed for violating community standards, or when distribution of a post has been reduced because a fact-checker gave it a “false” rating.
  • Staffing. Facebook has hired four additional people devoted exclusively to working with smaller organizations to resolve questions and complaints about content decisions.
  • Ad labeling requirements. To avoid incorrectly branding ads as “political,” Facebook renamed its ads library and now refers instead to ads “about social issues, elections or politics.”
  • Ad policies. Facebook has changed the ad policy that had prohibited images of patients with medical tubes as “shocking and sensational content,” a change that will make it easier to promote certain pro-life ads.

[Jon Kyl, a Republican former US senator from Arizona, is a senior counsel at Covington & Burling LLP.]

