The Facebook researcher running the Indian test user's account wrote in a report that year: "I've seen more images of dead people in the past 3 weeks than I've seen in my entire life total," adding that the graphic content "was recommended via recommended groups, pages, videos, and posts."

The internal Facebook memos analyzing the progression of these test accounts were part of thousands of pages of leaked documents provided to Congress by lawyers for Facebook whistleblower Frances Haugen. A consortium of 17 U.S. news organizations, including CBS News, has reviewed the redacted version of the documents received by Congress.

The three projects illustrate how Facebook's News Feed algorithms can steer users to content that sows division. And they reveal that the company was aware its algorithms, which predict what posts users want to see and how likely they are to engage with them, can lead users "down the path to conspiracy theories."
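The documents do not spell out the ranking formula, but the mechanism they describe is a model that scores each candidate post by predicted engagement and sorts the feed accordingly. A minimal sketch of that kind of ranking follows, in which every name, field, and weight is an illustrative assumption rather than Facebook's actual system:

```python
# Rough sketch of engagement-based feed ranking (not Facebook's code):
# each candidate post is scored from predicted interaction probabilities,
# and the feed is sorted by that score.
from dataclasses import dataclass

@dataclass
class CandidatePost:
    post_id: str
    p_like: float     # predicted probability the user will like the post
    p_comment: float  # predicted probability the user will comment
    p_reshare: float  # predicted probability the user will reshare

def predicted_engagement(post: CandidatePost) -> float:
    # Interactions judged more "meaningful" get larger weights, so posts
    # likely to provoke comments or reshares outrank posts that merely
    # collect likes. The 1/5/15 weights here are placeholders.
    return 1.0 * post.p_like + 5.0 * post.p_reshare + 15.0 * post.p_comment

def rank_feed(candidates: list[CandidatePost]) -> list[CandidatePost]:
    return sorted(candidates, key=predicted_engagement, reverse=True)
```

Under any weighting of this shape, content with a high predicted chance of provoking interaction rises to the top, whether or not users actually want to see it.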
In a statement to CBS News, a Facebook spokesperson said the project involving the conservative test user is "a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform."

In 2018, Facebook altered the algorithms that populate users' news feeds to focus on what it calls "Meaningful Social Interactions" in an attempt to increase engagement. But internal research found that engagement with posts "doesn't necessarily mean that a user actually wants to see more of something."

"A state[d] goal of the move toward meaningful social interactions was to increase well-being by connecting people. However, we know that many things that generate engagement on our platform leave users divided and depressed," a Facebook researcher wrote in a December 2019 report.

The document, titled "We are Responsible for Viral Content," noted that users had indicated the kind of content they wanted to see more of, but the company ignored those requests for "business reasons."

According to the report, internal Facebook data showed that users are twice as likely to see content reshared by others as content from pages they choose to like and follow. Users who comment on posts to express their dissatisfaction are unaware that the algorithm interprets that as meaningful engagement and serves them similar content in the future, the report said.

There are several metrics that the News Feed algorithm considers, according to Facebook's internal documents. Each carries a different weight, and content goes viral depending on how users interact with the post.

When Facebook first moved toward meaningful social interactions in 2018, using the "Like" button awarded the post one point, according to one document. Signaling engagement with one of the reaction buttons, the emoticons that stand for "Love," "Care," "Haha," "Wow," "Sad," and "Angry," was worth five points. A post that was reshared was also worth five points. Comments on posts, messages in Groups, and RSVPs to public events awarded the content 15 points. Comments, messages, and reshares that included photos, videos, and links were awarded 30 points.
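Taken together, those point values describe a weighted engagement score. Below is a minimal sketch that treats the score as a simple additive sum over interaction counts; the weights come from the documents, but the formula itself is an assumption:

```python
# Point values from the 2018 "Meaningful Social Interactions" change, as
# described in Facebook's internal documents. Treating the overall score
# as a plain weighted sum is an assumption for illustration.
MSI_WEIGHTS = {
    "like": 1,                 # "Like" button
    "reaction": 5,             # "Love," "Care," "Haha," "Wow," "Sad," "Angry"
    "reshare": 5,              # a plain reshare
    "comment": 15,             # also messages in Groups and event RSVPs
    "comment_with_media": 30,  # comments, messages, and reshares that
                               # include photos, videos, or links
}

def msi_score(interactions: dict[str, int]) -> int:
    """Weighted engagement score for a single post."""
    return sum(MSI_WEIGHTS[kind] * count for kind, count in interactions.items())

# The weighting makes provocation efficient:
assert msi_score({"like": 100}) == 100    # a hundred quiet likes
assert msi_score({"comment": 10}) == 150  # ten angry comments beat them
```

On those numbers, a post that goads ten people into commenting outscores one that quietly collects a hundred likes, which is precisely the incentive described next.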
Facebook researchers quickly uncovered that bad actors were gaming the system. Users were "posting ever more outrageous things to get comments and reactions that our algorithms interpret as signs we should let things go viral," according to a December 2019 memo by a Facebook researcher.

In one internal memo from November 2019, a Facebook researcher noted that "Angry," "Haha," and "Wow" reactions are heavily tied to toxic and divisive content. "We consistently find that shares, angrys, and hahas are much more frequent on civic low-quality news, civic misinfo, civic toxicity, health misinfo, and health antivax content," the researcher wrote.

In April 2019, political parties in Europe complained to Facebook that the News Feed change was forcing them to post provocative content and take up extreme policy positions. One political party in Poland told Facebook that the platform's algorithm changes had forced its social media team to shift from half positive and half negative posts to 80% negative and 20% positive. In a memo titled "Political Party Response to '18 Algorithm Change," a Facebook staffer wrote that "many parties, including those that have shifted strongly to the negative, worry about the long-term effects on democracy."