Accurate, Focused Research on Law, Technology and Knowledge Discovery Since 2002

How Facebook’s formula fostered rage and misinformation

Washington Post – “Five years ago, Facebook gave its users five new ways to react to a post in their news feed beyond the iconic “like” thumbs-up: “love,” “haha,” “wow,” “sad” and “angry.” Behind the scenes, Facebook programmed the algorithm that decides what people see in their news feeds to use the reaction emoji as signals to push more emotional and provocative content — including content likely to make them angry. Starting in 2017, Facebook’s ranking algorithm treated emoji reactions as five times more valuable than “likes,” internal documents reveal. The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook’s business. Facebook’s own researchers were quick to suspect a critical flaw. Favoring “controversial” posts — including those that make users angry — could open “the door to more spam/abuse/clickbait inadvertently,” a staffer, whose name was redacted, wrote in one of the internal documents. A colleague responded, “It’s possible.” The warning proved prescient. The company’s data scientists confirmed in 2019 that posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity and low-quality news…”
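The weighting the article describes, emoji reactions counted as five times more valuable than "likes" in ranking, can be illustrated with a minimal sketch. The function name, data structure, and base weights below are hypothetical; only the five-reaction set and the 5x multiplier come from the reporting.

```python
# Hypothetical illustration of the ranking signal described above:
# emoji reactions weighted five times a "like". All names here are
# illustrative, not Facebook's actual code.

EMOJI_REACTIONS = {"love", "haha", "wow", "sad", "angry"}

def engagement_score(reactions):
    """reactions: dict mapping reaction name -> count."""
    score = 0
    for name, count in reactions.items():
        # Emoji reactions count 5x; a plain "like" counts 1x.
        weight = 5 if name in EMOJI_REACTIONS else 1
        score += weight * count
    return score

# Under this weighting, a post drawing 10 likes and 10 angry
# reactions (10 + 50 = 60) outranks one with 40 likes (40).
print(engagement_score({"like": 10, "angry": 10}))  # 60
print(engagement_score({"like": 40}))               # 40
```

The example shows why researchers worried the scheme would favor provocative content: a post that angers a minority of viewers can outrank one that a larger audience quietly likes.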

See also – NiemanLab: I’m in the consortium possessing the leaked Facebook documents. Let’s dissolve it. “On Monday, the consortium of news organizations tasked with combing through Frances Haugen’s Facebook documents expanded its ranks to include my small, independent newsletter, Big Technology. While it’s nice to be in this consortium — which includes the AP, The New York Times, The Atlantic, and others — I now believe it’s time to dissolve it.”
