Meta Pulls Support for Tool Used to Keep Misinformation in Check

(Bloomberg) — On May 17, as several states held their primary elections, Jesse Littlewood searched the internet using a tool called CrowdTangle to spot the false narratives he knew could change perceptions of the results: damaging stories about ballots being collected and dropped off in bulk by unauthorized people, whom the misinformation peddlers called “ballot mules.”

Littlewood, the vice president for campaigns with the voter advocacy group Common Cause, easily came across dozens of posts showing a “Wanted” poster falsely accusing a woman of being a ballot mule in Gwinnett County, Georgia. He raised the alarm with Facebook and Twitter. “This was going to lead to threats and intimidation of this individual who may be an elections worker, and there was no evidence that this person was doing anything illegal,” Littlewood said. “It needed to be removed.”

Meta Platforms Inc.’s Facebook owns the search tool Littlewood used, and the company has for months kept its plans for CrowdTangle a mystery. Meta has been reducing its support for the product and is expected to eventually scrap it, though it has declined to say when. Not knowing CrowdTangle’s future, or what Meta will replace it with, endangers planning for future elections, Littlewood said. His group has thousands of volunteers working in shifts to identify false information online, and CrowdTangle is indispensable to the process.

Erin McPike, a Meta spokesperson, said the company will continue to support researchers, with plans to make “even more valuable” tools for them. In response to researchers’ concerns, she said the company would keep CrowdTangle alive at least through this year’s US midterms.

Elections officials and voting rights advocates are bracing for a repeat of the flood of misinformation that engulfed the 2020 presidential race online and fueled real-world violence during the Jan. 6 insurrection. Kate Starbird, an associate professor at the University of Washington and co-founder of the Center for an Informed Public, said that if Facebook does shut down CrowdTangle, she hopes the company will “create a viable alternative,” which she said does not yet exist, and “give researchers and journalists time to redesign their workflows around the new tool.” Not providing one would “significantly limit” researchers’ ability to help others counter real-time misinformation and could leave voters open to manipulation.

Common Cause’s work “would be impossible to do without a tool that looks across Facebook,” Littlewood said. CrowdTangle gives insight into posts on Instagram, Twitter and Reddit too. “And we all know that the midterms are testing grounds for 2024, when the level of disinformation will be even higher.” 

Researchers rely not just on the tool but on the companies acting on the harmful-content reports they file. Twitter removed the misinformation Littlewood flagged in May; on Facebook, which didn’t respond to his warning, at least 16 of the posts remained in mid-June. Facebook took them down after media outlets, including ProPublica and Bloomberg News, reached out.

McPike, the spokesperson, said “the CrowdTangle product experience for the 2022 midterm elections remains the same as it was for the 2020 election.” But researchers are already seeing a difference, pointing to a buggy experience as the company has siphoned off support for the tool over the past few months.

In February, Meta started an official internal process to shut down CrowdTangle, but paused the plan as the Digital Services Act, a landmark law in Europe that aims to provide transparency into how Facebook, YouTube and other internet services amplify divisive content, gained traction, according to a person familiar with the matter. CrowdTangle is still on track to be shut down eventually, the person said, with some Facebook engineers tasked with killing it.

Meta purchased CrowdTangle in 2016, saying at the time that it wanted to help news publishers discover how their content was performing on Facebook and Instagram so they could improve their strategies. A few months later, the company disclosed Russia’s campaign to influence the 2016 election by posting on social networks. As the public debated the spread of false information online, CrowdTangle became a tool for insight not just into social media strategy but into manipulation. That made Meta uncomfortable; the company would often publicly dispute the conclusions journalists and others drew from CrowdTangle research. Executives could no longer stomach supporting a tool that resulted in so many public relations crises.

The CrowdTangle team within Meta was disbanded in the summer of 2021, with its dozens of employees either quitting or getting new assignments in other parts of the company. Meta also rescinded a $40,000 grant that aimed to help two research partners use the CrowdTangle data to understand public discussion around the Covid-19 pandemic. Brandon Silverman, the former chief executive officer of CrowdTangle, departed from Facebook in October. And in January of this year, Meta “paused” new users from getting access to CrowdTangle as it worked through what it said were staffing constraints. It has not restarted the process of onboarding new partners to the service.

Recently, fewer than five engineers on Facebook’s London integrity team were working to keep CrowdTangle afloat, a person with knowledge of the matter said. That leaves scant support for the tens of thousands of organizations that use the tool in their work, including leading fact-checking organizations around the world such as Agence France-Presse in France and VERA Files in the Philippines, along with hundreds of academics and researchers, news outlets, and human rights activists in places like Myanmar and Sri Lanka.

No new features have been added to CrowdTangle in over 16 months. Before the team was disbanded, it rolled out updates several times a month and major new products every half-year. Researchers worry that the product’s instability could worsen during major events, as the computing load increases, said Cody Buntain, an assistant professor and social media researcher at the University of Maryland. “I would expect this load to change during the midterms,” Buntain said. “There’s legitimate concern about whether it will remain stable in the important time frame.”

Cameron Hickey, director of the Algorithmic Transparency Institute at the National Conference on Citizenship, said his group is putting together a comprehensive monitoring list of every candidate on the ballot in 2022, and that this list, which thousands of voter advocate volunteers nationwide have access to, lives on CrowdTangle. Meanwhile, Facebook has kept CrowdTangle closed off to groups dedicated to fighting misinformation on newly charged topics in the news, such as advocacy groups that want to combat abortion misinformation ahead of a major Supreme Court ruling that may overturn Roe v. Wade, he said.

“For a transparency and research tool, Facebook is not adding needed enhancements that would benefit the research and transparency community,” Hickey said. He cited long-standing bugs on the platform and missing features, such as the ability to filter for posts that have already been fact-checked by Facebook.

Meta said that when it is made aware of a potential issue on CrowdTangle, it addresses it as quickly as possible. It added that the company provides another dedicated tool for its third-party fact checkers to comb through its social media apps and label content that may be misleading.

Silverman, CrowdTangle’s former CEO, said that the research community the team worked with had long seen how impactful data sharing was, but that CrowdTangle had “struggled” with how to tell that story broadly, including inside Meta. “Over the last few months, I think that has started to shift,” he said in an interview. “There’s an increased recognition that getting to some baseline transparency has to be one of the first steps forward.”

The company has attempted to promote its other transparency reports, such as the Widely Viewed Content report it distributes every quarter, which was originally rolled out as a rebuttal to CrowdTangle data suggesting far-right personalities consistently dominate the platform. But researchers say a polished report from Meta isn’t as revealing as a tool they can use to ask their own questions. The company shelved the first content report it compiled when Facebook executives, including Alex Schultz, the company’s chief marketing officer, debated whether it would cause a public relations problem, according to the New York Times.

Most likely, insiders say, Facebook will roll out a tool that mimics some of the features of CrowdTangle without giving users full access to its original capabilities. The company has assigned its data transparency team to work on a replacement tool in a privacy-safe way, it said. So far, its efforts fall short, researchers say. Those who have access to a separate post-searching tool for academic research say it’s much less user-friendly. Buntain, the University of Maryland researcher, said that researchers who want to use it must know how to code to extract analysis from the data set, and that academics don’t have insight into how Meta compiles the data it provides.

In fact, researchers previously caught a mistake by Facebook when they found a discrepancy between the data it provided to its research community and the data it released publicly through its Widely Viewed Content report. The data provided to the researchers had left out about half of Facebook’s US users, including only those who engaged with political pages enough to make their political leanings clear. That incident showed “the value of multiple points of view into data,” Buntain said.

CrowdTangle is unmatched in “its usability, the speed with which you can get insights, and the ease with which you can get insights,” Buntain added. “That can’t be overstated.”

©2022 Bloomberg L.P.