Researchers warn that social media companies are lowering their defenses against foreign disinformation campaigns

If there was one thing lawmakers and social media companies agreed on in 2017, it was that no one — not the companies, not the government, and not the public — was fully prepared for how foreign adversaries might use social media networks to influence the American public during an election year. Eight years later, that consensus may have marked the high point of efforts to actually address the problem.

On Wednesday, Facebook parent company Meta will shut down CrowdTangle, a data tool that helps researchers, journalists and other observers spot disinformation and misinformation trends on its platforms. Experts from several organizations warn that the move, combined with other decisions by social media companies to scale back data monitoring and trust and safety teams, will make it much harder to combat lies spread by hostile forces.

Meta announced last month that it would replace CrowdTangle with the “Meta Content Library,” a less powerful tool that will not be made available to media companies.

That will help China, Russia and other autocratic countries seeking to sow political division in the United States, according to Nathan Doctor, senior manager of digital methods at the Institute for Strategic Dialogue.

“With these kinds of foreign influence campaigns, the most important thing is probably to keep an eye on them. Otherwise, as we see, they can flourish. So if access to data … dries up a little bit, it becomes a lot harder in some cases to identify these kinds of things and then be responsive to them,” Doctor said Monday during an online event for the Center for American Progress.

Meta’s decision follows similar moves by other social media companies. After Elon Musk took over Twitter, now X, in 2022, he disbanded its team that monitored foreign disinformation. Snapchat and Discord have since cut their trust and safety teams by 20 to 30 percent, according to Priyanjana Bengani, a computational journalism fellow at Columbia University’s Tow Center for Digital Journalism.

But Meta is far bigger than X. Facebook has 3 billion monthly active users and Instagram, also owned by Meta, has 2 billion, while X has about 600 million. CrowdTangle’s closure will be a major blow to journalists and others who seek to understand how disinformation spreads, Davey Alba, a technology reporter for Bloomberg, said Monday.

And that is particularly alarming now, as the United States prepares for a presidential election that foreign actors are trying to influence with disinformation.

“If this tool is shut down, it will make it even more difficult for us to do our work in the run-up to the US elections,” Alba said.

US lawmakers are also growing increasingly concerned. In July, 17 of them sent a letter urging Meta to reconsider its decision, but to no avail.

That shows how little accountability social media companies feel toward policymakers, the press and the public, Brandi Geurkink, director of the Coalition for Independent Technology Research, said Monday.

“In perhaps the biggest global election year ever, the fact that a company … can announce its intention to make such a decision and then face such a groundswell of opposition from civil society around the world, from lawmakers in the United States and Europe, from journalists, you name it, and still go ahead with this decision and not really respond to the criticism — that to me is the most concerning thing,” she said.

It’s a far cry from the conversations social media companies and lawmakers had eight years ago, after the revelations of a Russian campaign to influence the U.S. presidential election. In late 2017, officials from top social media companies appeared before Congress to offer mea culpas.

“This is the national security challenge of the 21st century,” Senator Lindsey Graham said at the time.

Colin Stretch, Facebook’s general counsel, said he shared those concerns.

“In hindsight, we should have had a wider lens. There were signals we missed,” Stretch said in his testimony.

According to Geurkink, Meta’s decision to shut down CrowdTangle belies its promises to combat state-sponsored influence campaigns and other disinformation.

“I think what I see is dishonesty in the way that they’re putting it out there publicly,” she said. “In the last election cycle, they trained political parties, they trained NGOs on how to use this tool to do really good election monitoring work around the election. So it’s very much like bait and switch.”

Bengani said many more signals will certainly be missed now.

“I think we’re almost at the point where it’s worse than 2016,” she said.

At the time, Twitter and Reddit offered public APIs — programming interfaces that let researchers pull data directly from the platforms — which allowed observers to track trends on social media. Other companies also funded projects to moderate online conversations and track foreign influence.
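For illustration only — this is not something described in the article — here is a minimal sketch of the kind of public-API access researchers relied on, using Reddit’s public JSON listing endpoint to pull recent posts and count how often a keyword appears in their titles. The subreddit, keyword and user-agent string are assumed placeholders, and real monitoring projects layered far more sophisticated analysis on top of feeds like this.

```python
# Minimal sketch (not from the article) of public-API trend monitoring:
# fetch recent posts from a subreddit via Reddit's public JSON listing
# and count keyword mentions. Subreddit and keyword are assumed examples.
import requests

SUBREDDIT = "news"      # assumed example subreddit
KEYWORD = "election"    # assumed example search term

def fetch_recent_posts(subreddit: str, limit: int = 100) -> list[dict]:
    """Fetch the newest posts from a subreddit's public JSON listing."""
    url = f"https://www.reddit.com/r/{subreddit}/new.json"
    resp = requests.get(
        url,
        params={"limit": limit},
        headers={"User-Agent": "trend-monitor-sketch/0.1"},  # Reddit expects a descriptive UA
        timeout=10,
    )
    resp.raise_for_status()
    return [child["data"] for child in resp.json()["data"]["children"]]

def count_keyword(posts: list[dict], keyword: str) -> int:
    """Count posts whose title mentions the keyword (case-insensitive)."""
    return sum(keyword.lower() in post.get("title", "").lower() for post in posts)

if __name__ == "__main__":
    posts = fetch_recent_posts(SUBREDDIT)
    hits = count_keyword(posts, KEYWORD)
    print(f"{hits} of {len(posts)} recent r/{SUBREDDIT} posts mention '{KEYWORD}'")
```

CrowdTangle played a similar role for public Facebook and Instagram content at far larger scale, which is why researchers say its loss is so hard to replace.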

Countries can pass laws to force transparency from tech companies, but that will likely lead to a patchwork of practices and perceptions around the world, she said. The end result: It will be harder for anyone to know what’s real and what’s not online. She called it a “weird, fragmented transparency ecosystem that makes tracking trends or research across geographies incredibly difficult.”
