Meta pulls back support for the tool used to fight misinformation


On May 17, as several states held their primary elections, Jesse Littlewood searched social media using a tool called CrowdTangle to spot false narratives he knew could skew perceptions of the results.

These included damaging stories about ballots being collected and cast en masse by unauthorized people, whom misinformation peddlers called “ballot mules.”

Littlewood, the vice president of campaigns for the voter advocacy group Common Cause, easily came across dozens of posts showing a ‘Wanted’ poster falsely accusing a woman in Gwinnett County, Georgia, of being a ballot mule.

He sounded the alarm with Facebook and Twitter.

“It was going to lead to threats and intimidation of this individual who might be an election worker, and there was no evidence that this person was doing anything illegal,” Littlewood said. “It had to be removed.”

Meta Platforms, Facebook’s parent company, owns the search tool Littlewood used, and it has kept its plans for CrowdTangle a mystery for months.

Meta has reduced its support for the product.

The company is expected to remove it eventually and declined to say when it plans to do so.

Not knowing CrowdTangle’s future, or what Meta might replace it with, puts planning for future elections at risk, Littlewood said.

His group has thousands of volunteers working in shifts to identify fake news online, mostly with CrowdTangle.

Erin McPike, spokeswoman for Meta, said the company would continue to support researchers, with plans to create “even more valuable” tools for them.

In response to researchers’ concerns, she said the company will keep CrowdTangle alive for at least the US midterms this year.

Election officials and voting rights advocates are bracing for a repeat of the flood of disinformation that engulfed the 2020 presidential race online and fueled real-world violence during the Jan. 6 insurrection.

Kate Starbird, associate professor at the University of Washington and co-founder of the Center for an Informed Public, said that if Facebook were to shut down CrowdTangle, she hopes the company will “create a viable alternative.”

Starbird said no such alternative exists so far. Even if one did, researchers and journalists would need time to rebuild their workflows around a new tool.

Failure to provide one would “significantly limit” the ability of researchers to help others counter misinformation in real time and could lead to voter manipulation.

Common Cause’s work “would be impossible to do without a tool that looks through Facebook,” Littlewood said.

CrowdTangle also surfaces posts on Instagram, Twitter, and Reddit.

“And we all know that the midterms are a proving ground for 2024, when the level of misinformation will be even higher.”

Researchers rely not only on the tool but also on the companies to respond to the reports of harmful content they file.

Twitter removed the misinformation Littlewood reported in May, but Facebook did not respond. At least 16 of the posts remained up as of mid-June.

Facebook then removed them after news outlets including ProPublica and Bloomberg News reached out.

McPike said that “the CrowdTangle product experience for the 2022 midterm elections remains the same as for the 2020 elections.”

But researchers are already seeing a difference, pointing to a buggier experience as the company has wound down support for the tool over the past few months.

In February, Meta launched an official internal process to shut down CrowdTangle.

However, the company put the plan on hold as the Digital Services Act, a landmark European law that aims to provide transparency into how Facebook, YouTube and other internet services amplify divisive content, gained traction, according to a person familiar with the matter.

CrowdTangle is still on track to eventually be shut down, the person said, with some Facebook engineers tasked with killing it.

Meta bought CrowdTangle in 2016, saying at the time that it wanted to help news publishers uncover how their content is performing on Facebook and Instagram, so they can improve their strategies.

A few months later, the company disclosed Russia’s campaign to influence the 2016 election through posts on social media.

As the public debated the spread of fake news online, CrowdTangle became a tool not only for understanding social media strategy but also for tracking manipulation.

The company has reportedly sought to publicly challenge the conclusions journalists and others have drawn from research conducted with CrowdTangle.

Executives could no longer bear to support a product that caused so many PR crises for Meta.

The CrowdTangle team within Meta was disbanded in the summer of 2021 as its dozens of employees quit or were reassigned to other parts of the company.

Meta also canceled a $40,000 grant that was intended to help two research partners use CrowdTangle data to understand the public debate around the covid-19 pandemic.

Brandon Silverman, the former CEO of CrowdTangle, left Facebook in October.

And in January this year, Meta “paused” new user access to CrowdTangle as it worked through what it said were staffing constraints.

It has not resumed onboarding new partners to the service.

Recently, fewer than five engineers from Facebook’s integrity team in London were working to keep CrowdTangle afloat, a person with knowledge of the matter said.

That leaves little support for the tens of thousands of organizations that use the tool in their work, including the world’s leading fact-checking organizations.

No new features have been added to CrowdTangle for over 16 months.

Before the team was disbanded, it released new updates several times a month and major new features every six months.

Researchers are concerned that product instability will worsen during major events as computing load increases, said Cody Buntain, assistant professor and social media researcher at the New Jersey Institute of Technology.

“I would expect that load to shift during the midterms,” Buntain said. “There is a legitimate concern as to whether it will remain stable during that critical time frame.”

Cameron Hickey, director of the Algorithmic Transparency Institute at the National Conference on Citizenship, said his group is compiling a comprehensive watchlist of every candidate on the ballot in 2022.

This list, which is accessed by thousands of volunteer voter advocates across the country, lives on CrowdTangle.

Meanwhile, Facebook has closed CrowdTangle to groups dedicated to fighting misinformation on emerging hot-button topics, such as advocacy groups that want to counter abortion misinformation ahead of a major Supreme Court decision that could overturn Roe v. Wade, he said.

“For a tool that is about transparency and research, Facebook is not adding necessary enhancements that would benefit the research and transparency community,” Hickey said.

He cited long-standing bugs in the platform and missing features, such as the ability to filter out posts that have already been fact-checked by Facebook.

Meta said that when it is notified of a potential issue with CrowdTangle, it resolves it as quickly as possible.

The company added that it provides another dedicated tool for its third-party fact-checkers to scour its social media apps and label content that may be misleading.

Silverman, CrowdTangle’s former CEO, said the research community the team worked with had long seen the impact of data sharing, but CrowdTangle had “struggled” to tell that story in a broad way, including inside Meta.

“Over the past few months, I think that’s started to change,” he said in an interview. “It is increasingly recognized that achieving some basic transparency must be one of the first steps forward.”

The company has tried to promote its other transparency reports, such as the widely viewed content report it distributes quarterly, which was originally rolled out to refute CrowdTangle data suggesting far-right figures consistently dominate the platform.

But researchers say a neat Meta report isn’t as revealing as a tool they can use to ask their own questions.

The company shelved the first content report it compiled when Facebook executives, including chief marketing officer Alex Schultz, debated whether it would cause a PR issue, according to The New York Times.

Most likely, insiders say, Facebook will roll out a tool that mimics some of CrowdTangle’s functionality without giving users full access to its original capabilities.

The company has tasked its data transparency team with working on a privacy-friendly replacement tool, McPike said.

So far, its efforts have fallen short, according to the researchers.

Those who have access to a separate tool Meta offers for academic research say it is much less user-friendly.

Buntain, a researcher at NJIT, said researchers who want to use it need to know how to code to extract analysis from the dataset, and academics don’t know how Meta compiles the data it provides.

In fact, researchers already caught an error on Facebook’s part when they found a discrepancy between the data provided to its research community and the data published publicly through its widely viewed content report.

The data provided to the researchers had left out about half of US Facebook users – those who engaged enough with political pages that their political leanings were clear.

This incident showed “the value of multiple perspectives on data,” Buntain said.

CrowdTangle is unmatched in “its user-friendliness, how quickly you can get information, and how easily you can get information,” Buntain added. “It cannot be underestimated.”
