US Elections: Compilation of Misinformation
Facebook owner Meta is quietly curtailing some of the safeguards designed to prevent voting misinformation or foreign interference in US elections as the November midterm vote approaches.
It’s a marked departure from the social media giant’s multibillion-dollar efforts to improve the accuracy of US election posts and regain the trust of voters, legislators and the public after they were outraged to learn the company was mining people’s data and allowing misleading claims to flood its site during the 2016 campaign.
The pivot is raising alarm about Meta’s priorities and about how some may exploit the world’s most popular social media platforms to spread misleading claims, create fake accounts and incite partisan extremists.
“They’re not talking about it,” said former Facebook policy director Katie Harbath, now CEO of the technology and policy firm Anchor Change. “Best case scenario: They’re still doing a lot of things behind the scenes. Worst case scenario: They pull back, and we don’t know how that will play out for the midterms on the platforms.”
Since last year, Meta has shut down an examination into how falsehoods are amplified in political ads on Facebook by indefinitely banishing the researchers involved from the site.
CrowdTangle, the online tool the company makes available to hundreds of newsrooms and researchers so they can identify trending posts and misinformation on Facebook or Instagram, is now inoperable on some days.
Public communication about the company’s response to election misinformation has gone quiet. Between 2018 and 2020, the company released more than 30 statements detailing how it would stifle misinformation about US elections, prevent foreign adversaries from running ads or posts around the vote, and reduce divisive hate speech.
Top executives held Q&A sessions with reporters about the new policies. Chief Executive Officer Mark Zuckerberg wrote Facebook posts promising to take down voting misinformation and authored opinion articles calling for more regulation to address foreign interference in American elections through social media.
But this year Meta has released only a one-page document outlining its plans for the fall election, even as potential threats to the vote remain clear. Several Republican candidates are pushing false claims about US elections on social media, and Russia and China continue to run aggressive social media propaganda campaigns aimed at deepening political divisions among American audiences.
Meta says that elections remain a priority and that the policies developed in recent years around election misinformation and foreign interference are now built into the company’s everyday operations.
“With each election, we incorporate what we’ve learned into new processes and have established channels to share information with our government and industry partners,” Meta spokesman Tom Reynolds said.
He declined to say how many employees will work full-time on protecting the US election this year.
During the 2018 election cycle, the company offered tours and photos of its election response war room and produced head counts of its staff. But The New York Times reported that the number of Meta employees working on this year’s election has been cut from 300 to 60, a figure Meta disputes.
Reynolds said Meta will pull in hundreds of employees across 40 other company teams to monitor the upcoming vote alongside the election team, though he did not specify how many.
The company is continuing many of the initiatives it developed to curb election misinformation, such as a fact-checking program started in 2016 that enlists news outlets to investigate the veracity of viral falsehoods spreading on Facebook or Instagram. The Associated Press is part of Meta’s fact-checking program.
This month, Meta also rolled out a new feature for political ads that lets the public see how advertisers target people based on their interests on Facebook and Instagram.
However, Meta has blocked other efforts to identify election misinformation on its sites.
It has stopped making improvements to CrowdTangle, the tool it offered to newsrooms around the world that provides insights into trending social media posts. Journalists, fact-checkers and researchers have used the tool to analyze Facebook content, including tracing widespread misinformation and identifying who is responsible for it.
That tool is now “dead,” former CrowdTangle CEO Brandon Silverman, who left Meta last year, told the Senate Judiciary Committee this spring.
Silverman told the AP that CrowdTangle had been working on upgrades to make it easier to identify the text within memes, which can often be used to spread half-truths and evade the scrutiny of fact-checkers.
“There’s really no shortage of ways you can organize this data to make it useful to various parts of the fact-checking community, newsrooms, and broader civil society,” Silverman said.
Silverman said not everyone at Meta agreed with that transparent approach. The company hasn’t rolled out any updates or new features to CrowdTangle in over a year, and the tool has experienced hours-long outages in recent months.
Meta has also halted efforts to investigate how misinformation spreads through political ads.
The company indefinitely revoked the Facebook access of a pair of New York University researchers who it alleged collected unauthorized data from the platform. The move came hours after NYU professor Laura Edelson said she had shared with the company her plan to investigate the spread of misinformation about the Jan. 6, 2021, attack on the US Capitol, which is currently the subject of a House investigation.
“What we found, when we looked closely, was that their system was probably dangerous for a lot of their users,” Edelson said.
Privately, former and current Meta employees say that exposing those dangers around US elections has generated public and political backlash against the company.
Republicans routinely accuse Facebook of unfairly censoring conservatives, some of whom have been kicked off the platform for violating the company’s rules. Meanwhile, Democrats regularly complain that the tech company hasn’t gone far enough to curb misinformation.
“It’s something that’s so politically fraught, they’re more trying to shy away from it than jump in head first,” said Harbath, the former Facebook policy director. “They just see it as a big old pile of headaches.”
Meanwhile, the possibility of regulation in the US no longer looms over the company, with lawmakers failing to reach any consensus on what oversight the multibillion-dollar company should be subject to.
Free from that threat, Meta’s leaders have devoted the company’s time, money and resources to a new project in recent months.
Zuckerberg dove into a rebranding and reorganization of Facebook last October, renaming the company Meta Platforms Inc. and shifting its focus to the metaverse, which he has described as the internet brought to life, rendered in 3D.
Public posts on his Facebook page now focus on product announcements, praise for artificial intelligence, and photos of him enjoying life. News of the company’s election preparations is announced in corporate blog posts, not written by him.
In an October post, written after a former Facebook employee leaked internal documents showing how the platform magnified hate and misinformation, Zuckerberg defended the company. He also reminded his followers that he had pushed Congress to modernize election regulations for the digital age.
“I know it’s frustrating to see the good work we do get mischaracterized, especially for those of you who are making important contributions across safety, integrity, research and products,” he wrote on Oct. 5. “But I believe that in the long run, if we keep trying to do what’s right and deliver experiences that improve people’s lives, it will be better for our community and our business.”
It was the last time he discussed the Menlo Park, California-based company’s election work in a public Facebook post.
Associated Press technology journalist Barbara Ortutay contributed to this report.