Tow Center

Get Noted

As tech companies further embrace Community Notes, the Tow Center speaks to contributors and experts about persistent issues.

June 13, 2025

Last week, Axios reported that X is testing a new program to reduce polarization on the platform by spotlighting posts that are popular among users who normally disagree with each other. The algorithm builds on X’s Community Notes program, which Elon Musk said plays a “big part” in combating disinformation on X. After acquiring Twitter in October 2022, Musk laid off a significant portion of its Trust and Safety team, both contracted and in-house. In January this year, Meta followed suit, ending its partnership with third-party fact-checkers in the US and replacing them with a Community Notes model that is now being rolled out across its platforms.

While drawing attention to posts that people with otherwise differing opinions “like” or engage with could conceivably help tech companies tackle polarization, these kinds of programs still fail to address misinformation on divisive topics, because they rely on people with differing opinions and motivations agreeing that a post is misleading before a note is made visible. And that often doesn’t happen.

A look at the almost two million Community Notes that X makes available to download shows that the vast majority of notes never end up being displayed, stuck instead in the limbo of the “Needs More Ratings” category, where notes go when raters who disagree on an issue don’t reach a consensus. This can result in long delays or, as the Tow Center has previously investigated, a failure to fact-check divisive misinformation altogether. But our new reporting also finds efforts by users on Telegram to push fact-checks that had achieved an overall “Helpful” rating (and were therefore displayed to users) back into the “Needs More Ratings” category, where they would not be seen. Volunteers who contribute to the program, as well as experts we spoke with, explained the myriad ways that misleading and unverified claims go unchecked by Community Notes.
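
How that consensus is computed matters. X has open-sourced its Community Notes ranking code, and at its core (per X’s documentation) is a “bridging-based” matrix factorization: every rating is modeled as a global intercept plus a per-user intercept, a per-note intercept, and the product of a user factor and a note factor, and a note is shown as “Helpful” only when its learned intercept clears a threshold, which in practice requires positive ratings from users on both ends of the factor (viewpoint) axis. The sketch below is a minimal toy version of that idea, not X’s production code; the data, regularization values, and cutoffs are invented for illustration.

```python
# Toy bridging-based scorer in the spirit of Community Notes ranking.
# Illustrative only: X's open-source scorer adds many further rules
# (tag penalties, score stabilization, rater-reputation weighting).

import numpy as np

def fit(ratings, epochs=4000, lr=0.05, reg=0.03, seed=0):
    """SGD fit of: rating ~ mu + user_int + note_int + user_fac * note_fac.

    `ratings` is a users x notes array; 1.0 = rated helpful,
    0.0 = rated not helpful, NaN = no rating."""
    rng = np.random.default_rng(seed)
    n_users, n_notes = ratings.shape
    mu = 0.0
    user_int, note_int = np.zeros(n_users), np.zeros(n_notes)
    user_fac = rng.normal(0, 0.1, n_users)  # where each rater sits on the axis
    note_fac = rng.normal(0, 0.1, n_notes)  # how polarizing each note is
    obs = [(u, n) for u in range(n_users) for n in range(n_notes)
           if not np.isnan(ratings[u, n])]
    for _ in range(epochs):
        for u, n in obs:
            err = ratings[u, n] - (mu + user_int[u] + note_int[n]
                                   + user_fac[u] * note_fac[n])
            mu += lr * err
            user_int[u] += lr * (err - reg * user_int[u])
            note_int[n] += lr * (err - reg * note_int[n])
            uf = user_fac[u]  # cache before updating
            user_fac[u] += lr * (err * note_fac[n] - reg * user_fac[u])
            note_fac[n] += lr * (err * uf - reg * note_fac[n])
    return mu, user_int, note_int, user_fac, note_fac

H, N = 1.0, 0.0  # rated helpful / rated not helpful
ratings = np.array([  # note0: cross-camp consensus; note1, note3: split
    [H, H, N, N],     # three "camp A" raters
    [H, H, N, N],
    [H, H, N, N],
    [H, N, N, H],     # three "camp B" raters
    [H, N, N, H],
    [H, N, N, H],
])

mu, user_int, note_int, user_fac, note_fac = fit(ratings)
for n in range(ratings.shape[1]):
    bridged = note_int[n] >= 0.3 and abs(note_fac[n]) < 0.5  # toy cutoffs
    status = "Helpful" if bridged else "Needs More Ratings"
    print(f"note{n}: intercept {note_int[n]:+.2f}, "
          f"factor {note_fac[n]:+.2f} -> {status}")
```

Only note0, which both camps rated up, earns a high intercept with a near-zero factor; the notes that split along camp lines are explained by the factor term instead, so their intercepts stay low and they sit in “Needs More Ratings” no matter how many ratings they accumulate.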

One contributor we spoke with, Zach Lewis, a twenty-one-year-old software engineer from California, has participated in the Community Notes program for more than a year. He joined after some of his friends fell for misinformation online during the COVID-19 pandemic and began taking hydroxychloroquine, a drug that President Trump, his son, and Fox News touted as a treatment for the virus, despite warnings from officials and medical experts about its ill effects and the lack of evidence of its efficacy. “I’ve had people affected by that [misinformation] in my personal life, and I kind of didn’t want to see that,” said Lewis.

But not everyone authorized to write community notes is as well-intentioned as Lewis.

Using Junkipedia, a web monitoring and analysis tool, we were able to find actors who tried to use Telegram to purge fact-checks from their misleading posts.

One such channel on Telegram, called Shadow of Ezra, which has the same username on X, asked subscribers in March to downvote a note on one of its X posts about former First Lady Jacqueline Kennedy. The post claimed that Kennedy, in a now well-known phone conversation with former president Lyndon Johnson, was heard using “vitamin B” as a code for consuming methamphetamine.

A note refuting the claim was proposed below the post. However, in a now-deleted Telegram post, Shadow of Ezra told users, “Please rate this community note as not helpful,” despite there being no credible evidence that Jacqueline Kennedy sought methamphetamine under the guise of vitamin B shots.

The note was proposed the same day as the post and achieved a “Helpful” rating about three hours later. Shadow of Ezra then posted on its Telegram channel of 160,000-plus subscribers, asking them to downvote the note. After about twenty-four hours, the note lost its “Helpful” status; it is currently rated “Needs More Ratings.”
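
It is impossible to see from the outside exactly how those downvotes were weighed, but the toy scorer sketched above can illustrate the direction of the pressure. In the hypothetical below, twenty coordinated accounts each rate the bridged note “not helpful,” along with one uncontroversial rating apiece for cover; under the toy model’s cutoffs, the note no longer qualifies as “Helpful.” The production scorer weighs rating history and other signals that this sketch omits, so this shows the mechanism of the attack, not its real-world success rate.

```python
# Continuing the toy scorer above: a coordinated bloc downvotes note0,
# the bridged note. Each hypothetical account also rates note2 (which
# everyone agrees is unhelpful) so it is not a single-rating account.
# Purely illustrative; the production scorer has defenses this omits.
brigade = np.full((20, ratings.shape[1]), np.nan)
brigade[:, 0] = 0.0  # the coordinated "not helpful" votes
brigade[:, 2] = 0.0  # agreeing with consensus elsewhere, for cover
attacked = np.vstack([ratings, brigade])

_, _, note_int2, _, note_fac2 = fit(attacked)
print(f"note0 before: intercept {note_int[0]:+.2f}, factor {note_fac[0]:+.2f}")
print(f"note0 after:  intercept {note_int2[0]:+.2f}, factor {note_fac2[0]:+.2f}")
# After the bloc weighs in, note0 no longer clears the toy "Helpful"
# cutoffs: its intercept is dragged down and/or its factor inflates,
# pushing it back toward "Needs More Ratings."
```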

The Tow Center found at least twenty other instances in which Shadow of Ezra and another channel, called The General, urged subscribers to rate Community Notes under their posts as unhelpful.

Some Junkipedia search results for the keywords “please rate this community note”

The Tow Center reached out to X and Shadow of Ezra for comment but had not received a response by the time of publication.

Today, X has over a million contributors across the world writing notes in multiple languages. The program is also touted by Musk as the “best source of truth on the internet,” despite mounting evidence that the platform’s efforts to fight misinformation fall short in a variety of ways.

“The problem is the approach,” said Isabel Holmes, a researcher at the Center for Countering Digital Hate (CCDH), a UK-based nonprofit that studies online hate and misinformation on social media platforms, including X. (Musk sued CCDH in 2023 for its reports about the platform’s failure to act on mis- and disinformation, a complaint that was later dismissed by a California court.) Holmes suggested that X should probably have a “narrower user base” of contributors vetted by the platform.

Lewis, who joined Community Notes to help combat harmful false health-related posts, is now seeing another misinformation cycle on X, this time related to the measles vaccine.

Following outbreaks across several US states, anti-vax accounts on X have been sharing unverified claims to stoke vaccine skepticism among users. A verified account, @stopvaccinating (over 130,000 followers), which regularly posts vaccine misinformation, has been fact-checked by the community only a handful of times. One fact-check appeared twelve days after the account published a misleading post about vaccines. Some accounts that post blatant misinformation about vaccines don’t get noted at all. One post by the user @toobaffled, for instance, claims that the CDC reports 2,659,050 vaccine injuries; it garnered upwards of 25,000 views in a single day. But it lacks vital context, and no note had been proposed below the post as of the publication of this article.

A fact-check published by Science Feedback, a publication led by scientists and science journalists dedicated to debunking health- and climate-related misinformation online, calls the claim misleading. It adds much-needed context: citing the number of reports from the US Vaccine Adverse Event Reporting System (VAERS) in this way “represents a misuse of the database, which the official VAERS website warns against before granting access to any data.” The fact-checkers note that “reports alone don’t demonstrate that the vaccine caused the events” and that providers were required, in the case of COVID-19 vaccinations, to report “all serious adverse events that occurred following vaccination—including hospitalization and death—‘regardless of causality.’”

Science Feedback is certified by Poynter’s International Fact-Checking Network (IFCN), a collective of over 150 publications that combat misinformation globally, many of which were part of Meta’s third-party fact-checking program before being sidelined for Community Notes.

Despite persistent issues with these kinds of programs, some contributors believe that Community Notes helps all users participate equally. “I like the idea behind Community Notes.… The problem is the execution,” said a former Community Notes contributor from Norway who has since left the platform because he disapproves of the way it has changed.

Holmes still believes platforms should have a traditional moderation system combined with a notes program and that content moderation should be a paid job. “I think moderating is work and it’s not right to have unpaid volunteers,” she said. “You won’t get the level of work you need to keep a safe platform without paying people.”

About the Tow Center

The Tow Center for Digital Journalism at Columbia's Graduate School of Journalism, a partner of CJR, is a research center exploring the ways in which technology is changing journalism, its practice and its consumption — as we seek new ways to judge the reliability, standards, and credibility of information online.
