By Rashad Grove
Ebony
https://www.ebony.com/
Mark Zuckerberg, CEO of Meta, announced that the platform was ending its fact-checking system to promote “free expression.”
The internet can often be a chaotic place. With so much information and perhaps even more misinformation, guardrails can help protect those who are seeking truth.
Adding to the chaos, Mark Zuckerberg announced that Meta will be ending fact-checking on its platform. Fact-checking will be replaced with a community-driven system similar to the one Elon Musk deploys on X (formerly Twitter). According to Zuckerberg, the policy change will curb censorship and “get back to our roots around free expression.”
“We’ve seen this approach work on X—where they empower their community to decide when posts are potentially misleading and need more context, and people across a diverse range of perspectives decide what sort of context is helpful for other users to see,” Zuckerberg said in a video on Facebook. “We think this could be a better way of achieving our original intention of providing people with information about what they’re seeing—and one that’s less prone to bias.”
“Some people believe giving more people a voice is driving division rather than bringing us together,” he continued. “More people across the spectrum believe that achieving the political outcomes they think matter is more important than every person having a voice. I think that’s dangerous.”
Per the new policy, posts will be monitored by other users, who will append context through Meta’s Community Notes. Meta also noted that it plans to continue monitoring “content related to drugs, terrorism, child exploitation, frauds and scams.”
In addition to removing fact-checkers, Meta will remove restrictions on LGBTQIA+ issues, immigration and gender.
Joel Kaplan, Meta’s new chief global affairs officer and a staunch conservative, welcomed the changes, saying the platform had been “too restrictive.”
The question becomes, how will these changes impact Black users on Meta?
Andre E. Johnson, Professor of Rhetoric and Media Studies at the University of Memphis and Director of the Center for the Study of Rhetoric, Race and Religion, believes that Meta’s new policy will have a dramatic impact on Black people.
“Meta’s recent announcement is likely to have significant negative repercussions for African Americans, mainly because numerous disinformation campaigns are deliberately aimed at this community,” Johnson argued. “These targeted efforts often spread harmful narratives and misinformation, exacerbating challenges that already exist within African American communities.”
Johnson also described how political propaganda has been targeted at Black audiences on the platform.
“The 2019 report from the Senate Intelligence Committee brought to light significant findings regarding the nature of Russian propaganda during the 2016 election campaign. It revealed that a considerable portion of this propaganda was specifically aimed at African American communities, with the intent to dissuade them from participating in the electoral process,” Johnson continued.
“Moreover, the report pointed out the use of insidiously racist content designed to foster discord and exacerbate tensions between different ethnic groups, highlighting the strategic manipulation of social issues to influence public perception and create division within society,” Johnson added.
Nicole Austin-Hillery, President and CEO of the Congressional Black Caucus Foundation (CBCF), and Dr. Jonathan Cox, Vice President of the Center for Policy Analysis & Research (CPAR) at the CBCF, also stated that Meta’s new policy will harm the Black community.
“Based on the Congressional Black Caucus Foundation’s (CBCF) research, it seems clear that removing fact-checking from platforms like Meta and others will have disproportionate impacts on Black communities, likely resulting in the proliferation of further misinformation,” Austin-Hillery and Cox noted. “One real-life consequence is Black communities receiving incorrect information (or no information) about medical issues that have detrimental outcomes, as we witnessed with COVID.
“The speed at which misinformation can spread on social media makes it even more difficult to correct, and the level of exposure to misinformation is a further obstacle to this correction. Black people are particularly vulnerable to misinformation on social media in part because they are the racial group that uses social media more than all others,” they continued.
“Data show that during the pandemic, official information about COVID-19 reached fewer Black people on Facebook than any other racial group; this highlights the importance and impact of algorithmic bias,” they stated. “One policy recommendation is to improve digital literacy in Black communities, for example, by including training on how to discern credible from non-credible sources online.”
A closer review of Meta’s updated policy guidelines reveals that the platform now explicitly allows users to call each other mentally ill based on sexual orientation, gender identity, and other identifiers.
Many believe the move is part of Meta’s effort to get in good with President-elect Trump, who had claimed that the company was biased against conservatives. Last month, Meta donated $1 million to Trump’s inauguration and appointed Dana White, CEO of Ultimate Fighting Championship and a big Trump supporter, to the company’s board of directors.
While fact-checking may not have been the answer for every ill of social media, it at least offered some protections for oppressed and minoritized groups who are targeted and triggered on platforms.
Nina Jankowicz, former head of a disinformation board within the Department of Homeland Security, noted that fact-checking kept the platform from going completely topsy-turvy.
“The fact-checking program was never going to save Facebook, but it was the last bulwark to complete chaos on the platform,” Jankowicz said. “And now Mark Zuckerberg is choosing chaos.”