META Announces Major Changes on Facebook and Instagram

META, the parent company of Facebook and Instagram, has announced that it will drastically change the content moderation rules on the two social networks.

The company will disband the teams that have so far handled fact-checking and replace the program with user-generated “community notes,” a system already used on X, META CEO Mark Zuckerberg said yesterday.

The announced changes will significantly alter the way posts, videos, and other content are moderated on Facebook and Instagram.

The changes were announced just before U.S. President-elect Donald Trump takes office. Trump and other Republicans have criticized Zuckerberg and META for what they see as censorship of right-wing views.

Fact-checkers were too politically biased and destroyed more trust than they created, Zuckerberg said in a video announcing the changes.

“What started as a movement for greater inclusivity increasingly became a tool to suppress opinions and exclude people with different ideas, and it has gone too far,” said Zuckerberg.

The META chief, however, admitted that the new policy would lead to more harmful content appearing on the platforms.

The changes in moderation mark a major shift in how META handles false and misleading claims on its platforms.

In 2016, the company launched an independent fact-checking program following claims that it had failed to prevent foreign actors from using its platforms to spread disinformation and sow discord among U.S. citizens. In the years since, it has continued to struggle with the spread of controversial content on its platforms, such as misinformation about elections and vaccines, as well as hate speech.

META created safety teams, implemented automated programs to filter or reduce the visibility of false claims, and launched a sort of “court” for moderation decisions, known as the Oversight Board.

But now Zuckerberg is following Elon Musk, who, after buying Twitter (now X) in 2022, disbanded the company’s fact-checking teams and introduced user-generated “community notes” as the sole method for correcting false claims on the platform.

The company also plans to adjust the automated systems that identify rule violations, which it says have resulted in “over-censoring” too much content. The systems will now focus only on detecting illegal and “major” violations such as terrorism, child sexual exploitation, drugs, and fraud; other issues will be reviewed only after users report them.

Zuckerberg said yesterday that META’s complex content moderation systems have mistakenly removed too much content that does not violate the rules. For example, if the systems err in even 1% of cases, that could affect tens of millions of the company’s more than 2 billion users.
