Meta is rolling back its covid-19 misinformation rules in the US

Meta is rolling back its covid misinformation rules in countries like the US, where the pandemic's national emergency status has been rescinded, as its independent Oversight Board recommended in April of this year, The Washington Post reported Friday morning (via Engadget).

In an update to its July announcement that it had asked the Oversight Board to weigh in on whether it was safe to do so, Meta cited the end of the World Health Organization's global public health emergency declaration as the reason for the change:

Our Covid-19 misinformation rules will no longer be in effect globally as the global public health emergency declaration that triggered those rules has been lifted.

Now, the company says it will be tailoring its rules by region. On its transparency center page addressing the board's recommendations, Meta says that, because the WHO has downgraded the pandemic's emergency status, it won't be directly addressing some of the board's concerns.

Among those concerns is advice that Meta reassess what misinformation it removes and take steps to increase transparency about government requests to remove covid content. Instead, Meta says its response to the board’s fourth recommendation — that the company create a process to determine the risk of its misinformation moderation policies — addresses the spirit of the first recommendation. It says it will be “consulting with internal and external experts” to gauge covid’s status around the globe and will share details about localized enforcement in “future Quarterly Updates.”

The WHO ended its global emergency declaration on May 5th, 2023, roughly six months after Twitter stopped enforcing its own covid misinformation rules in November 2022, shortly after Elon Musk bought the company. Both TikTok and YouTube continue to maintain policies around covid misinformation, though YouTube recently changed its rules around election misinformation.
