Facebook's parent company Meta has been asked by its independent content moderation board to reconsider how it moderates COVID-19 misinformation, three years after the start of the pandemic.
Meta's Oversight Board released its first review of Meta's COVID-19 misinformation policy on Thursday. The review was launched to determine whether Facebook needed to continue removing specific claims about COVID-19. The third-party review organization concluded that the company should continue to moderate claims about COVID-19 for as long as the World Health Organization considers the virus an international public health emergency.
The board also decided the company "should begin a process to reassess each of the 80 claims it currently removes, engaging a broader set of stakeholders." These claims include whether COVID-19 exists, whether the virus caused as many deaths as public records indicate, and whether masks worsen the problem. The board also recommended that Meta "include a mechanism for dissenting views to be heard" in its review process so that differing views in the scientific community are represented.
"Meta should listen to a range of voices on this — including dissenting voices — to ensure the right to freedom of expression is protected online and to make sure we don't default to removing more content than is necessary," Meta Oversight Board Director Thomas Hughes said in a statement. "Such a measured approach can avoid opening the floodgates to harmful content without knowing who it will reach and the impact it might have."
The board also urged Meta to be more transparent about its communications with federal governments regarding COVID-19 misinformation. The recommendation comes as the Select Subcommittee on the Weaponization of the Federal Government scrutinizes Big Tech and federal agencies over their communications regarding misinformation.
The Oversight Board launched in October 2020 as a third-party organization meant to review the social media company's content moderation decisions. The board has reviewed several decisions, including the temporary ban of former President Donald Trump from Facebook. Its most recent decision overturned Facebook's removal of a post from the Iranian protests calling for the fall of the country's leadership.