Facebook's latest pledge to curb climate change disinformation on its platform is getting a skeptical reception from activists and industry watchdogs.
Last week, the social media giant announced it would seek to promote accurate information about global warming and will donate $1 million "to support organizations working to combat climate misinformation." In addition, Facebook said it would launch a page titled the "Climate Science Information Center" that will include "new features, like quizzes, to better inform and engage our community on climate change."
Some environmental experts and other critics of the company call these attempts underwhelming.
"For a company that makes $85 billion a year, a $1 million program that outsources the problem they’ve created shows that Facebook is not serious about solving climate disinformation," Michael Khoo, a spokesman for Friends of the Earth, said in a statement.
Friends of the Earth, a nonprofit that promotes environmental causes, has pressured Facebook to stop the spread of climate change misinformation. In March, it joined with 13 other environmental groups to send Facebook CEO Mark Zuckerberg a letter demanding that the company get serious about curbing persistent myths about the scientific consensus that human-made greenhouse gas emissions are warming the planet.
"Climate change disinformation is spreading rapidly across Facebook's social media platform, threatening the ability of citizens and policymakers to fight the climate crisis," the letter said, asking the company to commit to specific targets for stopping the spread of user-generated climate disinformation.
In its announcement last week, Facebook downplayed the prevalence of climate change misinformation on its platform, even while unveiling new measures to address it.
"We're taking steps to make sure people have access to reliable information while reducing climate misinformation, even as it makes up a small amount of the overall climate content on our apps," the company said.
Facebook's business model of user-generated content is only part of the problem, activists say. Another issue is the company's willingness to run paid advertising from fossil fuel companies.
Last month, a report by Influence Map, a think tank that tracks corporate climate lobbying, found that oil and gas companies spent millions last year on Facebook ads that received more than 431 million views on the platform.
“The company often talks about its commitment to tackling climate change, but it continues to allow its platform to be used by the fossil fuel sector to undermine science-based climate action,” Faye Holder, program manager at Influence Map, said in a statement.
Some former Facebook employees believe that the company needs to re-examine the ethics, as well as the consistency, of its messaging on climate change.
"Despite Facebook's public support for climate action, it continues to allow its platform to be used to spread fossil fuel propaganda," Bill Weihl, Facebook's former director of sustainability, said in a statement that accompanied the release of the Influence Map report. "Not only is Facebook inadequately enforcing its existing advertising policies, it's clear that these policies are not keeping pace with the critical need for urgent climate action. If Facebook is serious about its climate commitments, it needs to rethink whether it's willing to keep taking the money of fossil fuel companies."
Democrats on the House Oversight Committee are currently investigating how the fossil fuel industry spreads misinformation about its products' role in causing climate change.
Facebook says its plan to rely on fact-checkers will help tamp down misinformation on the site. That claim is challenged, however, by a recent Friends of the Earth analysis of the myths spread on the platform about February's widespread power outages in Texas.
"While mainstream and local media debunked the myth that renewable energy was to blame for the outages, right-wing outlets and fossil fuel-funded interests exploited social media to spread disinformation," the report stated. "Our disinformation analysis found that only 0.9% of the interactions with the analyzed posts carried a fact-checking label. The case reveals how Facebook and the other platforms' fact-checking programs are at best ineffective, at worst a deflection from a real solution."