Facebook is a 'digital gangster': Fake news report published

MPs calling for a code of ethics to ensure social media platforms remove harmful content from their sites have branded Facebook "digital gangsters" in a parliamentary report.

The digital, culture, media and sport committee published its findings after an 18-month investigation into disinformation and fake news and accused Facebook of obstructing its inquiry.

The report warned the social media giant was using its dominance to crush rivals and prevent other platforms from competing with it.

The committee said internal Facebook documents it obtained showed the tech firm "violated" laws by selling people's private data without their permission.

"Companies like Facebook should not be allowed to behave like 'digital gangsters' in the online world, considering themselves to be ahead of and beyond the law," the report warned.

MPs also said democracy was at risk from the "malicious and relentless" targeting of citizens with disinformation.

The committee expressed concerns about "dark adverts" from unidentifiable sources, as they called for reform to electoral communications laws.

MPs said ethics guidelines need to set out what is not acceptable on social media, including harmful and illegal content that has been referred to the platforms by users or identified by the companies themselves.

The committee wrote: "Social media companies cannot hide behind the claim of being merely a 'platform' and maintain that they have no responsibility themselves in regulating the content of their sites."

The report rounded on Facebook chief Mark Zuckerberg, who was accused of showing "contempt" towards the committee by choosing not to appear before it last year.

They said the social networking site did not seem willing to be regulated or scrutinised, and claimed its "opaque" structure seemed to be designed to "conceal knowledge of and responsibility for specific decisions".

The proposed code of ethics, which would also cover illegal content, would be overseen by an independent regulator with the power to launch legal action against companies who breach it.

The regulator could issue large fines against social media giants such as Facebook and Twitter under the move.

Shadow culture secretary Tom Watson said the "era of self-regulation for tech companies must end immediately".

Facebook has come under intense pressure over some of its business practices in the last year.

It follows the Cambridge Analytica scandal, a series of data breaches and concerns over fake news and other content on the site.

The committee's final report into disinformation and fake news also said electoral law was "not fit for purpose" and should be updated to reflect the move to "microtargeted" online political campaigning.

MPs called for a comprehensive review of the current rules and regulations surrounding political work during elections and referenda, and separately urged the government to put pressure on social media companies to publicise instances of disinformation.

Ministers were also asked to reveal how many investigations are being carried out into Russian interference in UK politics.

The government is expected to publish a white paper later this year on proposals to reform laws to make the internet and social media safer.

Tory MP and committee chairman Damian Collins said: "Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised 'dark adverts' from unidentifiable sources, delivered through the major social media platforms we use every day.

"Much of this is directed from agencies working in foreign countries, including Russia.

"The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.

"Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers."

Mr Watson said: "Labour agrees with the committee's ultimate conclusion: the era of self-regulation for tech companies must end immediately.

"We need new independent regulation with a tough powers and sanctions regime to curb the worst excesses of surveillance capitalism and the forces trying to use technology to subvert our democracy.

"Few individuals have shown contempt for our parliamentary democracy in the way Mark Zuckerberg has."

A government spokesman said: "The government's forthcoming white paper on online harms will set out a new framework for ensuring disinformation is tackled effectively, while respecting freedom of expression and promoting innovation.

"This week the culture secretary will travel to the United States to meet with tech giants including Google, Facebook, Twitter and Apple to discuss many of these issues."

An Electoral Commission spokesman added: "We agree that reform of electoral law is urgently needed.

"The UK's government must ensure that the tools used to regulate political campaigning online continue to be fit for purpose in a digital age."

Karim Palant, from Facebook UK's public policy department, said: "We share the committee's concerns about false news and election integrity and are pleased to have made a significant contribution to their investigation over the past 18 months, answering more than 700 questions and with four of our most senior executives giving evidence.

"We are open to meaningful regulation and support the committee's recommendation for electoral law reform. But we're not waiting. We have already made substantial changes so that every political ad on Facebook has to be authorised, must state who is paying for it, and is then stored in a searchable archive for seven years. No other channel for political advertising is as transparent or offers the tools that we do.

"We also support effective privacy legislation that holds companies to high standards in their use of data and transparency for users.

"While we still have more to do, we are not the same company we were a year ago. We have tripled the size of the team working to detect and protect users from bad content to 30,000 people and invested heavily in machine learning, artificial intelligence and computer vision technology to help prevent this type of abuse."