Yesterday Open Democracy reported on Investigate Europe’s finding that the EU’s instruments against disinformation have remained largely ineffective. A day before the European elections, tech giants Facebook and Google were alleged to have sabotaged the design of EU regulation on fake news and disinformation.
According to new testimony that Investigate Europe collected from insiders, Google and Facebook pressured and “arm-wrestled” a group of experts to soften European guidelines on online disinformation and fake news. The EU’s expert group met last year in response to the spread of fake news and disinformation seen in the Brexit referendum and in the 2016 US election of President Donald Trump. The experts’ task was to help prevent the spread of disinformation, particularly around the current European parliamentary elections.
The expert group’s report was published in March last year, and in September the EU Code of Practice on Disinformation was announced, under which the platforms agreed to self-regulate according to common standards. The European Union pushed platforms such as Google, Facebook and Twitter to sign the voluntary Code of Practice. The tech companies committed to naming their advertising clients and to acting against fake accounts, i.e. false identities, on their platforms. They also agreed to investigate the spread of disinformation and fake news on their platforms.
In addition, representatives from Facebook, Google and Twitter agreed to submit monthly reports to the EU Commissioners. “It’s the first time in the world that companies have voluntarily agreed to self-regulatory measures to combat misinformation,” the commission proclaimed.
Members of the expert group confirmed to Investigate Europe that Facebook and Google representatives undermined the work and opposed proposals for greater transparency about their business models.
During the group’s third meeting in March 2018, “there was heavy arm-wrestling in the corridors from the platforms to conditionalise the other experts”, says one member of the group, speaking on condition of anonymity. “We were blackmailed,” says another member, Monique Goyens, director-general of the consumer organisation BEUC. Goyens adds: “We wanted to know whether the platforms were abusing their market power.” In response, Facebook’s chief lobbyist, Richard Allan, told her: “We are happy to make our contribution, but if you go in that direction, we will be controversial.”
He also threatened the group, saying that if members did not stop talking about competition tools, Facebook would withdraw its support for journalistic and academic projects.
Google influenced expert group members through funding
Goyens added that Google did not have to fight too hard, as it had influenced group members in other ways. She said that 10 organisations with representatives in the expert group received money from Google. One of them was the Reuters Institute for the Study of Journalism at the University of Oxford; by 2020, the institute will have received almost €10m from Google to pay for its annual Digital News Report. A number of other organisations represented on the group also received funding from the Google Digital News Initiative, including the Poynter Institute and First Draft News.
Ska Keller, a German MEP, said: “It’s been known for some time that Google, Facebook and other tech companies give money to academics and journalists. There is a problem because they can use the threat of stopping this funding if these academics or journalists criticise them in any reporting they do.”
The code of practice fell short of what was laid down
A year later, the code of conduct with the platforms remains no more than voluntary. The platforms agreed to take stronger action against fake accounts, to give preference to trustworthy sources and to be transparent with their users, but progress has been limited.
Criticism of the code of practice came from a ‘Sounding Board’ convened by the European Commission to track the proposals drawn up in response to the expert group’s report. The Sounding Board, which included representatives from media, civil society and academia, said that the code of practice “contains no common approach, no clear and meaningful commitments, no measurable objectives or KPIs, hence no possibility to monitor process, and no compliance or enforcement tool. It is by no means self-regulation, and therefore the platforms, despite their efforts, have not delivered a code of practice.”
“More systematic information is needed for the Commission to assess the efforts deployed by the online platforms to scrutinise the placement of ads and to better understand the effectiveness of the actions taken against bots and fake accounts,” four commissioners said in a statement issued in March.
Goyens concluded: “The code of conduct was total crap. It was just a fig leaf. The whole thing was a rotten exercise. It was just taking more time, extending more time.”
However, there have been reactions on Twitter suggesting the story might itself be disinformation. One Twitter user noted that what Facebook and Google opposed was sharing their content-ranking algorithms, and questioned how knowing these would have helped the EU fight disinformation:
Reading the story I wonder if it not itself is disinformation. What Facebook and Google opposed was sharing their algorithms for ranking content. It is not obvious to me how knowing about that would have helped the EU fight disinformation.
— Jesusaurus Rex (@Je5usaurus_Rex) May 21, 2019
While discussions on Hacker News place fake news in the context of the history of propaganda and censorship, one user commented: “The thing that makes me very concerned is the lack of context in these discussions of fake news of the history and continuing use of government propaganda.
I hope young people who are familiar with “fake news” but not necessarily as familiar with the history of propaganda and censorship will study that history.
The big issue is there is enthusiasm for censorship, and the problem with censorship is who gets to decide what is real information and what is fake. The interests with the most power will have more control over information as censorship increases.
Because the same power that is supposedly only used to suppress propaganda from some other country is used to suppress internal dissent or criticism.
This is actually very dangerous for multiple reasons. One big reason is that propaganda (internal to the country, i.e. by the same people who will be deciding what is fake news) is usually critical in terms of jump-starting and maintaining enthusiasm for wars.”