Clara Nack

Freelance journalist, Berlin

Coronavirus prompts EU to curb spread of disinformation (DW, June 11, 2020)

The European Commission has called on tech giants such as Google, Twitter and Facebook to intensify measures to counter fake news about the coronavirus online.

"The coronavirus pandemic has been accompanied by a massive infodemic," EU foreign policy chief Josep Borrell said at a press conference in Brussels on Wednesday. "Disinformation in times of the coronavirus can kill," he added. "We have a duty to protect our citizens by making them aware of false information - and expose the actors responsible for engaging in such practices."

To prevent disinformation and raise users' awareness of propaganda, the European Commission will require social media platforms to provide monthly reports on their actions and on data related to advertising. Platforms will also be called upon to cooperate more closely with independent fact-checkers in all EU member states and in all of the bloc's languages.

"We only know as much as platforms tell us," said Vera Jourova, the European Commission's vice president for values and transparency. "This is not good enough," she added. "Platforms have to open up."

Jourova said Google had removed 80 million coronavirus-related advertisements that clearly contained false information. But, she said, there are no exact figures on how much harm the spread of disinformation has so far caused to people's health.

'End to trolling'

The European Union introduced its voluntary Code of Practice on Disinformation in October 2018. Jourova, who was commissioner for justice, consumers and gender equality at the time, had expressed her misgivings about introducing binding regulations.

Google, Facebook, Twitter and Mozilla signed on before the end of the year. Microsoft joined the accord in May 2019, and TikTok is now a party, too. The European Commission is still negotiating with WhatsApp. From the monthly reports provided since January 2019, it appears that tech companies delete 70% of the content flagged to them.

The bloc also included this caveat: "By the end of 2019, the Commission will carry out a comprehensive assessment at the end of the Code's initial 12-month period. Should the results prove unsatisfactory, the Commission may propose further actions, including of a regulatory nature."

Thorsten Frei, the deputy chair of the Bundestag bloc of Angela Merkel's Christian Democrats and Bavaria's Christian Social Union, said the measures at the EU level were not effective enough. "Stricter regulations already exist in Germany," he said in a public statement. "By reforming the Network Enforcement Act (NetzDG) and introducing a law against hate speech, we have already introduced much stronger instruments to put an end to trolling."

Violating free expression?

The European Commission is now looking at further actions, as the voluntary code does not appear sufficient to prevent the spread of misinformation during the pandemic. The Commission is asking citizens of EU member states, companies and online platforms to take part in an online survey and submit proposals for change by September.

According to the German platform netzpolitik.org, the European Commission would prefer to contextualize fake information rather than remove it completely. Jourova has praised the step recently taken by the social media platform Twitter when it fact-checked a pair of tweets by US President Donald Trump and hid another for "glorifying violence." Facebook founder Mark Zuckerberg's refusal to do the same has triggered a huge controversy.

When Germany's NetzDG law was introduced in 2017, the UN's special rapporteur on the promotion and protection of the right to freedom of opinion and expression found multiple issues with it. "The question that arises relates to the way in which the bill seeks to achieve legitimate objectives, in particular the responsibilities it places upon private companies to regulate the exercise of freedom of expression, and whether the measures proposed by the bill would be lawful under international human rights law," David Kaye wrote at the time. "The obligations placed upon private companies to regulate and take down content raises concern with respect to freedom of expression."

Others pointed out that potential fines of up to €50 million ($57 million) could deter speech. Nadine Strossen, a professor emerita at New York Law School and a former president of the American Civil Liberties Union, said platforms would delete more content rather than less in view of the penalties they could incur.
