
Demagogues, digitalization and the threat to democracy

Matthias von Hein
February 10, 2021

The internet has become more important than ever in the pandemic. But fake news, online threats and insults are leading to an "erosion of democracy across the board," and solutions are not easy to implement.

Many fear hate speech is eroding democratic discussion (Image: imago-images/photothek/T. Trutschel)

Germany is heading toward an important election, and a major problem. Not the coronavirus, though the pandemic has certainly exacerbated things: many more people are spending much more time online, where fake news, hate and incitement are rampant.

On Tuesday, Germany's Justice Minister Christine Lambrecht described a toxic debate culture on the internet that was endangering democracy. There is no democracy without a free exchange of opinions, the Social Democrat added at the opening of a virtual conference on "Digital Platforms and Society."

The minister emphasized that social networks have a special responsibility. Almost a third of all users of platforms like Facebook and Twitter have already come into contact with fake news, incitement, hate postings or threats, Lambrecht said, quoting a YouGov survey conducted on behalf of her ministry.

German Justice Minister Christine Lambrecht is confident in the new law (Image: Henning Schacht/Pool/Getty Images)

The Justice Ministry organized the conference with the industry association Bitkom to mark "Safer Internet Day," a day of action initiated by the EU Commission. Justice State Secretary Christian Kastrop painted a bleak picture before arguing that "digital arsonists" must be consistently held accountable. This requires fair, up-to-date and assertive European platform regulation, he argued, such as the Digital Services Act presented by the EU Commission, though this too needed to be "tightened up" in some areas.

Reshaping digitized society

What became clear in the various panels was that the rapid development of online platforms is forcing societies to find new compromises between the poles of control and openness, of operator responsibility and user responsibility, between deletion of harmful content after the fact and regulation that curbs abuse in advance.

Platform operators are now facing fierce criticism. The negative side effects of social media can no longer be ignored, nor can the enormous power the platforms wield. This is probably one of the reasons why digital corporations appear to be coming round. Google parent company Alphabet, for example, was represented twice at the conference: once by Google employee Eveline Metzen, who spoke of the "responsible expansion" of the platforms and the "positive power of openness," and once by Sabine Frank from YouTube.


YouTube's influence can hardly be overestimated. According to its own data, some two billion users watch over a billion hours of videos every day, in 80 languages and more than 100 countries. It is the world's second most frequently accessed website after Google's search engine, a global hub that is used particularly intensively by young people.

That last fact makes it all the more worrying that there have been accusations that YouTube algorithms steer users toward more extreme content in order to keep them on the site longer. The truth of the information presented in such videos appears to play little role.

These are accusations made against virtually all social networks. After all, internet corporations are among the most valuable companies in the world precisely because they convert users' time and attention into advertising revenue.

But asked by DW about these accusations, Sabine Frank said it was a widespread misconception that big platforms push people toward extremist content. Because the platform is financed by advertising, and advertisers do not want to appear alongside controversial content, YouTube actually invests a lot in making such content less visible, she claimed.


Recommendations for climate skeptics

YouTube has recently taken steps to recommend harmful videos less often, Joachim Allgaier told DW. A professor of communication and digital society, Allgaier researches YouTube and other social networks. In 2019, he noticed that the video platform frequently recommended videos denying human-made climate change to users searching on the topic.

As recently as January 2020, a report by activist group Avaaz said YouTube's algorithm was driving users to climate deniers – in violation of its own guidelines. The creators of the videos were even profiting from the associated advertising. Ironically, the algorithms placed Greenpeace advertisements before videos by climate change deniers.

Now that that problem is being dealt with to some extent, Allgaier is concerned about another issue: that people already trapped in disinformation bubbles are migrating to new and wholly unregulated platforms, some of them encrypted, like the messenger service Telegram. "We have no idea what's happening," said the researcher. "And the tone there is perhaps a bit sharper and more aggressive than on the more broadly used platforms."

A survey by Bitkom, also presented on Tuesday, found that one in six users of social media in Germany has already been a victim of hate speech. At the digital conference, Anna-Lena von Hodenberg, managing director of HateAid, an organization that helps those affected by hate on the internet, said that such attacks often target people who are socially active.

Public attacks online intimidate others and silence them, von Hodenberg said, before diagnosing an "erosion of democracy across the board."

Meanwhile, Justice Minister Lambrecht is convinced: "A better digital world is possible." She is hoping that concrete improvements will emerge from tightening the German law against hate speech and incitement.

In future, she hopes, social networks will be obliged not only to delete particularly serious posts, such as those containing neo-Nazi propaganda, incitement to hatred, or threats of murder and rape, but also to report them immediately to the Federal Criminal Police Office. But it is questionable whether this will come into force in the 2021 election year.

This article was translated from German.