5 people defining platform liability

This article is part of POLITICO's Changemakers series, looking at the players driving European policy.

The time has come for Google, Facebook, Amazon and TikTok to face new rules in Europe.

Under the current legal framework, platforms are not legally responsible for hosting illegal content, but are required to remove such material once it is flagged. That may change.

The European Commission has pledged to present by the end of the year a new framework that will “increase and harmonize the responsibilities of online platforms,” setting EU-wide requirements for how companies police illegal content online in the Digital Services Act package.

Lawmakers argue that a change is overdue. Lobbyists are on the warpath. Here are five people and organizations whose voices will matter in defining the new rules.

Thierry Breton — European commissioner for the internal market

European Commissioner in Charge of Internal Market Thierry Breton | Olivier Hoslet/EFE via EPA

Tasked by Commission President Ursula von der Leyen with leading the work on the Digital Services Act, the Frenchman manages the Commission's digital department, which will hold the pen.

Breton told MEPs in his November confirmation hearing that the Commission would not touch the platforms' limited liability regime. He later said that all options were still on the table. (Commission Executive Vice President for Digital Margrethe Vestager similarly backtracked and said the liability question was still open.)

In February, Breton dismissed Facebook CEO Mark Zuckerberg's proposal to create a third status for the social media giant that would fall between telecoms provider and publisher. "They have a responsibility," Breton told reporters after meeting Zuckerberg in Brussels. "It's up to them to see the impact of their responsibility before we tell them so."

The commissioner also said he is against general monitoring obligations — or the requirement to check all content before it is posted — but argues that “stricter control” is needed when it comes to hate speech, terrorist content or fake news online.

Breton won't work on platform liability alone. Executive Vice President Margrethe Vestager, Vice President for Values and Transparency Věra Jourová and Justice Commissioner Didier Reynders will also chip in.

Christine Lambrecht — Germany's justice minister

German Justice Minister Christine Lambrecht | Tobias Schwarz/AFP via Getty Images

Christine who? That was the most common reaction in Berlin when Germany's Social Democrats (SPD) announced last summer that the lawyer and longtime member of parliament would take over as justice minister from her charismatic predecessor Katarina Barley, who was leaving to become a member of the European Parliament.

Today, there are few in the European tech industry who don't know her name.

Since taking office in June, Lambrecht has quietly introduced two pieces of legislation that would toughen Germany's Network Enforcement Act, better known as NetzDG, which governs speech online. Both are expected to pass parliament this spring.

The first law would force big social media companies to proactively report potentially criminal content on their platforms to law enforcement. The second aims to make it easier for users to report illegal content and challenge content decisions by the internet platforms. It also requires companies to disclose more information than was previously required in their biannual transparency reports, including details about which groups of people are particularly affected by hate speech or how companies are using artificial intelligence to detect harmful content.

Lambrecht has hinted that Berlin's rule book could serve as a role model for other EU countries. "In many European countries, populists and extremists are rioting against democracy, dissenters and minorities," she said. "The platforms are the same, and the racist and anti-Semitic messages are similar."

Cédric O — France's junior digital minister

French Digital Affairs Minister Cedric O | Yonhap/EFE via EPA

Like Germany, France is at the forefront of online content regulation and hopes to influence future EU rules.

The country adopted legislation against fake news in 2018, and its parliament is working on legislation that would require internet and social media platforms to remove flagged hate speech within 24 hours, or face fines. Though the European Commission slammed the draft law and asked France to postpone the project, Paris is moving forward — and O, the country's junior digital minister, is planning to take the fight to Brussels.

In February, the French government announced the formation of a working group on the Digital Services Act. It will focus on "structural platforms," as well as on platforms' "responsibility" when it comes to online hate speech and consumer protection on marketplaces, said Finance Minister Bruno Le Maire and O.

France wants the EU to leave some margin of maneuver for national governments. "On matters of national responsibility such as the regulation of hate speech, a certain margin of appreciation must be left to states," O recently told POLITICO. "French, Swedish and European culture don't share the same idea of how we should balance freedom of expression and [citizens'] protection."

Tiemo Wölken — MEP from the Socialists and Democrats group, Germany
