by Eliska Pirkova*
In 2019, the European Commission announced the Digital Services Act “landmark package” as part of the European Digital Strategy.
The framework — which has been referred to as the “second GDPR” — can establish European rules for online platform responsibilities such as tackling disinformation and online hate speech, along with public oversight and effective enforcement.
It presents a unique chance to create systemic and harmonized regulation of gatekeeper platforms across the EU. Why then is the Commission undermining its own efforts?
It’s been clear for years that online platforms can be weaponized to damage human rights and hurt our democracies.
Governments and social media platforms have proposed fragmented and mostly self-serving options to address some of the problems with online content, while digital rights activists have been calling for deeper reform that puts people’s rights at the centre and addresses the roles states and companies should play in the online ecosystem.
Companies such as Facebook and Google have become global platform-based superstructures, resulting in an enormous power imbalance between the large platforms and the people who use them.
In recent years, a steady stream of scandals, such as Cambridge Analytica’s election interference, and increased media coverage have made it clear to the public that companies are exercising this power and making profits without taking sufficient responsibility to safeguard people’s rights.
Current regulatory efforts at the EU level will not deliver the broader harmonisation goal the EU has committed to. While the European Commission is drafting the Digital Services Act package, other sectoral legislation that seeks to respond to particular categories of illegal content online is still being proposed at the EU level.
Specifically, two proposals, the Online Terrorist Content Regulation and the Interim Regulation on child abuse material, raise serious fundamental rights concerns that cannot be reconciled with the harmonisation process. These elephants in the room run contrary to what the European Commission presents as its ultimate goal in platform governance policy.
The Online Terrorist Content Regulation, proposed in 2018, is at the stage of trilogue negotiations. The current text does not meet fundamental rights legal standards for protecting people’s right to freedom of expression and opinion, and their privacy.
The recently proposed Interim Regulation on child abuse material allows private online platforms to continue “voluntary practices” to detect child sexual abuse content. Together, these proposals would set a dangerous precedent for the Digital Services Act, and they rest on a questionable legal basis.
The European Commission based both proposals on Article 114 of the Treaty on the Functioning of the European Union, which concerns the single market, rather than on the provisions governing police and judicial cooperation, even though their goal and focus is tackling illegal content online. In practice, this means the proposals would put private companies in charge of a matter that public authorities should handle.
Placing a broad obligation on private actors to apply proactive measures to assess and potentially remove content, while at the same time making them fully responsible for any resulting interference with people’s fundamental rights, raises issues of compatibility with the positive obligations of member states under the EU Charter.
The fight against serious crimes, including child exploitation or terrorist content, cannot be delegated to private actors.
According to the European Commission itself, the Digital Services Act is intended to be a comprehensive package of measures for the provision of digital services in the European Union, and to address the challenges online platforms pose.
The package seeks to meet this goal by establishing clear responsibilities for online platforms, protecting people by preventing illegal activities online and addressing the risks to our fundamental rights.
But these two pieces of pending EU legislation not only contradict but also jeopardise the European Commission’s effort to establish clear rules for platforms and reinforce safeguards for the rule of law, such as the principles of legality and legal certainty.
Importantly, the European Commission has reacted to recent legislative proposals by member states that could negatively affect the future regulation from both a harmonisation and a fundamental rights perspective.
As the Digital Services Act package is being finalised, proposing laws at national level that establish often overbroad and disproportionate obligations for online platforms represents an attempt to undermine the EU’s ability to reach harmonised standards.
Under the so-called fidelity principle established by Article 4(3) of the Treaty on European Union, member states shall facilitate the achievement of the Union’s tasks and refrain from any measure which could jeopardise the attainment of the Union’s objectives.
While some of these developments are encouraging, it is time for the EU to stop preaching water while drinking wine.
The European Union has the chance now to address data-harvesting business models, illegal content, and illegal activities online using an approach that establishes clear responsibilities for private actors and holds them to account, while ensuring our fundamental rights are protected.
It must not ignore the elephants in the room that threaten to weaken and dismantle the reform that promises to complement the EU’s data protection efforts, and create an online ecosystem that puts people and human rights at its centre.
*Europe policy analyst at digital rights group Access Now
**First published on www.euractiv.com