platformregulation.eu: An attempt at a fundamental rights-based proposal

To foster the debate about one of the most complicated digital rights issues of our time, epicenter.works today releases its first draft of a proposal on platform regulation. What regulation does the digital world we live in need, and how can we strengthen the values we want to safeguard in today's digital information society? The proposal, accessible at platformregulation.eu, is structured as a Request For Comments (RFC) and does not aim to be the definitive answer to all of these complicated questions, but to further the debate with a bold, fundamental rights-based proposal.

The power of big internet corporations like Google, Facebook or Amazon is the predominant topic in today's digital rights debate. The business decisions of these corporations shape our daily information diet, affect a significant share of today's commerce and influence how democratic elections are won or lost. The regulatory appetite in Europe has grown significantly over the past decade, and Europe is about to embark on a reform of the current regulatory regime, dubbed the “Digital Services Act”.

The current regulatory framework in the eCommerce Directive gives platforms and other intermediaries, such as ISPs, safe harbour protections from liability for the content of their users. This status quo seems unsatisfactory to a range of stakeholders, for quite different reasons. Some old-media companies want online platforms to be covered by the same rules as media publishers, in effect obliging them to pre-screen all user-generated content before it goes online. Such obligations would inadvertently lead to automatic content filters (upload filters), which we have fought against since 2016 in the EU Copyright Directive. Any reform of the current regulatory framework should not lead to a situation where we hand online platforms even more power to police freedom of speech and privatise law enforcement roles that should be the prerogative of the state.

The problem is not so much that too much or too little content is currently being deleted on the platforms; rather, the quality of content moderation is poor. Outsourcing these decisions to workers in low-wage countries who lack the language skills and cultural sensitivity to understand the people whose speech they moderate is a recipe for disaster.

Content Moderation is just one of three chapters in the proposal we release today; the other two tackle Algorithmic Accountability and Disinformation, and Interoperability and Competition. The proposal stands on the shoulders of great thinkers whose ideas we have tried to synthesise into a coherent policy. Our recommendations follow the categories used in RFCs to highlight the degree of certainty with which we put each idea forward. We invite all independent experts and stakeholders to work with us on this issue, to bring their own ideas forward and to express support for the project.

This proposal was developed thanks to a grant from the Austrian Chamber of Labour, which allowed our staff to dedicate time to this issue ahead of the upcoming policy debate in Brussels.
