To foster the debate about one of the most complicated digital rights issues of our time, epicenter.works releases today its first draft for a proposal on platform regulation. What regulation is needed for the digital world we live in and how can we strengthen the values that we need to safeguard in today's digital information society? The proposal, which is accessible on platformregulation.eu, is structured as a Request For Comments (RFC) and aims not to be the concrete and final answer to all these complicated questions, but to further the debate with a bold and fundamental rights based proposal.

The power of big internet corporations like Google, Facebook or Amazon is the predominant topic in today's digital rights debate. The business decisions of these corporations shape our daily information diet, affect a significant fraction of today's commerce and influence how democratic elections are won or lost. The regulatory appetite in Europe has grown significantly in the past decade, and Europe is about to embark on a reform of the current regulatory regime, dubbed the “Digital Services Act”.

The current regulatory framework in the eCommerce directive gives platforms and other intermediaries, such as ISPs, safe harbour protections from liability for the content of their users. This status quo seems unsatisfactory to a range of stakeholders for quite different reasons. Some old-media companies want online platforms to be covered by the same rules as media publishers, in effect obliging them to pre-screen all user-generated content before it goes online. Such obligations would inevitably lead to automatic content filters (upload filters), which we have fought against since 2016 in the EU Copyright Directive. Any reform of the current regulatory framework should not lead to a situation where we hand online platforms even more power to police freedom of speech and privatise law enforcement roles that should be the prerogative of the state.

The problem is not so much that there is currently too much or too little content being deleted on the platforms; it is rather that the quality of content moderation is poor. Outsourcing these decisions to workers in low-wage countries who lack the language skills and cultural sensitivity needed to understand the people whose speech they moderate is a recipe for disaster.

This area of Content Moderation is just one of three chapters in the proposal we release today. We also tackle Algorithmic Accountability and Disinformation as well as Interoperability and Competition. The proposal stands on the shoulders of great thinkers whose ideas we have tried to synthesise into a coherent policy. The recommendations follow the requirement levels used in RFCs to highlight the differing degrees of certainty with which we put each idea forward. We would like to invite all independent experts and stakeholders to work with us on this issue, bring their own ideas forward and express support for the project.

This proposal was developed thanks to a grant from the Austrian Chamber of Labour, which allowed our staff to dedicate time to this issue ahead of the upcoming policy debate in Brussels.

Since you're here

… we have a small favour to ask. For articles like this, we analyse legal texts, assess official documents and read T&Cs (really!). We make sure that as many people as possible concern themselves with complicated legal and technical content and understand the enormous effects it has on their lives. We do this with the firm conviction that together we are stronger than all lobbyists, powerful decision makers and corporations. For all of this we need your support. Help us be a strong voice for civil society!

Become a supporter now!
