First Analysis of the Austrian Anti-Hate Speech Law (NetzDG/KoPl-G)
On September 3rd the Austrian government released a legislative package to tackle online hate speech. Besides a comprehensive justice reform, the package also contains a bill that creates new obligations for online platforms to remove illegal user-generated content. This article offers a first analysis of the so-called Kommunikationsplattformen-Gesetz (KoPl-G) and its many similarities to the German Netzwerkdurchsetzungsgesetz (NetzDG).
Aiming a shotgun at Google and killing half the internet in the process
The public debate about the Austrian NetzDG focused mostly on the question of which platforms should fall within the scope of the new law. In an open letter, together with Wikipedia, we asked the government to protect smaller online platforms with a global turnover threshold and to take into account platforms oriented towards the public good or with effective community moderation systems. Most of the debate revolved around the considerable deficiencies in the content moderation practices of global internet corporations like Google, Facebook, and TikTok, and the head of the Green parliamentary group, Sigrid Maurer, said in an interview that there are far fewer problems with online hate speech on small or local platforms. Yet the draft bill contains no safeguards for smaller platforms, and community moderation systems are incompatible with the new obligations.
The draft bill covers all online platforms whose main purpose is the exchange of messages, videos, pictures, or audio files, as long as they have at least 100,000 users in Austria OR a turnover of at least 500,000 euros. There are specific exceptions for non-profit online encyclopedias (Wikipedia), comment sections of news websites (derStandard, Krone), and e-commerce platforms that convey services or goods (Amazon, Geizhals, MyHammer).
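To illustrate how this scope test works, here is a minimal sketch of our reading of it in code. All function, parameter, and category names are our own illustration, not terms from the bill:

```python
# A minimal sketch of the KoPl-G scope test as we read the draft.
# Categories and threshold values mirror the description above.

EXEMPT_CATEGORIES = {
    "non_profit_encyclopedia",   # e.g. Wikipedia
    "news_comment_section",      # e.g. derStandard, Krone
    "ecommerce_marketplace",     # e.g. Amazon, Geizhals, MyHammer
}

def falls_under_koplg(main_purpose_is_content_exchange: bool,
                      users_in_austria: int,
                      annual_turnover_eur: float,
                      category: str = "other") -> bool:
    """Rough scope test for the draft KoPl-G as described above."""
    if not main_purpose_is_content_exchange:
        return False
    if category in EXEMPT_CATEGORIES:
        return False
    # Either threshold is sufficient -- note the OR, not an AND.
    return users_in_austria >= 100_000 or annual_turnover_eur >= 500_000
```

Under this reading, a platform that crosses either threshold is in scope even if it earns nothing in Austria, which is exactly the growth disincentive described in the next paragraph.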
We have seen this logic already in the EU copyright directive: first a very broad definition that includes all kinds of platforms, then very specific exceptions for every organisation that managed to send a lobbyist to the lawmakers in time. Unlike in Germany, the Austrian NetzDG is not restricted to for-profit social networks, but instead affects almost all types of online platforms. Due to the broad definition, the law covers, for example, the chat function of online games like World of Warcraft, recipe platforms, and open source development platforms. But the main problem lies in the resulting restrictions for future innovation. A start-up that is under the 500,000 euro threshold is incentivised not to grow: fulfilling the obligations of the NetzDG can easily cost several hundred thousand euros. Rising social media platforms from other EU countries have to be careful not to become too popular in Austria, because with more than 100,000 registered users in the past quarter they have to hire a team of legally trained people, regardless of whether they have made any revenue in Austria or not. These kinds of restrictions in the Digital Single Market are why the European Commission wants to solve this problem at the EU level with the Digital Services Act, but Austria wanted to be first.
If this law is strong and restrictive enough to rein in global corporations like Google and Facebook, it also threatens the existence of smaller, European competitors. The basic question of which platforms this law is meant to regulate remained unanswered throughout the negotiations. As a result, the current draft is unbalanced and ineffective on several points.
Strict time limits and privatisation of law enforcement
The Austrian NetzDG applies to a catalogue of 15 criminal offenses, including hate speech, coercion, stalking, and the degradation of religious teachings. Platforms must provide a reporting function for this illegal content and react immediately to notifications. If the content is obviously illegal even to a legal layperson, it must be blocked within 24 hours of the notification. If the illegality is less obvious, the platform has a maximum of 7 days to respond.
Content that has been deleted, as well as information about the poster, must be retained by the platform for 10 weeks for evidence and redress purposes. Law enforcement authorities can obtain this data or have the period extended by another 10 weeks if necessary. There is no obligation to report blocked illegal content to the authorities.
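Schematically, the reaction and retention periods described above boil down to two simple rules. The sketch below is only illustrative; whether content is "obviously illegal to a legal layperson" is of course a human legal judgement, and all names are ours:

```python
from datetime import datetime, timedelta

def blocking_deadline(notified_at: datetime, obviously_illegal: bool) -> datetime:
    """Latest time by which the platform must have blocked the reported content."""
    # 24 hours if the illegality is obvious even to a legal layperson,
    # otherwise up to 7 days.
    return notified_at + (timedelta(hours=24) if obviously_illegal else timedelta(days=7))

def retention_deadline(blocked_at: datetime, extended_by_authority: bool = False) -> datetime:
    """How long deleted content and information about the poster must be kept."""
    # 10 weeks by default, extendable by another 10 weeks at the request
    # of a law enforcement authority.
    return blocked_at + timedelta(weeks=20 if extended_by_authority else 10)
```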
The big improvement compared to the German NetzDG is that there are complaint and redress mechanisms to supervise the platforms' decisions and increase the quality of their content moderation processes. That was also one of our first demands. If the platform deletes something that the poster believes was legal, or if the person notifying content believes that illegal content has not been blocked, both parties can appeal the decision. The first step is a review procedure by the platform itself, which has to be concluded within two weeks. If this does not lead to a satisfactory outcome for either party, they can, in a second step, bring the case to the arbitration body of the telecom and media regulator RTR. This body is structured similarly to the internet ombudsman or the RTR arbitration procedure for cases in which internet speeds fall below the contractually guaranteed bandwidth. It remains to be seen whether this approach leads to fair decisions and better-quality content moderation. Our biggest criticism at this point is that the ultimate decision on whether or not a piece of content was illegal no longer rests with a court.
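As a rough sketch of that escalation path (our own modelling, not terminology from the bill), the redress mechanism looks like this:

```python
from enum import Enum, auto
from typing import Optional

class RedressStage(Enum):
    PLATFORM_REVIEW = auto()   # internal review by the platform, concluded within two weeks
    RTR_ARBITRATION = auto()   # arbitration body at the telecom and media regulator RTR
    # Notably, a court is not a mandatory final stage of this procedure.

def next_stage(current: RedressStage, outcome_satisfactory: bool) -> Optional[RedressStage]:
    """Return the next stage of the redress procedure, or None if it ends here."""
    if outcome_satisfactory:
        return None
    if current is RedressStage.PLATFORM_REVIEW:
        return RedressStage.RTR_ARBITRATION
    # After arbitration there is no further mandatory instance under the bill.
    return None
```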
Transparency: real improvements
The transparency obligations are very positive. The regulation foresees annual reports (quarterly reports for large platforms). These must explain exactly how the content moderation process works, how many cases of allegedly illegal content have been notified, how long they were checked, and after how many hours a decision was made. In addition to the statistics, the training of the content moderators and the technical systems used must also be described. Unfortunately, the bill neglects to require platforms to specify which automated systems are used for content moderation and to mandate impact assessments. Facebook in particular is massively employing AI in this area, which we consider to be very problematic. The mistake of the German NetzDG was that content reported as potentially unlawful would disappear from the transparency obligation if it was deleted on the basis of the terms and conditions. In Austria, such content reported as illegal would also have to appear in the statistics, even if it were then deleted on the basis of the terms and conditions. Finally, the competent regulatory authority KommAustria also has the possibility to issue guidelines that further specify the reports. Our hope here is that reports become comparable between platform operators. The participation of researchers and NGOs in the drafting of these guidelines would be highly advisable, but is not obligatory!
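To make the comparability point concrete: a common, machine-readable report schema defined in such guidelines might contain fields along these lines. This is purely our illustration of what the bill describes, not a format it prescribes:

```python
from dataclasses import dataclass, field

@dataclass
class TransparencyReport:
    """Illustrative schema for an annual (or quarterly) KoPl-G transparency report."""
    platform: str
    period: str                           # e.g. "2021-Q1" for large platforms
    moderation_process_description: str   # how content moderation works in detail
    notifications_received: int           # reports of allegedly illegal content
    blocked_within_24_hours: int
    blocked_within_7_days: int
    median_decision_time_hours: float
    moderator_training_description: str
    automated_systems_used: list = field(default_factory=list)  # not mandated by the bill, but should be
```

A standardised format like this, specified together with researchers and NGOs, is what would actually make reports comparable between platform operators.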
From the user perspective, there are also transparency obligations. Both the person who complains about content and the person whose content was complained about receive information about the procedure, the basis for the decision, and their possibilities to appeal. The reporting of content must be justified, and the person who posted the content can at least submit a counter-notification in the subsequent instances of the procedure.
Penalties between petty cash and bankruptcy
We consider the penalty provisions of the law to be very problematic. If there are more than five complaint procedures at the RTR arbitration board concerning moderation decisions of a platform within a month, a procedure is immediately and automatically initiated at the regulatory authority KommAustria. The authority then decides, on the basis of the frequency and type of the complaint procedures, whether the moderation process of the platform is defective. However, at this point in time, no court has ever dealt with the question of whether the specific contents were actually legal or illegal. Nevertheless, KommAustria has the option of issuing an administrative decision with specific requirements for the design of the platform's content moderation process or for how to handle specific types of content. In the case of a second administrative decision, or if the first one is not implemented properly, KommAustria can issue penalties of up to 10 million euros.
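The automatic trigger described above is essentially a counting rule. Here is a hedged sketch, assuming our reading that more than five arbitration procedures concerning one platform within a single month start the review; the function and variable names are hypothetical:

```python
from collections import Counter
from typing import List, Set, Tuple

def platforms_facing_review(complaints: List[Tuple[str, str]]) -> Set[str]:
    """
    complaints: one (platform, month) pair per arbitration procedure at the RTR
    concerning a moderation decision of that platform.
    Returns the platforms for which KommAustria automatically opens a procedure.
    """
    counts = Counter(complaints)  # number of procedures per (platform, month)
    # More than five procedures in a single month trigger the review --
    # regardless of whether any court has found the underlying content illegal.
    return {platform for (platform, month), n in counts.items() if n > 5}
```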
This gives KommAustria enormous power to shape our online discourse. We fear parallels with the Hungarian media regulatory authority NMHH and hope that this new competence will be handled with the utmost care and responsibility. 10 million euros is petty cash for Google and Facebook. But in view of this risk, many smaller platforms will probably have to forego user-generated content entirely or police their users very strictly. Penalty thresholds calculated as a percentage of revenue would have been more proportionate. Two thirds of the budget of the regulator and the arbitration body will be paid by the online platforms they regulate. This financing of regulators and arbitration bodies is not uncommon in Austrian telecom regulation and was one of the demands of our EU umbrella organisation EDRi.
Authorized recipient and skimmed revenue
All platforms that are subject to the law must also name an authorized recipient. This must be a natural or legal person located in Austria or certain EU countries, and the person needs certain discretionary powers within the platform. The aim is to create an interface to the local legal system so that, for example, law enforcement authorities can turn to a locally available person to obtain user data, or so that courts can deliver removal orders for illegal content. This person can also be fined personally, up to 10,000 euros, if they are not "available at all times" for KommAustria. If the platform itself cannot be prosecuted, this person can even be fined up to 50,000 euros. We are curious to see who will end up doing this job…
KommAustria can independently initiate an examination of whether a platform falls under this law. However, if the platform disputes this assessment and does not believe it is affected by the Austrian NetzDG, things can escalate in an ugly way: the platform's turnover can be skimmed off from its business partners based in Austria. For example, if TikTok objects and does not recognize the obligations of this law, KommAustria could turn to marketing agencies in Austria that advertise on TikTok and collect the turnover that was intended for TikTok. There has to be an accounting system so that any excess skimmed money can be paid back to the marketing agency, but this seems to be quite a drastic approach for a small country to take in order to ensure compliance by big internet companies. This provision has the potential to herald the end of the country-of-origin principle in the EU.
Conclusion
The Austrian NetzDG (KoPl-G) attempts to solve a problem which most people would agree needs to be solved. But the path to this solution is clearly a political compromise between two very different approaches. Some aspects of the bill seem very sophisticated, others barely thought out. An exception for Wikipedia overlooks similar projects like Wikimedia Commons or Wikidata. Once again a law tries to repair the problems with the big internet companies and is so careless that it endangers the existence of the small, decentralized parts of the internet that could offer meaningful alternatives to the dominant players.
This draft legislation is now under national consultation until 15 October 2020. International participation is highly recommended. The EU notification procedure will last up to three months. Not surprisingly, the European Commission is not keen on this national attempt to influence the Digital Services Act (DSA), expected in December 2020. Once the DSA proposal is released, the Commission would most likely block any national legislation in the same field. Given this tight timeline, the Austrian law might be stopped procedurally if changes need to be made and a subsequent notification is not waived.