Almost 50 civil society organizations and academics have asked the EU Commission to include binding transparency requirements for online platforms in its draft Digital Services Act (DSA). The planned law for digital services should "introduce a comprehensive framework for data access" that paves the way for "real accountability" and a civil society "digital fourth estate" alongside the media.
"Improved transparency towards users is required and can provide urgently needed insights into the personalized results" that Facebook, Twitter & Co. present to individual members, according to the submission to the Commission published on Friday. The release of pseudonymized or anonymized data for research should also give insight "into the collective influence of platforms".
Look into the “black box”
In order to assess and control how platforms apply their community standards or tackle collective social risks such as disinformation, polarization and bias, evidence-based findings from independent bodies are necessary, according to the alliance. For journalists, scientists and civil society actors who want to understand opaque algorithmic "black boxes", however, it is difficult to obtain the necessary information from the corporations.
"Independent researchers are facing enormous challenges in accessing reliable data from platforms," complains the alliance, which, in addition to the initiators AlgorithmWatch and the European Policy Centre, includes organizations such as Access Now, European Digital Rights (EDRi), HateAid and the Stiftung Neue Verantwortung. In recent years, the operators have restricted access to their public programming interfaces (APIs) even further. It is almost impossible to "hold them accountable for illegal or unethical behavior".
At this point, self-regulation is "incomplete, ineffective, unmethodical and unreliable," the signatories tell legislators. The concentration of data in the hands of a few companies has a strong impact on the general well-being of "the digital public sphere". In the interests of a strong democracy, it must be possible to independently audit automated decision-making systems and the algorithms they use.
Dominant platforms and gatekeepers
According to the declaration, disclosure obligations should differentiate between market-dominant actors and smaller intermediaries, which could be defined on the basis of factors such as annual revenue, market share and user base. It is advisable to limit the scope to "dominant platforms" and the "gatekeepers" already envisaged by the Commission.
The transparency requirements should be based on the technical functionality of the service and not on "ambiguous and politically charged" terms such as "political advertising" or "hate speech", the signatories emphasize. The technical features could include aggregated user numbers at an abstract level, advertising and microtargeting, search functions, feeds, ranking, recommendations and content moderation factors. Rules on removals and other measures such as fact-checking should also be covered.
The alliance writes that the key to such a rule is an EU institution with a clear legal mandate to enable access to data and enforce transparency requirements across the EU. Clear provisions are also important in order to keep data collection in line with existing laws protecting the privacy of those concerned.
Hatred, terror, murder
The EU justice ministers also announced at their informal virtual meeting on Friday that they wanted to participate in the discussions on the DSA as soon as possible. “Hate crime, terror propaganda and appeals for murder must be pursued more decisively and at an early stage,” emphasized Federal Justice Minister Christine Lambrecht on behalf of the German Council Presidency.
"We need clear obligations for online platforms. YouTube, Facebook & Co. have a responsibility not to allow themselves to be misused as platforms for hate speech any longer," demanded the SPD politician. The breeding ground for acts like the beheading of the French teacher Samuel Paty is "waves of hatred on the Internet". Meanwhile, Google reportedly wants to use targeted lobbying to prevent the platform law from becoming too strict and undermining services popular with users.