Bianca Herlo, a design researcher focusing on participatory design at the Weizenbaum Institute in Berlin, is calling for more diversity in programming teams. Developers often represent “very homogeneous groups”, the scientist explained on Wednesday at the Informatik 2021 conference. This matters particularly in the design of algorithmic systems, because they are often built on historical data that can reproduce, and even amplify, existing discrimination.
The tech community must be set up in such a way that it can more easily support social transformation processes. At the same time, researchers should demonstrate the implications of digital technology more clearly instead of spreading too much euphoria. Others, she said, cultivate “dystopian ideas” about dangers, flashpoints and power asymmetries that exclude entire sections of society.
Malicious design on cookie banners
In the case of some applications and web features such as cookie banners, the expert even diagnosed “malicious design” during the panel on socially sustainable IT at the annual meeting of the German Informatics Society (Gesellschaft für Informatik, GI). Standards for user- and data-protection-friendly programming, on the other hand, could help. Beyond that, a better interplay of information, digital competence and regulation is crucial in order to create the infrastructure for digital maturity.
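What “user- and data-protection-friendly programming” could mean in practice can be sketched with a privacy-by-default consent model. The class, category names and API below are my own illustration and are not taken from any standard mentioned in the panel.

```python
# Hypothetical sketch of a privacy-by-default consent model: every
# non-essential category starts disabled, and rejecting everything takes
# exactly one call -- the opposite of "malicious design".

class ConsentState:
    CATEGORIES = ("analytics", "marketing", "personalization")

    def __init__(self):
        # Privacy by default: nothing is enabled until the user opts in.
        self.choices = {category: False for category in self.CATEGORIES}

    def opt_in(self, category: str) -> None:
        if category not in self.choices:
            raise ValueError(f"unknown category: {category}")
        self.choices[category] = True

    def reject_all(self) -> None:
        # "Reject all" must be as easy as "accept all".
        self.choices = {category: False for category in self.CATEGORIES}

    def allowed(self, category: str) -> bool:
        # Unknown categories are never allowed.
        return self.choices.get(category, False)


consent = ConsentState()
print(consent.allowed("marketing"))   # False by default
consent.opt_in("analytics")
print(consent.allowed("analytics"))   # True only after explicit opt-in
```

The design choice worth noting is the default: no pre-ticked boxes, and symmetric effort for accepting and declining.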
Cookie banners are often not only “malicious” and designed to override user interests; they also make it difficult to measure the user experience, explained Wolfram Wingerath from the cloud service provider Baqend, which tries to minimize website loading times with new caching algorithms. Even “two seconds of annoyed clicking” distort such measurements. In general, a great deal of research is still needed to build a common understanding of “what fast internet actually means”. To this end, Wingerath is building the speedhub.org platform together with the University of Hamburg and Google.
Sensitive health apps
For Marie Kochsiek of the Heart of Code association, dating apps are a vivid example of strong “implicit influence”. Participants have to enter data about their own body and sexual orientation, and other users are then displayed on that basis. The result, she said, is that “I won’t be able to get to know a lot of people at all”, because the filter draws primarily on the experiences of previous users.
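The narrowing effect Kochsiek describes can be made concrete with a toy model: if the app only surfaces profiles that previous users already reacted to positively, profiles that start out unseen can stay invisible forever. The data, names and threshold below are invented for illustration.

```python
# Invented toy model of the feedback loop described above: profiles are
# shown only if enough *previous* users liked them, so profiles without
# an initial audience never get a chance to be seen at all.

def visible_profiles(profiles, past_likes, min_likes=1):
    """Return only profiles that earlier users already liked often enough."""
    return [p for p in profiles if past_likes.get(p, 0) >= min_likes]


profiles = ["ada", "bashir", "chen", "dominika"]
past_likes = {"ada": 12, "chen": 3}   # "bashir" and "dominika" were never shown

shown = visible_profiles(profiles, past_likes)
print(shown)  # ['ada', 'chen'] -- the filter hides the rest permanently
```

Real matching systems are far more elaborate, but the feedback loop is the same: the filter encodes past users’ behavior, not the current user’s actual preferences.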
Kochsiek is also part of the development team of the Drip cycle app, which lets users save period data locally and in encrypted form. Working on it made clear to her how relevant ethical questions are for programmers: the project deals with intimate matters such as the frequency of sex, state of health and use of contraceptives. Passing such data on to advertisers would hardly be in the interest of most users.
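Drip’s actual implementation is not described in the article. Purely to illustrate the principle of “encrypt before storing locally”, the sketch below builds a stream cipher from an HMAC keystream using only the standard library; this is not Drip’s code, and a real app should use a vetted cryptography library (e.g. libsodium) rather than a hand-rolled construction like this.

```python
import hashlib
import hmac
import secrets

# Illustrative only: shows the *principle* of encrypting health data
# before it touches local storage. NOT Drip's implementation; real apps
# should rely on a vetted crypto library instead of this construction.

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom byte stream from key, nonce and a counter."""
    out = b""
    counter = 0
    while len(out) < length:
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)           # fresh nonce per record
    stream = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    stream = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream))


key = secrets.token_bytes(32)                 # in practice: derived from a passphrase
record = b'{"symptoms": ["headache"], "contraceptive": "none"}'
stored = encrypt(key, record)                 # only this blob is written to disk
assert decrypt(key, stored) == record
```

The point of the sketch is the data flow, not the cipher: plaintext health records exist only in memory, and whatever lands in local storage is unreadable without the key.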
Automated employee analysis has a bad reputation
“We involve those affected in the design right from the start,” assured Matthias Peissner, head of the human-technology interaction research department at the Fraunhofer Institute for Industrial Engineering and Organization (IAO). Many employees currently have the feeling that digitization and performance evaluation processes are promoting profit maximization. However, assistance systems could help ensure that fewer mistakes occur and that people with lower qualifications can take on more demanding jobs.
Peissner criticized the fact that software for personnel analysis generally has a poor reputation in this country. In the digital workplace, the prevailing approach is therefore often to collect no personal data at all. Yet such programs could also serve the interests of employees as well as the optimization of production processes. For that, information about employees, such as their interests, their need for assistive tools and their current state of health, would have to be collected and evaluated “responsibly”.
So far there are only “rough ideas” in this area. The scientist is aware that such data may only be collected on personal devices. The topic is a “really hot iron”, he said, and requires “substantial research” into what such models could look like. In general, more transparency is needed around “deals with companies” whose business model is based on the analysis of personal data: everyone must be aware of what benefit they are receiving and what they are revealing in return.
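One “rough idea” consistent with collecting data only on personal devices is local aggregation: raw measurements never leave the device, and only a coarse, non-identifying summary is shared with the assistance system. The function, data and threshold below are my own illustration, not Peissner’s concrete proposal.

```python
# Hypothetical sketch (not Peissner's proposal): raw readings stay on the
# employee's own device; only a coarse summary statistic is ever shared
# with the employer's assistance system.

def local_summary(stress_readings, high_threshold=70):
    """Reduce raw on-device readings to a single shareable statistic."""
    high = sum(1 for r in stress_readings if r >= high_threshold)
    return {"share_high_stress": round(high / len(stress_readings), 2)}


# Raw data -- in this model, these values are never transmitted.
readings = [55, 82, 60, 91, 40, 75]

shared = local_summary(readings)
print(shared)  # {'share_high_stress': 0.5} -- no individual reading leaves
```

Even this simple scheme makes the transparency point from the text concrete: the employee can see exactly which single number leaves the device and weigh that disclosure against the promised benefit.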