Reconsidering Artificial Intelligence

forschung leben – the magazine of the University of Stuttgart (Issue March 2021)

Researchers at the University of Stuttgart’s new Interchange Forum for Reflecting on Intelligent Systems (IRIS) are thinking about how intelligent systems are affecting society.

Can an inanimate object be racist? Some time ago, a video of a soap dispenser systematically denying soap to dark-skinned people went viral on social media. The soap dispenser reacted this way because its standard infrared technology was developed by fair-skinned people and tested exclusively on their hands. This is not an isolated case. Another example occurred at the Südkreuz train station in Berlin, where a facial recognition software pilot project involving 300 test subjects was carried out in 2017. The study results showed that the system flagged too many people as suspects who were not actually being sought. Once again, people of color and women were particularly affected.
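The dispenser's failure can be pictured as a simple calibration problem. The following sketch is purely illustrative: the reflectance values, threshold rule, and function names are invented, and real infrared sensors are more complex. It shows only the principle the article describes, namely that a detection threshold tuned on one group of hands can silently exclude another.

```python
# Hypothetical sketch of the calibration problem described above.
# All numbers are invented for illustration; real sensors differ.

# Reflectance samples measured during development -- light-skinned hands only.
CALIBRATION_SAMPLES = [0.82, 0.78, 0.85, 0.80]

# "A hand is present" if reflectance clears 90% of the weakest calibration sample.
THRESHOLD = min(CALIBRATION_SAMPLES) * 0.9

def detects_hand(reflectance: float) -> bool:
    """Dispense soap only when measured infrared reflectance exceeds the threshold."""
    return reflectance > THRESHOLD

# In this toy model, darker skin reflects less infrared light,
# so it falls below a threshold that was never tested against it.
print(detects_hand(0.81))  # light-skinned hand -> True, soap dispensed
print(detects_hand(0.45))  # dark-skinned hand  -> False, no soap
```

Nothing in the rule is hostile; the bias enters entirely through the unrepresentative calibration data.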

Can technological discrimination be blamed on algorithms? According to Jun.-Prof. Maria Wirzberger, head of the Department of Teaching and Learning with Intelligent Systems (LLiS) and spokesperson for the new Interchange Forum for Reflecting on Intelligent Systems (IRIS) research group: “The answer is no because, put simply, algorithms work like drawers: they are based on standards set by human beings, and as a result they may well reflect human stereotypes. These stereotypes are often unconscious and are therefore incorporated unconsciously into technological developments.”

Countering unfairness

Researchers from all disciplines now wish to collaborate at the IRIS to investigate such developments. “Over the past few years, we’ve learned how the naive use of Artificial Intelligence for automated decision-making can lead to unfair discrimination,” says Prof. Steffen Staab of the University of Stuttgart’s Institute for Parallel and Distributed Systems and co-spokesperson of the IRIS, “which is why we are developing new methods to avoid, detect, and explain unfairness.” Through the IRIS network, researchers intend to critically reflect on the foundations, mechanisms, implications, and effects of intelligent systems in research and teaching as well as with regard to society as a whole. The IRIS is funded by the German Research Foundation (DFG) as part of the German federal and state governments’ Excellence Strategy, as well as by the University’s research fund.
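The article does not specify which methods the researchers use to detect unfairness, but one widely known diagnostic from the fairness literature is the demographic parity gap: the difference in positive-decision rates between two groups. The sketch below uses invented toy data and a hypothetical function name; it illustrates the general idea, not the IRIS researchers’ actual methods.

```python
# Hypothetical sketch of one common unfairness diagnostic (demographic parity).
# The group labels and decisions below are invented toy data.

def positive_rate(decisions):
    """Fraction of positive (e.g. approved) decisions, coded as 1s."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(decisions_a, decisions_b):
    """Absolute difference in positive rates between two groups; 0 means parity."""
    return abs(positive_rate(decisions_a) - positive_rate(decisions_b))

group_a = [1, 1, 0, 1, 1, 0, 1, 1]  # 6 of 8 approved (75%)
group_b = [0, 1, 0, 0, 1, 0, 0, 1]  # 3 of 8 approved (37.5%)

gap = demographic_parity_gap(group_a, group_b)
print(f"parity gap: {gap:.3f}")  # prints "parity gap: 0.375"
```

A large gap does not by itself prove discrimination, but it flags the system for exactly the kind of closer scrutiny and explanation the researchers call for.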

“Human decisions are often based on unconscious prejudices.”

Prof. Steffen Staab

The IRIS’s tasks and course offerings are not limited to research: they also create opportunities for exchange both within and outside the University, where current ethical and social challenges, ranging from data ethics to informational self-determination and reliable AI, can be discussed with partners from civil society and the business sector. The focus is also on teaching: the “Reflecting on Intelligent Systems in the Next Generation” (RISING) teaching forum, headed up by Maria Wirzberger, teaches students of all subjects how to critically reflect on intelligent systems by offering courses on topics such as “cultural bias” and “open science”. Teachers can further their own training by applying reflective teaching methods.

Raising awareness

So how does IRIS help to counter stereotyping in fields such as technology development? “What’s so good about IRIS,” Wirzberger explains, “is that it raises student awareness of this issue before they enter the workplace, which should help prevent such developments on this scale.” This extends even to language, because, as Wirzberger explains: “Language is also an intelligent system. Anyone who fails to use language in a sensitive manner may exclude entire demographic groups. Topics such as these should be firmly anchored in the way we think and act. We wish to create an awareness of just how colorful and diverse our society is.” The forum also aims to boost the University's networking activities at the international level. The benefit of bringing different people together, the researcher explains, is that good ideas emerge and there is a lively exchange of ideas.

Steffen Staab emphasizes that people also learn something about themselves in the process: “Human decisions are often based on unconscious prejudices. Digitizing our decisions makes the results transparent and verifiable. Increasingly, we are now seeing how we have unfairly discriminated against others in the past. In the future, AI will hopefully enable us to better scrutinize our own decisions and make fairer judgments.” Reflecting upon intelligent systems is already firmly anchored in several areas within the University of Stuttgart, such as the “Platform of Reflection” within the “Stuttgart Center for Simulation Science” (SimTech) Cluster of Excellence, in one of the thematic foci of the International Center for Cultural and Technological Studies (IZKT), and in the Center for Interdisciplinary Risk and Innovation Studies (ZIRIUS). The IRIS addresses all disciplines, from technology and engineering to the humanities, social sciences, and economics, and unites all the various competencies.

Text: Carina Lindig

Jun.-Prof. Dr. Maria Wirzberger, e-mail, phone: +49 711 685 81176

Prof. Dr. Steffen Staab, e-mail, phone: +49 711 685 88100

