Together with our industry partner Fujitsu, we, the TUM Institute for Ethics in Artificial Intelligence and the Chair of Automotive Technology, held our workshop “Accountability Requirements for AI Applications” in March 2022 to discuss current issues of responsibility and explainability in the context of AI, with a focus on risk recognition and management.

What we did:
The aim of this online event was to share experiences of today’s practical problems in AI accountability and safety. In expert groups, we discussed case studies from different sectors, including healthcare, finance, and autonomous driving, to identify practical risks linked to AI-based systems. The overall goal was to identify pressing challenges in distributing accountability obligations among the different actors and to discuss various approaches to addressing them.

About us:
The workshop was organized by two institutions from TUM, forming an interdisciplinary team with social and technical expertise. The Chair of Automotive Technology deals with cutting-edge technology for the future of vehicles and mobility systems. The Institute for Ethics in Artificial Intelligence addresses the question of how AI can meet societal needs with an ethical mindset. Together with our industry partner Fujitsu, we have set ourselves the goal of creating an accountability framework that defines the requirements AI systems must meet for approval on the European and global markets.

The outcome of this workshop is publicly available and can be found here.