Responsible Safe and Secure Robotic Systems Engineering (SEEROSE)
About
The Faculty of Technical Sciences and the Faculty of Humanities at the University of Klagenfurt (AAU) proposed an interdisciplinary Karl Popper Kolleg for the period 2021–2024, organized by four professors (Security, Software Engineering, Psychology, and Ethics), which was approved by the AAU Rectorate in May 2020. Its title is “Responsible Safe and Secure Robotic Systems Engineering (SEEROSE)”.
Robot ethics demands that programmers write code that is not only functionally correct but also safe and secure, so as to prevent any intentional or accidental harm to humans. Programmers therefore bear responsibility toward several parties (e.g., system customers, providers, and end-users), and this responsibility requires awareness (as do questions of liability, which remain a complex matter of contemporary research and legislation). SEEROSE aims to achieve usable robotic security by jointly addressing the process-ethical, psychological, and technical aspects of developing safe and secure robotic systems.
The following figure provides an overview of the four key areas and the main research questions addressed by SEEROSE.
SEEROSE features a Ph.D. project in each key area:
- DevSafe aims to provide techniques and tools that support developers in responsibly developing and evolving safe and secure robotic systems.
- INBASE-GET aims to provide mechanisms that incentivize developers and robot collaborators to adopt and follow security precautions out of their own interest.
- SCoRE aims to provide an instrument for the psychological assessment of the core qualifications relevant to robotics engineers.
- CERSE aims to provide a guideline for implementing a process-ethical procedure for the distributed assumption of responsibility in safe and secure robotic systems engineering.
The goals of SEEROSE are well aligned with the demands of the recent Vienna Manifesto on Digital Humanism. This initiative states that “Practitioners everywhere ought to acknowledge their shared responsibility for the impact of information technologies”, that “A vision is needed for new educational curricula, combining knowledge from the humanities, the social sciences, and engineering studies”, and that “Students should learn to combine information-technology skills with awareness of the ethical and societal issues at stake.” SEEROSE addresses these demands directly.
Members of the SEEROSE Kolleg
Core Group

Ph.D. Students

Project: Collective Ethical Responsibility for Robotic Systems Engineering with Security & Safety (CERSE)
Robotic systems increasingly take part in many everyday practices. Technological development and innovation are transforming fields ranging from industrial robotics and medical technology to the exploration of space. New possibilities, however, come with new forms of responsibility.
Engineering itself is a process that is (per)formed by many: individuals, teams, systems, norms, cultures, and legal frameworks, to name only a few. This Ph.D. project engages with the ethical challenges that arise within safe and secure robotic systems engineering. It investigates subjectively felt responsibility and explores the perception, governance, and distribution of the networking processes of many hybrid actors in multiple heterogeneous fields. A mixed-methods approach within the research designs of Grounded Theory and Actor-Network Theory makes it possible to identify and follow how responsibilities are organized and shared. The research aims to discuss which new ethical questions emerge and which competencies and strategies are required for safe and secure robotics engineering.
Project: Security Conscious Robotics Engineering (SCoRE)
Software development is a highly complex task that demands multiple competencies from engineers in order to deliver functional, safe, and secure systems. Customers’ sophisticated requirements combined with software engineers’ notoriously limited resources (e.g., time pressure) may result in systems falling short in safety and security. The subproject SCoRE addresses this issue by pursuing a twofold target: on the organisational level, we identify factors fostering or hindering safe and secure development, and on the individual level, we identify robotic systems engineers’ abilities, personality traits, and attitudes that facilitate the development of safe and secure robotic systems. Building on this, we develop standardized assessments of both organisational and individual factors. This instrument will allow structural deficits to be detected and engineers’ needs to be evaluated, thus forming the basis for providing adequate support and fostering the personal improvement of robotic systems engineers.
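The sketch below illustrates one way responses to such an instrument might be aggregated. All scale names, items, and the cut-off are hypothetical placeholders, not part of SCoRE.

```python
# Minimal sketch (hypothetical scales and an illustrative cut-off, not SCoRE itself):
# aggregate 5-point Likert responses into organisational and individual subscale
# means and flag scores that might indicate a need for support.
from statistics import mean

# Hypothetical responses (1 = strongly disagree, 5 = strongly agree),
# grouped by the kind of factor the items are meant to measure.
responses = {
    "organisational: time for security reviews": [2, 1, 3, 2],
    "organisational: management support":        [4, 4, 5, 4],
    "individual: security attitude":             [5, 4, 4, 5],
    "individual: perceived self-efficacy":       [2, 3, 2, 2],
}

CUTOFF = 3.0  # illustrative threshold, not an empirically validated one

for scale, items in responses.items():
    score = mean(items)
    flag = "  <-- possible deficit / support need" if score < CUTOFF else ""
    print(f"{scale:45s} mean = {score:.2f}{flag}")
```

A validated instrument would of course rely on psychometric analysis rather than a fixed threshold; the code only shows the shape of the resulting scores.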
Project: Developing and Evolving Safe and Secure Robotic Systems (DevSafe)
Robotic systems are among the most complex systems that humans have built. They consist of multiple distributed hardware and software components that depend on each other, and such components often represent complex systems or subsystems themselves. Maintaining and evolving robotic systems is challenging, and each modification carries the risk of introducing vulnerabilities into the implementation or configuration of the robotic system that allow others to attack it. In this Ph.D. project, we will design techniques and tools to extract detailed information about code changes in robotic systems, with a focus on changes that introduce security vulnerabilities. Based on this information, we will investigate algorithms and techniques to analyze and determine the impact of code changes on the safety and security of robotic systems. These will be integrated into recommender systems that guide engineers in detecting and fixing vulnerabilities and help them develop safe and secure robotic systems in a responsible way.
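As a purely illustrative sketch of change-level analysis, the following toy script flags added lines in a unified diff that match a few patterns often associated with security-relevant changes. The patterns, file names, and diff are hypothetical and far simpler than the program analyses DevSafe envisions.

```python
# Minimal sketch (not a SEEROSE tool): scan the added lines of a unified diff
# for a few illustrative patterns that often signal security-relevant changes.
import re

# Hypothetical patterns; a real analysis would use richer program analysis.
SUSPICIOUS_PATTERNS = {
    "hard-coded credential": re.compile(r"(password|secret|api_key)\s*=\s*[\"']", re.IGNORECASE),
    "disabled TLS verification": re.compile(r"verify\s*=\s*False"),
    "unsafe deserialization": re.compile(r"pickle\.loads?\("),
}

def flag_risky_additions(diff_text):
    """Return (pattern name, added line) pairs found among added diff lines."""
    findings = []
    for line in diff_text.splitlines():
        if line.startswith("+") and not line.startswith("+++"):
            added = line[1:]
            for name, pattern in SUSPICIOUS_PATTERNS.items():
                if pattern.search(added):
                    findings.append((name, added.strip()))
    return findings

if __name__ == "__main__":
    example_diff = """\
--- a/telemetry_node.py
+++ b/telemetry_node.py
@@ -10,3 +10,4 @@
 def connect(url):
-    return requests.get(url)
+    return requests.get(url, verify=False)
+API_KEY = "0000-demo"
"""
    for name, line in flag_risky_additions(example_diff):
        print(f"[{name}] {line}")
```

A recommender system of the kind described above would attach such findings to the change under review, together with an explanation and a suggested fix.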
Project: Incentive-Based Security Engineering using Game Theory (INBASE-GET)
Safety requires security in robotic systems. Nonetheless, the latter is strongly neglected in robotic software engineering. Security is a costly, yet non-observable and non-functional part of software: it does not generate revenue; it “only” prevents harm. Lacking economic incentives, security is neglected along the robotic systems supply chain. This causes a social dilemma in which the last link of the supply chain is left to secure most parts of the system, despite having limited knowledge of its components.
The key research questions are: How can the integration of (re)liable security be incentivized along the whole robotic supply chain to obtain a (socially) efficient outcome? And, on a microscopic level: which mechanisms for developer teams establish individual responsibility and (re)liability at a given point within the supply chain? The methodology for tackling these problems is the development of suitable (game-theoretic) incentive mechanisms.
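To make the dilemma concrete, here is a minimal sketch of a symmetric two-supplier security-investment game. The cost and harm values are purely illustrative assumptions, not SEEROSE results.

```python
# Minimal sketch (illustrative numbers only): two suppliers each choose whether
# to invest in security. Each bears an expected harm that shrinks with the
# number of investing suppliers, plus the cost of their own investment.
from itertools import product

c = 3.0                              # assumed cost of investing in security
harm = {0: 8.0, 1: 6.0, 2: 4.0}      # assumed expected harm per supplier, by number of investors

def payoff(own_invest, other_invest):
    investors = int(own_invest) + int(other_invest)
    return -(c if own_invest else 0.0) - harm[investors]

# Enumerate all strategy profiles and check for pure-strategy Nash equilibria.
for a, b in product([True, False], repeat=2):
    ua, ub = payoff(a, b), payoff(b, a)
    # A profile is a Nash equilibrium if neither supplier gains by deviating alone.
    a_best = ua >= payoff(not a, b)
    b_best = ub >= payoff(not b, a)
    label = "Nash equilibrium" if a_best and b_best else ""
    print(f"invest={a!s:5} / {b!s:5}  payoffs=({ua:5.1f}, {ub:5.1f})  "
          f"welfare={ua + ub:6.1f}  {label}")
```

Under these assumed numbers, the only equilibrium is that neither supplier invests, even though mutual investment yields the highest total welfare; an incentive mechanism would aim to shift the equilibrium toward that socially efficient outcome.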
Scientific Advisory Board
- Bram Adams, Polytechnique Montreal, Canada
- Tansu Alpcan, University of Melbourne, Australia
- Tamer Başar, University of Illinois, U.S.A.
- Ulrich Berger, Vienna University of Economics and Business, Austria
- Massimiliano Di Penta, University of Sannio, Italy
- Bernhard Dieber, Joanneum Research ROBOTICS, Austria
- Lisa-Marie Faller, Carinthia University of Applied Sciences (FH Kärnten), Austria
- Harald Gall, University of Zurich, Switzerland
- Endika Gil-Uriarte, Alias Robotics, Spain
- Armin Grunwald, Karlsruhe Institute of Technology, ITAS, Germany
- Jessica Heesen, University of Tübingen, Germany
- Dietmar Jannach, University of Klagenfurt, Austria
- Mathias Karmasin, University of Klagenfurt, Austria
- Muhammad Taimoor Khan, University of Greenwich, UK
- Mateusz Maciaś, Industrial Research Institute for Automation and Measurements PIAP, Poland
- Manos Panaousis, University of Greenwich, UK
- Matthias Rath, Ludwigsburg Univ. of Education, Germany
- Peter Schartner, University of Klagenfurt, Austria
- Paul Schweinzer, University of Klagenfurt, Austria
- Further national and international collaboration partners t.b.a.
Expert Talks
24.2.2021: Dr. Bernhard Dieber, Joanneum Research, Institute for Robotics and Mechatronics, Research Group Robot Systems Technologies
22.4.2021: Prof. Dr. Lisa-Marie Faller, Professor of Robotics, Carinthia University of Applied Sciences (FH Kärnten), Engineering & IT