We are responsible for the courses offered in the Interactive Systems specialization branch of the Applied Informatics Bachelor’s program and in the Human-Computer Interaction specialization branch of the Informatics Master’s program. We also teach and supervise PhD students within the Doctoral Program Informatics.
Please see the lists below for more information about our current teaching activities and for details about the Bachelor’s projects that we currently offer. For more details about our specialization Human-Computer Interaction and for information about open Master’s thesis projects please visit our Master’s specialization page.
If you are interested in starting a PhD thesis on a topic related to human-computer interaction and interactive systems, contact Martin Hitz, Gerhard Leitner, or David Ahlström and tell us a bit about your background and research interests so that we can discuss possible collaboration!
Arduino is a popular open-source hardware and software platform that can be used to ‘digitize’ physical objects. In this project we will use an Arduino microcontroller and stretch sensors to explore ways to create elastic objects that serve as computer input devices. In the first project phase you will become familiar with the Arduino Software (IDE) and basic hardware (sensors and controller boards). The second project phase focuses on reading data from a stretch sensor and preparing the data for further use on an Android smartphone or smartwatch. This project can also be extended into a Master’s thesis project.
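To give a feel for the second project phase, here is a minimal Python sketch of how raw stretch-sensor readings could be prepared for further use: smoothing the noisy analog samples and normalizing them to a stretch ratio. The calibration values (resting and fully stretched readings) and function names are illustrative assumptions, not part of the project description.

```python
def smooth_readings(raw, alpha=0.3):
    """Exponential moving average over raw ADC samples (0-1023).

    alpha controls responsiveness: higher values track the sensor
    more closely but keep more noise.
    """
    smoothed = []
    ema = raw[0]
    for r in raw:
        ema = alpha * r + (1 - alpha) * ema
        smoothed.append(ema)
    return smoothed


def normalize(value, rest=200, max_stretch=800):
    """Map a raw reading to a 0.0-1.0 stretch ratio.

    'rest' and 'max_stretch' are assumed calibration values that
    would be measured for the actual sensor and object.
    """
    span = max_stretch - rest
    return min(max(value - rest, 0) / span, 1.0)
```

On the actual hardware, the raw values would come from the Arduino's analog input and be forwarded (e.g., over Bluetooth) to the smartphone or smartwatch, where a mapping like this turns them into input values.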
The desired outcome of this project is a software package that provides the functionality needed to conduct user studies on readability: configuring different test cases (combinations of text attributes such as letter size, color, and typeface), presenting the test cases to study participants, and logging the time participants need to read the text in each test case.
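The test-case configuration part could look roughly like the following Python sketch, which builds the full factorial combination of text attributes and records reading times per case. Class and attribute names are assumptions for illustration only.

```python
from dataclasses import dataclass
from itertools import product


@dataclass(frozen=True)
class TestCase:
    """One combination of text attributes shown to a participant."""
    letter_size: int  # in points
    color: str
    typeface: str


def build_test_cases(sizes, colors, typefaces):
    """All combinations of the configured attribute values."""
    return [TestCase(s, c, t) for s, c, t in product(sizes, colors, typefaces)]


class ReadingLog:
    """Collects the reading time measured for each presented test case."""

    def __init__(self):
        self.records = []

    def record(self, case, start_time, end_time):
        self.records.append((case, end_time - start_time))
```

A study session would iterate over the generated test cases (ideally in randomized order), display each one, and call `record` with timestamps taken when the text appears and when the participant finishes reading.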
Today’s smartwatches are equipped with high-quality sensors, such as a gyroscope and an accelerometer. These sensors could open up new possibilities to extend the user’s input vocabulary beyond the smartwatch’s small touchscreen. This project aims to interpret the sensor signals generated when the user gestures in the air or swipes a finger (on the hand wearing the watch) across an uneven surface, and to map the signal patterns to input commands.
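As a very simple illustration of mapping signal patterns to commands, the Python sketch below counts acceleration peaks in a stream of (x, y, z) samples and maps the count to a command. The threshold and the command mapping are illustrative assumptions; a real solution would use more robust pattern recognition.

```python
import math


def magnitude(sample):
    """Overall acceleration magnitude of one (x, y, z) sample."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)


def detect_peaks(samples, threshold=2.0):
    """Count upward crossings of the threshold (one per distinct peak)."""
    peaks = 0
    above = False
    for s in samples:
        m = magnitude(s)
        if m > threshold and not above:
            peaks += 1
            above = True
        elif m <= threshold:
            above = False
    return peaks


def map_to_command(peak_count):
    """Hypothetical mapping from a signal pattern to an input command."""
    return {1: "select", 2: "back"}.get(peak_count, "none")
```

Running on the watch, the accelerometer would deliver such samples continuously; more expressive gestures (air gestures, swipes over uneven surfaces) would call for richer features than a single peak count.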