Human-Computer Interaction and Interactive Technologies

Leader of the group: Prof. Dr. Jürgen Steimle




Vision and Research Strategy

The research group investigates fundamental research questions at the intersection of Human-Computer Interaction and Ubiquitous Computing. Central aspects of Mark Weiser’s vision of Ubiquitous Computing have become reality. However, the rigid and rectangular form of today’s user interfaces is limiting in several ways: it restricts not only the embedding of user interfaces in the physical environment, but also mobile use, interaction, and customization. Our research focuses on future forms of interfaces that are deformable, elastic, and support multi-modal interaction. We strive to advance the state of the art by (1) developing new sensor and display surfaces, (2) developing and empirically assessing novel interaction techniques, and (3) contributing new methods for the easy, fast, and inexpensive fabrication of such interfaces.



Research Areas and Achievements

Digital Fabrication of Customized Interactive Surfaces

Conventional touch sensors and displays are mass-produced and quite restricted in their shape; typically they are rectangular, planar, and rigid. This limits the objects and locations where sensors and displays can be deployed. In our recent research, we have developed new approaches for the digital fabrication of touch sensors and active displays of custom shape. These involve printed thin-film electronics and electronic components that are embedded inside 3D printed objects.

With Foldio (ACM UIST 2015), we have contributed an automated design and fabrication pipeline for printed thin-film sensors and displays. To address the complexity of manually designing electronic components, our concepts enable designers and laypeople to create functional designs in a high-level graphical editor. Using a set of new design patterns and a parameterization approach for modeling circuits, the pipeline automatically generates a printable layout.
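To illustrate the idea of parameterized layout generation, here is a minimal sketch (not Foldio's actual pipeline; all names and dimensions are hypothetical) that derives a printable layout for a simple interdigitated capacitive touch pad from a few high-level parameters:

```python
# Hypothetical sketch of parameterized layout generation: high-level
# parameters (pad size, finger count, gap) are turned into concrete
# electrode rectangles that a printer could render in conductive ink.

def interdigitated_pad(width, height, fingers, gap):
    """Return rectangles (electrode, x, y, w, h) for two comb electrodes."""
    # 2*fingers interleaved fingers, separated by (2*fingers - 1) gaps
    finger_w = (width - (2 * fingers - 1) * gap) / (2 * fingers)
    rects = []
    x = 0.0
    for i in range(2 * fingers):
        electrode = "A" if i % 2 == 0 else "B"  # alternate the two combs
        rects.append((electrode, x, 0.0, finger_w, height))
        x += finger_w + gap
    return rects

# A 20 x 10 mm pad with 4 finger pairs and 0.5 mm gaps
layout = interdigitated_pad(width=20.0, height=10.0, fingers=4, gap=0.5)
print(len(layout))  # 8 alternating finger electrodes
```

Changing a single parameter (e.g. `fingers`) regenerates the whole geometry, which is the essence of modeling circuits parametrically rather than drawing them by hand.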

Moving beyond thin-film surfaces, we have investigated 3D-printed electronics. Capricate (UIST 2015) contributes a fabrication method for 3D printed capacitive touch sensors. Embedded within a 3D printed object, these sensors can have a custom shape and are printed in a single pass along with the object. In more recent work (CHI 2017), we contribute principles and printable geometries for sensing the deformation of 3D printed objects using embedded conductive structures. With HotFlex (CHI 2016), we have presented an approach for computer-controlled, localized customization of 3D printed objects. Thanks to embedded conductive structures with customized geometries, a 3D object can be customized after it has been printed, even during use.
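The sensing principle underlying such capacitive electrodes can be sketched as follows (a simplified model, not Capricate's implementation; the component values are assumed for illustration): a touching finger adds capacitance to the printed electrode, which lengthens the RC charging time a microcontroller measures.

```python
# Simplified model of charge-time capacitive sensing. A finger near the
# electrode adds capacitance, so the RC circuit takes longer to charge
# to the threshold voltage; the controller detects this longer time.

import math

def charge_time(r_ohms, c_farads, v_fraction=0.63):
    """Time for an RC circuit to charge to a fraction of supply voltage."""
    return -r_ohms * c_farads * math.log(1.0 - v_fraction)

C_ELECTRODE = 10e-12   # ~10 pF printed electrode (assumed value)
C_FINGER = 5e-12       # extra ~5 pF contributed by a touching finger (assumed)
R_SENSE = 1e6          # 1 MOhm sense resistor (assumed)

baseline = charge_time(R_SENSE, C_ELECTRODE)
touched = charge_time(R_SENSE, C_ELECTRODE + C_FINGER)
print(touched > baseline)  # touch shows up as a longer charge time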


On-skin Interaction

Today’s body-worn devices, such as smartwatches or head-mounted displays, offer only a very small input surface, which makes interaction challenging. We are studying how human skin can be used as a complementary input and output surface for subtle, direct, and versatile control of wearable devices.

In the reporting period, we have extended our pioneering work on iSkin sensors that are worn on the skin. A new fabrication pipeline allowed us to significantly reduce sensor thickness, to 5–50 microns, now allowing for close conformality even to fine wrinkles. These technical advances enabled us to investigate SkinMarks (CHI 2017): conformal skin-worn electronics that capture body-based input and offer visual output at challenging body locations. This includes interaction on skeletal landmarks, such as curved knuckles, on skin microstructure, such as fine wrinkles, and on highly elastic skin.
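One common way such skin-worn input can be read out, sketched here under assumed component values (this is an illustrative example, not the SkinMarks implementation), is a stretchable resistive trace in a voltage divider: bending a knuckle stretches the trace, raising its resistance and thus the voltage the wearable's ADC samples.

```python
# Illustrative model of a stretchable resistive bend sensor on a knuckle.
# Stretching the conductive trace raises its resistance, which raises the
# voltage-divider output sampled by the wearable device.

def divider_voltage(r_sensor, r_fixed=10_000.0, vcc=3.3):
    """Output voltage across the sensor in a simple voltage divider."""
    return vcc * r_sensor / (r_sensor + r_fixed)

flat = divider_voltage(r_sensor=10_000.0)   # knuckle straight (assumed 10 kOhm)
bent = divider_voltage(r_sensor=15_000.0)   # trace stretched over bent knuckle
print(bent > flat)  # bending registers as a higher reading
```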