We are in the process of curating a list of this year’s publications — including links to social media, lab websites, and supplemental material. Currently, we have 68 full papers, 23 late-breaking works (LBWs), three journal papers, one alt.chi paper, two SIGs, two case studies, one Interactivity contribution, one Student Game Competition entry, and three workshops led by members of our community. One paper received a Best Paper Award, and 13 papers received an Honorable Mention.
Disclaimer: This list is not yet complete, and some DOIs may not resolve yet.
Is your publication from 2025 missing? Please enter the details in this Google Form and send us an email at contact@germanhci.de to let us know that you added a publication.
Motion-Coupled Asymmetric Vibration for Pseudo Force Rendering in Virtual Reality
Nihar Sabnis (Max Planck Institute for Informatics), Maelle Roche (Max Planck Institute for Informatics), Dennis Wittchen (Max Planck Institute for Informatics), Donald Degraen (University of Duisburg-Essen), Paul Strohmeier (Max Planck Institute for Informatics).
Honorable Mention
Abstract | Tags: Full Paper, Honorable Mention, Output Modalities & Haptics | Links:
@inproceedings{Sabnis2025MotioncoupledAsymmetric,
title = {Motion-Coupled Asymmetric Vibration for Pseudo Force Rendering in Virtual Reality},
author = {Nihar Sabnis (Max Planck Institute for Informatics), Maelle Roche (Max Planck Institute for Informatics), Dennis Wittchen (Max Planck Institute for Informatics), Donald Degraen (University of Duisburg-Essen), Paul Strohmeier (Max Planck Institute for Informatics)},
url = {https://sensint.mpi-inf.mpg.de/index.html, website
https://de.linkedin.com/in/nihar-sabnis, author's linkedin},
doi = {10.1145/3706598.3713358},
year = {2025},
date = {2025-04-26},
urldate = {2025-04-26},
abstract = {In Virtual Reality (VR), rendering realistic forces is crucial for immersion, but traditional vibrotactile feedback fails to convey force sensations effectively. Studies of asymmetric vibrations that elicit pseudo forces show promise but are inherently tied to unwanted vibrations, reducing realism. Leveraging sensory attenuation to reduce the perceived intensity of self-generated vibrations during user movement, we present a novel algorithm that couples asymmetric vibrations with user motion, which mimics self-generated sensations. Our psychophysics study with 12 participants shows that motion-coupled asymmetric vibration attenuates the experience of vibration (equivalent to a ~30% reduction in vibration-amplitude) while preserving the experience of force, compared to continuous asymmetric vibrations (state-of-the-art). We demonstrate the effectiveness of our approach in VR through three scenarios: shooting arrows, lifting weights, and simulating haptic magnets. Results revealed that participants preferred forces elicited by motion-coupled asymmetric vibration for tasks like shooting arrows and lifting weights. This research highlights the potential of motion-coupled asymmetric vibrations, offers new insights into sensory attenuation, and advances force rendering in VR.},
keywords = {Full Paper, Honorable Mention, Output Modalities & Haptics},
pubstate = {published},
tppubtype = {inproceedings}
}
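The abstract above describes coupling the gain of an asymmetric vibration to the user's own motion so that the vibration reads as self-generated and is perceptually attenuated. As a minimal sketch of that idea — all function names, the waveform shape, and the speed-to-gain mapping are illustrative assumptions, not the paper's actual algorithm:

```python
import math

def asymmetric_waveform(t, freq=60.0, sharpness=4.0):
    """One cycle of an asymmetric pulse: sharp rise, slow decay.
    The asymmetry is what biases the perceived force in one
    direction (illustrative shape only)."""
    phase = (t * freq) % 1.0
    return math.exp(-sharpness * phase) - (1.0 - phase) * math.exp(-sharpness)

def motion_coupled_amplitude(hand_speed, gain=1.0, max_speed=2.0):
    """Scale vibration amplitude with hand speed (m/s), clamped at
    max_speed, so the vibration tracks the user's own movement
    (hypothetical coupling rule)."""
    return gain * min(abs(hand_speed), max_speed) / max_speed

def drive_signal(t, hand_speed):
    """Actuator drive sample at time t: zero when the hand is still,
    full asymmetric vibration at or above max_speed."""
    return motion_coupled_amplitude(hand_speed) * asymmetric_waveform(t)
```

Note how a stationary hand yields zero drive — consistent with the paper's framing that the vibration should mimic a self-generated sensation rather than play continuously.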
Spatial Haptics: A Sensory Substitution Method for Distal Object Detection Using Tactile Cues
Iddo Yehoshua Wald (University of Bremen), Donald Degraen (University of Canterbury, University of Duisburg-Essen), Amber Maimon (The University of Haifa, Ben Gurion University), Jonas Keppel (University of Duisburg-Essen), Stefan Schneegass (University of Duisburg-Essen), Rainer Malaka (University of Bremen)
Abstract | Tags: Full Paper, Output Modalities & Haptics | Links:
@inproceedings{Wald2025SpatialHaptics,
title = {Spatial Haptics: A Sensory Substitution Method for Distal Object Detection Using Tactile Cues},
author = {Iddo Yehoshua Wald (University of Bremen), Donald Degraen (University of Canterbury, University of Duisburg-Essen), Amber Maimon (The University of Haifa, Ben Gurion University), Jonas Keppel (University of Duisburg-Essen), Stefan Schneegass (University of Duisburg-Essen), Rainer Malaka (University of Bremen)},
url = {https://www.uni-bremen.de/dmlab, website
https://hci.informatik.uni-due.de/, website
https://youtu.be/1hMRs79zlgQ, teaser video
https://linkedin.com/company/dml-bremen, research group linkedin
https://de.linkedin.com/company/hci-group-essen, research group linkedin
https://linkedin.com/in/iddo-wald, author's linkedin
https://www.facebook.com/HCIEssen, facebook
https://x.com/hci_due, social media, X},
doi = {10.1145/3706598.3714083},
year = {2025},
date = {2025-04-26},
urldate = {2025-04-26},
abstract = {We present a sensory substitution-based method for representing locations of remote objects in 3D space via haptics. By imitating auditory localization processes, we enable vibrotactile localization abilities similar to those of some spiders, elephants, and other species. We evaluated this concept in virtual reality by modulating the vibration amplitude of two controllers depending on relative locations to a target. We developed two implementations applying this method using either ear or hand locations. A proof-of-concept study assessed localization performance and user experience, achieving under 30° differentiation between horizontal targets with no prior training. This unique approach enables localization by using only two actuators, requires low computational power, and could potentially assist users in gaining spatial awareness in challenging environments. We compare the implementations and discuss the use of hands as ears in motion, a novel technique not previously explored in the sensory substitution literature.},
keywords = {Full Paper, Output Modalities & Haptics},
pubstate = {published},
tppubtype = {inproceedings}
}
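The key mechanism in this abstract — modulating the vibration amplitude of two actuators by the target's relative direction, in imitation of auditory localization — can be sketched as a simple interaural-level-difference-style panning law. This is an illustrative model under assumed conventions (yaw/bearing in radians, equal-energy linear pan), not the paper's exact mapping:

```python
import math

def interaural_levels(listener_yaw, target_bearing):
    """Map the horizontal angle between the facing direction and a
    target to left/right vibration amplitudes in [0, 1], mimicking an
    interaural level difference (hypothetical mapping)."""
    # signed angle in (-pi, pi]; negative = target to the left
    angle = math.atan2(math.sin(target_bearing - listener_yaw),
                       math.cos(target_bearing - listener_yaw))
    pan = math.sin(angle)  # -1 = hard left, 0 = centered, +1 = hard right
    left = (1.0 - pan) / 2.0
    right = (1.0 + pan) / 2.0
    return left, right
```

With the target dead ahead both actuators vibrate equally; as it moves to one side, the contralateral amplitude fades — which is how only two actuators can convey a direction, as the abstract notes.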
TactStyle: Generating Tactile Textures with Generative AI for Digital Fabrication
Faraz Faruqi (MIT CSAIL), Maxine Perroni-Scharf (MIT CSAIL), Jaskaran Singh Walia (Vellore Institute of Technology), Yunyi Zhu (MIT CSAIL), Shuyue Feng (Zhejiang University), Donald Degraen (HIT Lab NZ, University of Canterbury & University of Duisburg-Essen), Stefanie Mueller (MIT CSAIL)
Abstract | Tags: Full Paper, Output Modalities & Haptics | Links:
@inproceedings{Faruqi2025Tactstyle,
title = {TactStyle: Generating Tactile Textures with Generative AI for Digital Fabrication},
author = {Faraz Faruqi (MIT CSAIL), Maxine Perroni-Scharf (MIT CSAIL), Jaskaran Singh Walia (Vellore Institute of Technology), Yunyi Zhu (MIT CSAIL), Shuyue Feng (Zhejiang University), Donald Degraen (HIT Lab NZ, University of Canterbury & University of Duisburg-Essen), Stefanie Mueller (MIT CSAIL)},
url = {https://hcigroup.de, website
https://youtu.be/Ax57j-voj7k, teaser video
https://linkedin.com/company/hci-group-essen, lab's linkedin
https://www.linkedin.com/in/donald-degraen-54ba0010/, author's linkedin
https://x.com/hci_due, x
https://www.facebook.com/HCIEssen, facebook},
doi = {10.1145/3706598.3713740},
year = {2025},
date = {2025-04-26},
urldate = {2025-04-26},
abstract = {Recent work in Generative AI enables the stylization of 3D models based on image prompts. However, these methods do not incorporate tactile information, leading to designs that lack the expected tactile properties. We present TactStyle, a system that allows creators to stylize 3D models with images while incorporating the expected tactile properties. TactStyle accomplishes this using a modified image-generation model fine-tuned to generate heightfields for given surface textures. By optimizing 3D model surfaces to embody a generated texture, TactStyle creates models that match the desired style and replicate the tactile experience. We utilize a large-scale dataset of textures to train our texture generation model. In a psychophysical experiment, we evaluate the tactile qualities of a set of 3D-printed original textures and TactStyle's generated textures. Our results show that TactStyle successfully generates a wide range of tactile features from a single image input, enabling a novel approach to haptic design.},
keywords = {Full Paper, Output Modalities & Haptics},
pubstate = {published},
tppubtype = {inproceedings}
}
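The abstract describes optimizing 3D model surfaces to embody a generated heightfield texture. The final embedding step can be pictured as displacing each vertex along its normal by the sampled height — a minimal sketch with hypothetical names and nearest-neighbour sampling, not TactStyle's actual optimization:

```python
import numpy as np

def displace_vertices(vertices, normals, uvs, heightfield, scale=0.001):
    """Offset each mesh vertex along its unit normal by the heightfield
    value sampled at its UV coordinate (nearest-neighbour). A toy sketch
    of embedding a generated tactile texture into a surface; the real
    system optimizes the surface rather than displacing it directly."""
    h, w = heightfield.shape
    # map UVs in [0, 1] to integer pixel indices, clamped to the grid
    us = np.clip((uvs[:, 0] * (w - 1)).astype(int), 0, w - 1)
    vs = np.clip((uvs[:, 1] * (h - 1)).astype(int), 0, h - 1)
    heights = heightfield[vs, us]
    return vertices + normals * heights[:, None] * scale
```

A zero heightfield leaves the mesh untouched; scale converts heightfield units into model units (here assumed to be millimetre-scale relief for 3D printing).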