I am a PhD candidate and scientific employee at the University of Stuttgart's Analytic Computing (AC) department, working under the supervision of Prof. Dr. Steffen Staab. My research focuses on human-computer interaction, in particular multimodal interaction that combines eye tracking with touch or non-lexical voice input.
Currently, I am working on the EXIST-funded project Semanux, which aims to make the digital world more inclusive by enabling people with disabilities to control computers using their individual abilities.
I have published papers on novel methods of eye typing at ACM CHI and ACM ETRA. In addition, I tutor courses in Human-Computer Interaction, Information Retrieval, and Machine Learning at the University of Stuttgart and supervise student theses. Prior to pursuing my PhD, I gained seven years of industry experience at companies such as Bliksund in Norway and Union Betriebs-GmbH in Bonn, where I contributed to various web-oriented IT projects, including a rules repository system for the CDU and the personal homepage of Angela Merkel.
- Hedeshy, R., Menges, R., & Staab, S. (2023). CNVVE: Dataset and Benchmark for Classifying Non-verbal Voice Expressions. Interspeech 2023, August 20--24, 2023, Dublin, Ireland.
- Hedeshy, R., Kumar, C., Lauer, M., & Staab, S. (2022). All Birds Must Fly: The Experience of Multimodal Hands-free Gaming with Gaze and Nonverbal Voice Synchronization. International Conference on Multimodal Interaction (ICMI ’22), November 7--11, 2022, Bengaluru, India. https://doi.org/10.1145/3536221.3556593
- Hedeshy, R., Kumar, C., Menges, R., & Staab, S. (2021). Hummer: Text Entry by Gaze and Hum. CHI Conference on Human Factors in Computing Systems (CHI ’21), May 8--13, 2021, Yokohama, Japan. https://doi.org/10.1145/3411764.3445501
- Hedeshy, R., Kumar, C., Menges, R., & Staab, S. (2020). GIUPlayer: A Gaze Immersive YouTube Player Enabling Eye Control and Attention Analysis. 2020 Symposium on Eye Tracking Research and Applications (ETRA ’20 Adjunct), June 2--5, 2020, Stuttgart, Germany, 1:1--1:3. https://doi.org/10.1145/3379157.3391984
- Kumar, C., Hedeshy, R., MacKenzie, I. S., & Staab, S. (2020). TAGSwipe: Touch Assisted Gaze Swipe for Text Entry. CHI Conference on Human Factors in Computing Systems (CHI ’20), April 25--30, 2020, Honolulu, HI, USA, 1--12. https://doi.org/10.1145/3313831.3376317
- Human-Computer Interaction and Information Retrieval (HCIIR), SS 2021
- Machine Learning tutorial, SS 2020
- Semanux
Semanux develops technologies that enable operating a computer through a combination of alternative input methods, largely eliminating the need for a mouse and keyboard. More info at www.semanux.com
- MICME
The MICME project aims to combine gesture recognition, eye tracking, voice control, and AR/VR technology into a system that can be used in the operating room.