INFOMMMI (Multimodal Interaction)
3rd quarter, 2025/2026, timeslot D, 7.5 ECTS
Lectures 1-4 will be given by Peter Werkhoven and address human visual, auditory, and tactile perception and how its potential can be used in designing novel interfaces for interacting with virtual worlds. Abdallah "Abdo" El Ali will address Augmented Reality and related applications in lectures 5-7.
topics
The lectures will address the following topics:
- some basics of visual, auditory and tactile perception and the effects of combining them (multimodal presentation and related interface design guidelines)
- visual communication interfaces in virtual worlds (effects of non-verbal facial communication)
- navigation interfaces in virtual worlds (head tracked visualization and 'cyber sickness')
- manipulation in virtual worlds (traditional mouse-cursor interfaces versus virtual hand control)
- virtual worlds through mobile displays (scrolling interfaces versus virtual windows)
- emerging interface technology (synaesthetic media and brain-machine interfaces)
- augmented reality (technologies, human factors, interaction)
Slides from the lectures
Introduction (A. El Ali)
- Introduction slides (PDF)
- Video 1: National Geographic - Brain Games - The rubber hand illusion
- Video 2: Multimodal (visual & tactile) VR experiments
Lecture 2: Hearing, touch, multimodal (P. Werkhoven)
Lecture 3: VR technology, human factors (P. Werkhoven)
Lecture 4: Emotions, BCI & cyborgs (P. Werkhoven)
- To be uploaded.
- To be uploaded.
- To be uploaded.
- To be uploaded.
- Q&A session slides to be uploaded.
papers to read
Multimodal perception: (*)
- Paper 1: Ernst, M.O., & Bülthoff, H.H. (2004). Merging the senses into a robust percept. Trends in Cognitive Sciences, 8(4), 162-169. PDF (password protected)
- Paper 2: Petkova, V.I., & Ehrsson, H.H. (2008). If I Were You: Perceptual Illusion of Body Swapping. PLoS ONE, 3(12), e3832. doi:10.1371/journal.pone.0003832. PDF (password protected)
- Paper 3: Wolbers, T., & Hegarty, M. (2010). What determines our navigational abilities? Trends in Cognitive Sciences, 14(3), 138-146. PDF (password protected)
- Paper 4: Toet et al. (2022). Towards a multiscale QoE assessment of mediated social communication. Quality and User Experience, 7, 4.
- Paper 5: Veelen, N. van, et al. (2021). Tailored Immersion: Implementing Personalized Components Into Virtual Reality for Veterans With Post-Traumatic Stress Disorder. Frontiers in Virtual Reality, 2 (article 740795).
- Paper 6: Lee, V.K., Nau, A.C., Laymon, C., Chan, K.C., Rosario, B.L., & Fisher, C. (2014). Successful tactile based visual sensory substitution use functions independently of visual pathway integrity. Frontiers in Human Neuroscience, 8, 291.
- Paper 7: R. Skarbez et al. (2021). Revisiting Milgram and Kishino's Reality-Virtuality Continuum. Frontiers in Virtual Reality, 2. PDF (password protected).
- Paper 8: J. Bailenson et al. (2024). Seeing the World through Digital Prisms: Psychological Implications of Passthrough Video Usage in Mixed Reality. Journal of Technology, Mind, and Behavior. PDF (password protected). See also the accompanying video.
- Paper 9: J. Lee et al. (2022). User Preference for Navigation Instructions in Mixed Reality, IEEE VR 2022 conference. PDF (password protected). See also the accompanying video.
exams from previous years (part 2, lectures by W. Hürst)
The exam will be a closed-book digital exam. For part 1 (lectures 1-4 by Peter Werkhoven), please refer to the example questions that Peter showed in his first lecture. For part 2 (lectures 5-7 by Abdallah El Ali), you can find the exam questions of previous years below.
Important: Be aware that the 2020 and 2021 exams were open-book exams (due to COVID, they were taken from home). Also, different aspects have been covered in the lectures of previous years. Finally, before 2021, parts 1 and 2 were weighted 60-40; since 2021, the weighting is 50-50.
- Exam 2024 (part 2)
- Exam 2023 (part 2)
- Exam 2022 (part 2)
- Exam 2021 (part 2)
- Exam 2020 (part 2)
- Exam 2019 (part 2)
- Exam 2018 (part 2)
- Exam 2017 (part 2)
- Exam 2016 (part 2)