Journal on Multimodal User Interfaces

Papers
(The TQCC of Journal on Multimodal User Interfaces is 4. The table below lists the papers above that threshold based on CrossRef citation counts [max. 250 papers]. The list covers publications from the past four years, i.e., from 2021-11-01 to 2025-11-01.)
Article | Citations
Human or robot? Exploring different avatar appearances to increase perceived security in shared automated vehicles | 66
Gesture-based guidance for navigation in virtual environments | 36
Combining haptics and inertial motion capture to enhance remote control of a dual-arm robot | 19
Vis-Assist: computer vision and haptic feedback-based wearable assistive device for visually impaired | 18
An overview study on the use of Semantics in Immersive Environments | 16
A low duration vibro-tactile representation of Braille characters | 14
The Audio-Corsi: an acoustic virtual reality-based technological solution for evaluating audio-spatial memory abilities | 10
TapCAPTCHA: non-visual CAPTCHA on touchscreens for visually impaired people | 10
Prediction of pedestrian crossing behaviour at unsignalized intersections using machine learning algorithms: analysis and comparison | 8
Assessment of comparative evaluation techniques for signing agents: a study with deaf adults | 7
A study on the attention of people with low vision to accessibility guidance signs | 7
Comparing head-mounted and handheld augmented reality for guided assembly | 7
HaM3D: generalized XR-based multimodal HRI framework with haptic feedback for industry 4.0 | 6
Studying human modality preferences in a human-drone framework for secondary task selection | 6
Correction: Comparing head-mounted and handheld augmented reality for guided assembly | 5
Three-dimensional sonification as a surgical guidance tool | 5
What is good? Exploring the applicability of a one item measure as a proxy for measuring acceptance in driver-vehicle interaction studies | 5
Ipsilateral and contralateral warnings: effects on decision-making and eye movements in near-collision scenarios | 5
A survey of challenges and methods for Quality of Experience assessment of interactive VR applications | 5
Exploring User Interactions with Commercial Machines via Real-world Application Logs in the Lab | 5
Exploring visual stimuli as a support for novices’ creative engagement with digital musical interfaces | 4
Identification of visual stimuli is improved by accompanying auditory stimuli through directing eye movement: an investigation in perceptual-cognitive skills | 4
The cognitive basis for virtual reality rehabilitation of upper-extremity motor function after neurotraumas | 4
A SLAM-based augmented reality app for the assessment of spatial short-term memory using visual and auditory stimuli | 4
In-vehicle nudging for increased Adaptive Cruise Control use: a field study | 4