Journal on Multimodal User Interfaces

Papers
(The TQCC of Journal on Multimodal User Interfaces is 3. The table below lists the papers at or above that threshold, based on CrossRef citation counts [max. 250 papers]. It covers publications from the past four years, i.e., 2021-08-01 to 2025-08-01.)
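As a minimal sketch of this selection step: the snippet below assumes TQCC is the top-quartile (75th-percentile) citation count and keeps papers at or above it, capped at 250 entries. The records, field names, and counts are illustrative placeholders, not the journal's actual data or the site's actual pipeline.

```python
import math

def tqcc(counts):
    """Top-quartile citation count via the nearest-rank 75th percentile."""
    ordered = sorted(counts)
    rank = math.ceil(0.75 * len(ordered))  # nearest-rank position in the sorted list
    return ordered[rank - 1]

def select_papers(papers, threshold, limit=250):
    """Keep papers at or above the threshold, most cited first, capped at `limit`."""
    kept = [p for p in papers if p["citations"] >= threshold]
    kept.sort(key=lambda p: p["citations"], reverse=True)
    return kept[:limit]

# Hypothetical records; real counts would come from CrossRef.
papers = [
    {"title": "Paper A", "citations": 59},
    {"title": "Paper B", "citations": 12},
    {"title": "Paper C", "citations": 3},
    {"title": "Paper D", "citations": 1},
]

threshold = tqcc([p["citations"] for p in papers])
for p in select_papers(papers, threshold):
    print(f"{p['title']} | {p['citations']}")
```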
Article | Citations
Gesture-based guidance for navigation in virtual environments | 59
Human or robot? Exploring different avatar appearances to increase perceived security in shared automated vehicles | 29
Combining haptics and inertial motion capture to enhance remote control of a dual-arm robot | 17
Vis-Assist: computer vision and haptic feedback-based wearable assistive device for visually impaired | 14
A study on the attention of people with low vision to accessibility guidance signs | 13
A low duration vibro-tactile representation of Braille characters | 13
TapCAPTCHA: non-visual CAPTCHA on touchscreens for visually impaired people | 12
The Audio-Corsi: an acoustic virtual reality-based technological solution for evaluating audio-spatial memory abilities | 9
Prediction of pedestrian crossing behaviour at unsignalized intersections using machine learning algorithms: analysis and comparison | 8
Assessment of comparative evaluation techniques for signing agents: a study with deaf adults | 7
Comparing head-mounted and handheld augmented reality for guided assembly | 6
Importance of force feedback for following uneven virtual paths with a stylus | 6
TapCalculator: nonvisual touchscreen calculator for visually impaired people preliminary user study | 6
HaM3D: generalized XR-based multimodal HRI framework with haptic feedback for industry 4.0 | 6
Correction: Comparing head-mounted and handheld augmented reality for guided assembly | 5
What is good? Exploring the applicability of a one item measure as a proxy for measuring acceptance in driver-vehicle interaction studies | 5
A survey of challenges and methods for Quality of Experience assessment of interactive VR applications | 4
Exploring User Interactions with Commercial Machines via Real-world Application Logs in the Lab | 4
Ipsilateral and contralateral warnings: effects on decision-making and eye movements in near-collision scenarios | 4
Three-dimensional sonification as a surgical guidance tool | 4
In-vehicle nudging for increased Adaptive Cruise Control use: a field study | 3
SonAir: the design of a sonification of radar data for air traffic control | 3
A SLAM-based augmented reality app for the assessment of spatial short-term memory using visual and auditory stimuli | 3
Comparing alternative modalities in the context of multimodal human–robot interaction | 3
Identification of visual stimuli is improved by accompanying auditory stimuli through directing eye movement: an investigation in perceptual-cognitive skills | 3
The cognitive basis for virtual reality rehabilitation of upper-extremity motor function after neurotraumas | 3
Correction to: PepperOSC: enabling interactive sonification of a robot’s expressive movement | 3
Exploring visual stimuli as a support for novices’ creative engagement with digital musical interfaces | 3