Journal on Multimodal User Interfaces

Papers
(The median citation count of Journal on Multimodal User Interfaces is 1. The table below lists the papers above that threshold, based on CrossRef citation counts [max. 250 papers]. It covers publications from the past four years, i.e., from 2021-04-01 to 2025-04-01.)
Article | Citations
Comparing alternative modalities in the context of multimodal human–robot interaction | 48
Special issue on “User-centered advanced driver assistance systems (UCADAS)” | 16
Interactive exploration of a hierarchical spider web structure with sound | 14
A review on communication cues for augmented reality based remote guidance | 12
Correction to: Understanding virtual drilling perception using sound, and kinesthetic cues obtained with a mouse and keyboard | 10
Preliminary assessment of a multimodal electric-powered wheelchair simulator for training of activities of daily living | 10
SonAir: the design of a sonification of radar data for air traffic control | 9
Perceptually congruent sonification of auditory line charts | 8
Correction to: PepperOSC: enabling interactive sonification of a robot’s expressive movement | 7
Combining audio and visual displays to highlight temporal and spatial seismic patterns | 7
The effect of eye movement sonification on visual search patterns and anticipation in novices | 7
In-vehicle air gesture design: impacts of display modality and control orientation | 7
Importance of force feedback for following uneven virtual paths with a stylus | 6
Understanding virtual drilling perception using sound, and kinesthetic cues obtained with a mouse and keyboard | 5
Correction: Comparing head-mounted and handheld augmented reality for guided assembly | 5
Impact of communication modalities on social presence and regulation processes in a collaborative game | 5
A survey of challenges and methods for Quality of Experience assessment of interactive VR applications | 4
The effects of haptic, visual and olfactory augmentations on food consumed while wearing an extended reality headset | 4
What is good? Exploring the applicability of a one item measure as a proxy for measuring acceptance in driver-vehicle interaction studies | 4
Review of substitutive assistive tools and technologies for people with visual impairments: recent advancements and prospects | 4
Multimodal exploration in elementary music classroom | 3
Informing the design of a multisensory learning environment for elementary mathematics learning | 3
Combining haptics and inertial motion capture to enhance remote control of a dual-arm robot | 3
Correction to: A gaze-based interactive system to explore artwork imagery | 3
Three-dimensional sonification as a surgical guidance tool | 2
Pegasos: a framework for the creation of direct mobile coaching feedback systems | 2
Ipsilateral and contralateral warnings: effects on decision-making and eye movements in near-collision scenarios | 2
Commanding a drone through body poses, improving the user experience | 2
Truck drivers’ views on the road safety benefits of advanced driver assistance systems and Intelligent Transport Systems in Tanzania | 2
A social robot as your reading companion: exploring the relationships between gaze patterns and knowledge gains | 2
Human or robot? Exploring different avatar appearances to increase perceived security in shared automated vehicles | 2
Modelling the “transactive memory system” in multimodal multiparty interactions | 2
A low duration vibro-tactile representation of Braille characters | 2
Testing driver warning systems for off-road industrial vehicles using a cyber-physical simulator | 2
Gesture-based guidance for navigation in virtual environments | 2
Personality trait estimation in group discussions using multimodal analysis and speaker embedding | 2
Designing multi-purpose devices to enhance users’ perception of haptics | 2
Does mixed reality influence joint action? Impact of the mixed reality setup on users’ behavior and spatial interaction | 2
PepperOSC: enabling interactive sonification of a robot’s expressive movement | 1
Theory-based approach for assessing cognitive load during time-critical resource-managing human–computer interactions: an eye-tracking study | 1
The Audio-Corsi: an acoustic virtual reality-based technological solution for evaluating audio-spatial memory abilities | 1
The cognitive basis for virtual reality rehabilitation of upper-extremity motor function after neurotraumas | 1
AirWhisper: enhancing virtual reality experience via visual-airflow multimodal feedback | 1
Design and testing of (A)MICO: a multimodal feedback system to facilitate the interaction between cobot and human operator | 1
TapCAPTCHA: non-visual CAPTCHA on touchscreens for visually impaired people | 1