IEEE Transactions on Affective Computing

Papers
(The H4-Index of IEEE Transactions on Affective Computing is 52. The table below lists the papers above that threshold, based on CrossRef citation counts [max. 250 papers]. It covers publications from the past four years, i.e., from 2021-04-01 to 2025-04-01.)
Article | Citations
2021 Index IEEE Transactions on Affective Computing Vol. 12 | 792
Progressive Masking Oriented Self-Taught Learning for Occluded Facial Expression Recognition | 421
A Comparative Data-Driven Study of Intensity-Based Categorical Emotion Representations for MER | 413
Does Gamified Breath-Biofeedback Promote Adherence, Relaxation, and Skill Transfer in the Wild? | 330
Cross-Domain Sentiment Analysis via Disentangled Representation and Prototypical Learning | 264
Mechanoreceptive Aβ Primary Afferents Discriminate Naturalistic Social Touch Inputs at a Functionally Relevant Time Scale | 245
Multimodal Sentimental Privileged Information Embedding for Improving Facial Expression Recognition | 236
PeTracker: Poincaré-Based Dual-Strategy Emotion Tracker for Emotion Recognition in Conversation | 205
A Region Group Adaptive Attention Model For Subtle Expression Recognition | 193
Capturing Emotion Distribution for Multimedia Emotion Tagging | 180
Affective Dynamics: Principal Motion Analysis of Temporal Dominance of Sensations and Emotions Data | 179
LineConGraphs: Line Conversation Graphs for Effective Emotion Recognition using Graph Neural Networks | 174
Multi-Scale Hyperbolic Contrastive Learning for Cross-Subject EEG Emotion Recognition | 163
Multi-Modal Sarcasm Detection and Humor Classification in Code-Mixed Conversations | 154
Text-guided Reconstruction Network for Sentiment Analysis with Uncertain Missing Modalities | 145
Mouse-cursor Tracking: Simple Scoring Algorithms That Make It Work | 136
Unconstrained Facial Expression Recognition With No-Reference De-Elements Learning | 134
Variational Instance-Adaptive Graph for EEG Emotion Recognition | 133
First Impressions: A Survey on Vision-Based Apparent Personality Trait Analysis | 129
Multiple Instance Learning for Emotion Recognition Using Physiological Signals | 117
SCARE: A Novel Framework to Enhance Chinese Harmful Memes Detection | 114
VyaktitvaNirdharan: Multimodal Assessment of Personality and Trait Emotional Intelligence | 106
Sparse Emotion Dictionary and CWT Spectrogram Fusion with Multi-head Self-Attention for Depression Recognition in Parkinson's Disease Patients | 96
Bodily Sensation Map vs. Bodily Motion Map: Visualizing and Analyzing Emotional Body Motions | 94
IMGWOFS: A Feature Selector with Trade-off between Conflict Objectives for EEG-based Emotion Recognition | 86
Hierarchical Knowledge Stripping for Multimodal Sentiment Analysis | 80
Individual-Aware Attention Modulation for Unseen Speaker Emotion Recognition | 80
A Residual Multi-Scale Convolutional Neural Network with Transformers for Speech Emotion Recognition | 76
Affective Computing Databases: In-depth Analysis of Systematic Reviews and Surveys | 76
Major Depressive Disorder Detection Using Graph Domain Adaptation With Global Message-Passing Based on EEG Signals | 76
Facial Expressions of Comprehension (FEC) | 75
From EEG to Eye Movements: Cross-modal Emotion Recognition Using Constrained Adversarial Network with Dual Attention | 75
Facial Expression Recognition Using a Temporal Ensemble of Multi-Level Convolutional Neural Networks | 74
Modeling Emotion in Complex Stories: The Stanford Emotional Narratives Dataset | 74
Emotion Recognition and EEG Analysis Using ADMM-Based Sparse Group Lasso | 71
“Emotions are the Great Captains of Our Lives”: Measuring Moods Through the Power of Physiological and Environmental Sensing | 71
Early Detection of User Engagement Breakdown in Spontaneous Human-Humanoid Interaction | 70
Automatic Estimation of Action Unit Intensities and Inference of Emotional Appraisals | 69
Speech-Driven Expressive Talking Lips with Conditional Sequential Generative Adversarial Networks | 68
Multimodal Self-Assessed Personality Estimation During Crowded Mingle Scenarios Using Wearables Devices and Cameras | 68
Effects of Computerized Emotional Training on Children with High Functioning Autism | 65
Doing and Feeling: Relationships Between Moods, Productivity and Task-Switching | 64
Receiving a Mediated Touch From Your Partner vs. a Male Stranger: How Visual Feedback of Touch and Its Sender Influence Touch Experience | 63
Collecting Mementos: A Multimodal Dataset for Context-Sensitive Modeling of Affect and Memory Processing in Responses to Videos | 63
The Mediating Effect of Emotions on Trust in the Context of Automated System Usage | 60
Exploiting Evolutionary Algorithms to Model Nonverbal Reactions to Conversational Interruptions in User-Agent Interactions | 59
Unsupervised Cross-Corpus Speech Emotion Recognition Using a Multi-Source Cycle-GAN | 59
Emotion Dependent Domain Adaptation for Speech Driven Affective Facial Feature Synthesis | 58
Fusing of Electroencephalogram and Eye Movement With Group Sparse Canonical Correlation Analysis for Anxiety Detection | 58
Towards a Prediction and Data Driven Computational Process Model of Emotion | 56
BReG-NeXt: Facial Affect Computing Using Adaptive Residual Networks With Bounded Gradient | 54
Mutual Information Based Fusion Model (MIBFM): Mild Depression Recognition Using EEG and Pupil Area Signals | 53
Capturing Interaction Quality in Long Duration (Simulated) Space Missions With Wearables | 52
Incorporating Forthcoming Events and Personality Traits in Social Media Based Stress Prediction | 52
To Be or Not to Be in Flow at Work: Physiological Classification of Flow Using Machine Learning | 52
TSSRD: A Topic Sentiment Summarization Framework Based on Reaching Definition | 52
A Framework to Model and Control the State of Presence in Virtual Reality Systems | 52
Automatic Estimation of Self-Reported Pain by Trajectory Analysis in the Manifold of Fixed Rank Positive Semi-Definite Matrices | 52