IEEE Transactions on Affective Computing

Papers
(The H4-Index of IEEE Transactions on Affective Computing is 55. The table below lists the papers whose CrossRef citation counts are at or above that threshold [max. 250 papers]. It covers publications from the past four years, i.e., from 2021-06-01 to 2025-06-01.)
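For reference, an H4-Index of this kind is the h-index computed over the journal's publications from the trailing four-year window: the largest number h such that h of those papers each have at least h citations. The sketch below illustrates that computation in Python; the function name and the sample citation counts are illustrative only and are not taken from the source data.

```python
def h_index(citation_counts):
    """Return the largest h such that at least h papers have >= h citations each.

    Hypothetical helper for illustration; not part of the source page.
    """
    counts = sorted(citation_counts, reverse=True)  # highest-cited papers first
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank  # rank-th paper still has at least `rank` citations
        else:
            break
    return h

# Example with made-up counts: ranks 1-4 all have >= their rank, rank 5 does not.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Applied to the four-year citation counts behind this page, the same procedure yields the stated value of 55, which is why the table cuts off at papers with 55 citations.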
Article | Citations
Effects of Computerized Emotional Training on Children with High Functioning Autism | 841
Major Depressive Disorder Detection Using Graph Domain Adaptation With Global Message-Passing Based on EEG Signals | 452
From EEG to Eye Movements: Cross-modal Emotion Recognition Using Constrained Adversarial Network with Dual Attention | 437
Progressive Masking Oriented Self-Taught Learning for Occluded Facial Expression Recognition | 290
Mechanoreceptive Aβ Primary Afferents Discriminate Naturalistic Social Touch Inputs at a Functionally Relevant Time Scale | 264
Mouse-cursor Tracking: Simple Scoring Algorithms That Make It Work | 203
Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter | 199
Perceived Conversation Quality in Spontaneous Interactions | 197
Does Gamified Breath-Biofeedback Promote Adherence, Relaxation, and Skill Transfer in the Wild? | 196
Fusing of Electroencephalogram and Eye Movement With Group Sparse Canonical Correlation Analysis for Anxiety Detection | 176
Classifying Suicide-Related Content and Emotions on Twitter Using Graph Convolutional Neural Networks | 166
A High-Quality Landmarked Infrared Eye Video Dataset (IREye4Task): Eye Behaviors, Insights and Benchmarks for Wearable Mental State Analysis | 160
Towards Contrastive Context-Aware Conversational Emotion Recognition | 148
Facial Expression Recognition Using a Temporal Ensemble of Multi-Level Convolutional Neural Networks | 143
Vision Transformer With Attentive Pooling for Robust Facial Expression Recognition | 138
ECPEC: Emotion-Cause Pair Extraction in Conversations | 129
Automatic Estimation of Action Unit Intensities and Inference of Emotional Appraisals | 124
The Acoustically Emotion-Aware Conversational Agent With Speech Emotion Recognition and Empathetic Responses | 115
Sparse Emotion Dictionary and CWT Spectrogram Fusion With Multi-Head Self-Attention for Depression Recognition in Parkinson's Disease Patients | 114
Smart Affect Monitoring With Wearables in the Wild: An Unobtrusive Mood-Aware Emotion Recognition System | 108
EEG Microstates and fNIRS Metrics Reveal the Spatiotemporal Joint Neural Processing Features of Human Emotions | 94
An Efficient Framework for Constructing Speech Emotion Corpus Based on Integrated Active Learning Strategies | 92
Emotion Arousal Assessment Based on Multimodal Physiological Signals for Game Users | 90
A Classification Framework for Depressive Episode Using R-R Intervals From Smartwatch | 88
The Deep Method: Towards Computational Modeling of the Social Emotion Shame Driven by Theory, Introspection, and Social Signals | 85
When is a Haptic Message Like an Inside Joke? Digitally Mediated Emotive Communication Builds on Shared History | 84
Continuous Emotion Ambiguity Prediction: Modeling With Beta Distributions | 82
Using Circular Models to Improve Music Emotion Recognition | 81
Are 3D Face Shapes Expressive Enough for Recognising Continuous Emotions and Action Unit Intensities? | 78
Enhancing Cross-Dataset EEG Emotion Recognition: A Novel Approach with Emotional EEG Style Transfer Network | 78
Hierarchical Multiscale Recurrent Neural Networks for Detecting Suicide Notes | 77
Automatic Detection of Reflective Thinking in Mathematical Problem Solving Based on Unconstrained Bodily Exploration | 77
AM-ConvBLS: Adaptive Manifold Convolutional Broad Learning System for Cross-Session and Cross-Subject Emotion Recognition | 76
Transformer-Based Self-Supervised Multimodal Representation Learning for Wearable Emotion Recognition | 75
The Recognition of Multiple Anxiety Levels Based on Electroencephalograph | 73
Facial Image-Based Automatic Assessment of Equine Pain | 73
Subjective Fear in Virtual Reality: A Linear Mixed-Effects Analysis of Skin Conductance | 72
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | 71
Review on Psychological Stress Detection Using Biosignals | 70
A Multi-Componential Approach to Emotion Recognition and the Effect of Personality | 69
SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals | 68
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress | 66
LGSNet: A Two-Stream Network for Micro- and Macro-Expression Spotting With Background Modeling | 65
Learning Users Inner Thoughts and Emotion Changes for Social Media Based Suicide Risk Detection | 65
TFAGL: A Novel Agent Graph Learning Method Using Time-Frequency EEG For Major Depressive Disorder Detection | 64
Non-Invasive Measurement of Trust in Group Interactions | 62
Improved Video Emotion Recognition With Alignment of CNN and Human Brain Representations | 62
A Micro-Expression Recognition Network Based on Attention Mechanism and Motion Magnification | 60
Leveraging the Deep Learning Paradigm for Continuous Affect Estimation from Facial Expressions | 60
Effects of Algorithmic Transparency on User Experience and Physiological Responses in Affect-Aware Task Adaptation | 59
Dynamic Confidence-Aware Multi-Modal Emotion Recognition | 59
Towards Cyberbullying Detection: Building, Benchmarking and Longitudinal Analysis of Aggressiveness and Conflicts/Attacks Datasets from Twitter | 59
Age Against the Machine: How Age Relates to Listeners' Ability to Recognize Emotions in Robots' Semantic-Free Utterances | 58
An Enroll-to-Verify Approach for Cross-Task Unseen Emotion Class Recognition | 58
SMIN: Semi-Supervised Multi-Modal Interaction Network for Conversational Emotion Recognition | 56
SEED-VII: A Multimodal Dataset of Six Basic Emotions With Continuous Labels for Emotion Recognition | 55