IEEE Transactions on Affective Computing

Papers
(The H4-Index of IEEE Transactions on Affective Computing is 58. The table below lists the papers at or above that threshold, based on CrossRef citation counts [max. 250 papers]. It covers publications from the past four years, i.e., from 2021-11-01 to 2025-11-01.)
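The H4-Index is simply the h-index restricted to the journal's publications from the last four years: the largest number h such that at least h of those papers have received at least h citations each. As a minimal illustrative sketch (not taken from the source; the function name h_index and the sample counts are hypothetical), the calculation can be written in Python as follows:

```python
# Minimal sketch of an h-index calculation from a list of citation counts.
# The H4-Index applies the same rule, restricted to papers from the last four years.

def h_index(citations: list[int]) -> int:
    """Return the largest h such that at least h papers have >= h citations."""
    ranked = sorted(citations, reverse=True)  # highest-cited first
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:   # this paper still supports an index of `rank`
            h = rank
        else:               # counts are sorted, so no later paper can help
            break
    return h

# Hypothetical example: three papers with at least 3 citations each -> h = 3.
print(h_index([10, 5, 3, 1]))  # prints 3
```

Applied to the 59 citation counts in the table below (the 58th-ranked paper has 60 citations, the 59th has 58), this rule returns 58, matching the stated H4-Index.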
Article | Citations
CAETFN: Context Adaptively Enhanced Text-Guided Fusion Network for Multimodal Sentiment Analysis | 977
ATTSF-Net: Attention-based Similarity Fusion Network for Audio-Visual Emotion Recognition | 547
Mechanoreceptive Aβ Primary Afferents Discriminate Naturalistic Social Touch Inputs at a Functionally Relevant Time Scale | 501
ECPEC: Emotion-Cause Pair Extraction in Conversations | 355
Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter | 282
From EEG to Eye Movements: Cross-Modal Emotion Recognition Using Constrained Adversarial Network With Dual Attention | 241
Mouse-Cursor Tracking: Simple Scoring Algorithms That Make it Work | 238
Major Depressive Disorder Detection Using Graph Domain Adaptation With Global Message-Passing Based on EEG Signals | 236
Progressive Masking Oriented Self-Taught Learning for Occluded Facial Expression Recognition | 230
Classifying Suicide-Related Content and Emotions on Twitter Using Graph Convolutional Neural Networks | 207
Automatic Estimation of Action Unit Intensities and Inference of Emotional Appraisals | 203
CausalSymptom: Learning Causal Disentangled Representation for Depression Severity Estimation on Transcribed Clinical Interviews | 194
Facial Expression Recognition Using a Temporal Ensemble of Multi-Level Convolutional Neural Networks | 168
Does Gamified Breath-Biofeedback Promote Adherence, Relaxation, and Skill Transfer in the Wild? | 158
Sparse Emotion Dictionary and CWT Spectrogram Fusion With Multi-Head Self-Attention for Depression Recognition in Parkinson's Disease Patients | 155
DGC-Link: Dual-Gate Chebyshev Linkage Network on EEG Emotion Recognition | 143
Fusing of Electroencephalogram and Eye Movement With Group Sparse Canonical Correlation Analysis for Anxiety Detection | 141
Perceived Conversation Quality in Spontaneous Interactions | 133
A High-Quality Landmarked Infrared Eye Video Dataset (IREye4Task): Eye Behaviors, Insights and Benchmarks for Wearable Mental State Analysis | 123
The Acoustically Emotion-Aware Conversational Agent With Speech Emotion Recognition and Empathetic Responses | 112
Towards Contrastive Context-Aware Conversational Emotion Recognition | 111
Analyzing Emotions and Engagement During Cognitive Stimulation Group Training with the Pepper Robot | 110
Vision Transformer With Attentive Pooling for Robust Facial Expression Recognition | 110
Automatic Detection of Reflective Thinking in Mathematical Problem Solving Based on Unconstrained Bodily Exploration | 109
AM-ConvBLS: Adaptive Manifold Convolutional Broad Learning System for Cross-Session and Cross-Subject Emotion Recognition | 105
Enhancing Cross-Dataset EEG Emotion Recognition: A Novel Approach With Emotional EEG Style Transfer Network | 102
Brain-Machine Enhanced Intelligence for Semi-Supervised Facial Emotion Recognition | 101
Subjective Fear in Virtual Reality: A Linear Mixed-Effects Analysis of Skin Conductance | 98
An Efficient Framework for Constructing Speech Emotion Corpus Based on Integrated Active Learning Strategies | 95
A Classification Framework for Depressive Episode Using R-R Intervals From Smartwatch | 94
Continuous Emotion Ambiguity Prediction: Modeling With Beta Distributions | 91
MPRNet: a Temporal-Aware Cross-Modal Encoding Framework for Personality Recognition | 90
The Diagnosis Method of Major Depressive Disorder Using Wavelet Coherence and State-Pathology Separation Network | 89
Facial Image-Based Automatic Assessment of Equine Pain | 86
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress | 85
Smart Affect Monitoring With Wearables in the Wild: An Unobtrusive Mood-Aware Emotion Recognition System | 84
Emotion Arousal Assessment Based on Multimodal Physiological Signals for Game Users | 84
SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals | 83
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | 82
The Recognition of Multiple Anxiety Levels Based on Electroencephalograph | 82
The Deep Method: Towards Computational Modeling of the Social Emotion Shame Driven by Theory, Introspection, and Social Signals | 81
Are 3D Face Shapes Expressive Enough for Recognising Continuous Emotions and Action Unit Intensities? | 81
When is a Haptic Message Like an Inside Joke? Digitally Mediated Emotive Communication Builds on Shared History | 81
Transformer-Based Self-Supervised Multimodal Representation Learning for Wearable Emotion Recognition | 79
Hierarchical Multiscale Recurrent Neural Networks for Detecting Suicide Notes | 79
PainFormer: a Vision Foundation Model for Automatic Pain Assessment | 77
A Multi-Componential Approach to Emotion Recognition and the Effect of Personality | 76
Learning Users Inner Thoughts and Emotion Changes for Social Media Based Suicide Risk Detection | 70
EEG Microstates and fNIRS Metrics Reveal the Spatiotemporal Joint Neural Processing Features of Human Emotions | 69
Review on Psychological Stress Detection Using Biosignals | 68
GHA: A Gated Hierarchical Attention Mechanism for the Detection of Abusive Language in Social Media | 67
Computation of Sensory-Affective Relationships Depending on Material Categories of Pictorial Stimuli | 67
Effects of Algorithmic Transparency on User Experience and Physiological Responses in Affect-Aware Task Adaptation | 64
Towards Cyberbullying Detection: Building, Benchmarking and Longitudinal Analysis of Aggressiveness and Conflicts/Attacks Datasets From Twitter | 63
TFAGL: A Novel Agent Graph Learning Method Using Time-Frequency EEG for Major Depressive Disorder Detection | 62
miMamba: EEG-based Emotion Recognition with Multi-scale Inverted Mamba Models | 61
SEED-VII: A Multimodal Dataset of Six Basic Emotions With Continuous Labels for Emotion Recognition | 61
Leveraging the Deep Learning Paradigm for Continuous Affect Estimation from Facial Expressions | 60
Non-Invasive Measurement of Trust in Group Interactions | 58