IEEE Transactions on Affective Computing

Papers
(The median citation count for IEEE Transactions on Affective Computing is 6. The table below lists the papers whose CrossRef citation counts exceed that threshold [max. 250 papers]. Only publications from the past four years, i.e., from 2021-12-01 to 2025-12-01, are included.)
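For readers who want to reproduce this cut-off on their own export, the selection rule is: keep papers whose CrossRef citation count exceeds the journal median (6), restrict to the stated four-year window, and cap the list at 250 entries, sorted by citation count. A minimal Python sketch is below; the input file name and the record fields (`title`, `citations`, `published`) are illustrative assumptions, not part of any official CrossRef export format.

```python
# Illustrative sketch: filter a CrossRef-style export the way this table was built.
# Assumes a JSON list of records with hypothetical fields "title", "citations", "published".
import json
from datetime import date

MEDIAN_CITATIONS = 6                              # journal median reported above
WINDOW = (date(2021, 12, 1), date(2025, 12, 1))   # four-year publication window
MAX_PAPERS = 250                                  # cap on the listed papers

with open("tac_papers.json") as f:                # hypothetical export file
    papers = json.load(f)

selected = [
    p for p in papers
    if p["citations"] > MEDIAN_CITATIONS
    and WINDOW[0] <= date.fromisoformat(p["published"]) <= WINDOW[1]
]

# Sort by citation count, highest first, and cap the list length.
selected.sort(key=lambda p: p["citations"], reverse=True)
for p in selected[:MAX_PAPERS]:
    print(f'{p["title"]} | {p["citations"]}')
```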
Article | Citations
From EEG to Eye Movements: Cross-Modal Emotion Recognition Using Constrained Adversarial Network With Dual Attention | 1008
Mouse-Cursor Tracking: Simple Scoring Algorithms That Make it Work | 580
Major Depressive Disorder Detection Using Graph Domain Adaptation With Global Message-Passing Based on EEG Signals | 512
CausalSymptom: Learning Causal Disentangled Representation for Depression Severity Estimation on Transcribed Clinical Interviews | 369
Facial Expression Recognition Using a Temporal Ensemble of Multi-Level Convolutional Neural Networks | 300
Does Gamified Breath-Biofeedback Promote Adherence, Relaxation, and Skill Transfer in the Wild? | 258
A High-Quality Landmarked Infrared Eye Video Dataset (IREye4Task): Eye Behaviors, Insights and Benchmarks for Wearable Mental State Analysis | 250
Fusing of Electroencephalogram and Eye Movement With Group Sparse Canonical Correlation Analysis for Anxiety Detection | 242
Vision Transformer With Attentive Pooling for Robust Facial Expression Recognition | 240
Sparse Emotion Dictionary and CWT Spectrogram Fusion With Multi-Head Self-Attention for Depression Recognition in Parkinson's Disease Patients | 217
DGC-Link: Dual-Gate Chebyshev Linkage Network on EEG Emotion Recognition | 215
ATTSF-Net: Attention-Based Similarity Fusion Network for Audio-Visual Emotion Recognition | 211
CAETFN: Context Adaptively Enhanced Text-Guided Fusion Network for Multimodal Sentiment Analysis | 177
Perceived Conversation Quality in Spontaneous Interactions | 168
Towards Contrastive Context-Aware Conversational Emotion Recognition | 159
Classifying Suicide-Related Content and Emotions on Twitter Using Graph Convolutional Neural Networks | 158
Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter | 144
Automatic Estimation of Action Unit Intensities and Inference of Emotional Appraisals | 144
Mechanoreceptive Aβ Primary Afferents Discriminate Naturalistic Social Touch Inputs at a Functionally Relevant Time Scale | 137
ECPEC: Emotion-Cause Pair Extraction in Conversations | 125
Progressive Masking Oriented Self-Taught Learning for Occluded Facial Expression Recognition | 123
Analyzing Emotions and Engagement During Cognitive Stimulation Group Training with the Pepper Robot | 116
The Acoustically Emotion-Aware Conversational Agent With Speech Emotion Recognition and Empathetic Responses | 116
Automatic Detection of Reflective Thinking in Mathematical Problem Solving Based on Unconstrained Bodily Exploration | 115
Enhancing Cross-Dataset EEG Emotion Recognition: A Novel Approach With Emotional EEG Style Transfer Network | 113
Brain-Machine Enhanced Intelligence for Semi-Supervised Facial Emotion Recognition | 111
A Classification Framework for Depressive Episode Using R-R Intervals From Smartwatch | 109
Real-World Classification of Student Stress and Fatigue Using Wearable PPG Recordings | 105
A Multi-Componential Approach to Emotion Recognition and the Effect of Personality | 99
Continuous Emotion Ambiguity Prediction: Modeling With Beta Distributions | 98
Smart Affect Monitoring With Wearables in the Wild: An Unobtrusive Mood-Aware Emotion Recognition System | 96
The Deep Method: Towards Computational Modeling of the Social Emotion Shame Driven by Theory, Introspection, and Social Signals | 95
When is a Haptic Message Like an Inside Joke? Digitally Mediated Emotive Communication Builds on Shared History | 93
EEG Microstates and fNIRS Metrics Reveal the Spatiotemporal Joint Neural Processing Features of Human Emotions | 92
Emotion Arousal Assessment Based on Multimodal Physiological Signals for Game Users | 91
Subjective Fear in Virtual Reality: A Linear Mixed-Effects Analysis of Skin Conductance | 91
An Efficient Framework for Constructing Speech Emotion Corpus Based on Integrated Active Learning Strategies | 89
Are 3D Face Shapes Expressive Enough for Recognising Continuous Emotions and Action Unit Intensities? | 88
Hierarchical Multiscale Recurrent Neural Networks for Detecting Suicide Notes | 87
Learning Users Inner Thoughts and Emotion Changes for Social Media Based Suicide Risk Detection | 86
AM-ConvBLS: Adaptive Manifold Convolutional Broad Learning System for Cross-Session and Cross-Subject Emotion Recognition | 85
The Diagnosis Method of Major Depressive Disorder Using Wavelet Coherence and State-Pathology Separation Network | 83
PainFormer: A Vision Foundation Model for Automatic Pain Assessment | 83
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | 81
Facial Image-Based Automatic Assessment of Equine Pain | 81
MPRNet: A Temporal-Aware Cross-Modal Encoding Framework for Personality Recognition | 81
The Recognition of Multiple Anxiety Levels Based on Electroencephalograph | 80
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress | 75
Transformer-Based Self-Supervised Multimodal Representation Learning for Wearable Emotion Recognition | 72
Review on Psychological Stress Detection Using Biosignals | 71
SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals | 71
GHA: A Gated Hierarchical Attention Mechanism for the Detection of Abusive Language in Social Media | 68
Computation of Sensory-Affective Relationships Depending on Material Categories of Pictorial Stimuli | 66
Effects of Algorithmic Transparency on User Experience and Physiological Responses in Affect-Aware Task Adaptation | 65
Leveraging the Deep Learning Paradigm for Continuous Affect Estimation from Facial Expressions | 64
Towards Cyberbullying Detection: Building, Benchmarking and Longitudinal Analysis of Aggressiveness and Conflicts/Attacks Datasets From Twitter | 64
Non-Invasive Measurement of Trust in Group Interactions | 63
Age Against the Machine: How Age Relates to Listeners' Ability to Recognize Emotions in Robots' Semantic-Free Utterances | 62
An Enroll-to-Verify Approach for Cross-Task Unseen Emotion Class Recognition | 61
MECA: Manipulation With Emotional Intensity-Aware Contrastive Learning and Attention-Based Discriminative Learning | 61
Fusion and Discrimination: A Multimodal Graph Contrastive Learning Framework for Multimodal Sarcasm Detection | 60
Improved Video Emotion Recognition With Alignment of CNN and Human Brain Representations | 60
Rethinking Emotion Annotations in the Era of Large Language Models | 60
TFAGL: A Novel Agent Graph Learning Method Using Time-Frequency EEG for Major Depressive Disorder Detection | 58
SEED-VII: A Multimodal Dataset of Six Basic Emotions With Continuous Labels for Emotion Recognition | 58
miMamba: EEG-Based Emotion Recognition With Multi-Scale Inverted Mamba Models | 58
A Micro-Expression Recognition Network Based on Attention Mechanism and Motion Magnification | 58
Dynamic Confidence-Aware Multi-Modal Emotion Recognition | 57
LGSNet: A Two-Stream Network for Micro- and Macro-Expression Spotting With Background Modeling | 57
SMIN: Semi-Supervised Multi-Modal Interaction Network for Conversational Emotion Recognition | 57
Guest Editorial Extremely Low-Resource Autonomous Affective Learning | 56
Long Short-Term Memory Network Based Unobtrusive Workload Monitoring With Consumer Grade Smartwatches | 55
Nonverbal Leadership in Joint Full-Body Improvisation | 55
Annotate Smarter, not Harder: Using Active Learning to Reduce Emotional Annotation Effort | 54
AGILE: Attribute-Guided Identity Independent Learning for Facial Expression Recognition | 54
Emotion Recognition Using Affective Touch: A Survey | 53
Guest Editorial: Special Issue on Affective Speech and Language Synthesis, Generation, and Conversion | 52
How many raters do we need? Analyses of uncertainty in estimating ambiguity-aware emotion labels | 52
Exploring Complexity of Facial Dynamics in Autism Spectrum Disorder | 51
Partial Label Learning for Emotion Recognition From EEG | 51
Hierarchical Shared Encoder With Task-Specific Transformer Layer Selection for Emotion-Cause Pair Extraction | 51
A New Perspective on Stress Detection: An Automated Approach for Detecting Eustress and Distress | 51
Group Synchrony for Emotion Recognition Using Physiological Signals | 51
Facial Depression Recognition by Deep Joint Label Distribution and Metric Learning | 51
Few-Shot Learning in Emotion Recognition of Spontaneous Speech Using a Siamese Neural Network With Adaptive Sample Pair Formation | 51
Modeling Category Semantic and Sentiment Knowledge for Aspect-Level Sentiment Analysis | 50
Objective Class-Based Micro-Expression Recognition Under Partial Occlusion Via Region-Inspired Relation Reasoning Network | 49
Dynamic Micro-Expression Recognition Using Knowledge Distillation | 49
Multi-Party Conversation Modeling for Emotion Recognition | 46
Multi-Label and Multimodal Classifier for Affective States Recognition in Virtual Rehabilitation | 46
An Analysis of Physiological and Psychological Responses in Virtual Reality and Flat Screen Gaming | 46
Dynamical Causal Graph Neural Network for EEG Emotion Recognition | 45
Prediction of Depression Severity Based on the Prosodic and Semantic Features With Bidirectional LSTM and Time Distributed CNN | 45
Multimodal Deception Detection Using Real-Life Trial Data | 45
Domain-Incremental Continual Learning for Mitigating Bias in Facial Expression and Action Unit Recognition | 44
Emotion Transition Recognition Using Multimodal Physiological Signal Fusion | 43
From Regional to Global Brain: A Novel Hierarchical Spatial-Temporal Neural Network Model for EEG Emotion Recognition | 43
STAA-Net: A Sparse and Transferable Adversarial Attack for Speech Emotion Recognition | 42
Methodology to Assess Quality, Presence, Empathy, Attitude, and Attention in 360-degree Videos for Immersive Communications | 42
State-Specific and Supraordinal Components of Facial Response to Pain | 42
Aspect-Based Sentiment Quantification | 42
EEG Feature Selection via Global Redundancy Minimization for Emotion Recognition | 41
Classification of Interbeat Interval Time-Series Using Attention Entropy | 41
FERMixNet: An Occlusion Robust Facial Expression Recognition Model With Facial Mixing Augmentation and Mid-Level Representation Learning | 41
Emotion Expression in Human Body Posture and Movement: A Survey on Intelligible Motion Factors, Quantification and Validation | 41
EmoTake: Exploring Drivers' Emotion for Takeover Behavior Prediction | 40
Text-Based Fine-Grained Emotion Prediction | 40
Deep Adaptation of Adult-Child Facial Expressions by Fusing Landmark Features | 40
MoDE: Improving Mixture of Depression Experts with Mutual Information Estimator for Depression Detection | 40
Hierarchical Encoding and Fusion of Brain Functions for Depression Subtype Classification | 40
Contradicted by the Brain: Predicting Individual and Group Preferences via Brain-Computer Interfacing | 39
The Rhythm of Flow: Detecting Facial Expressions of Flow Experiences Using CNNs | 39
Estimating Affective Taste Experience Using Combined Implicit Behavioral and Neurophysiological Measures | 39
MCGC-Net: Multi-scale Controllable Graph Convolutional Network on Music Emotion Recognition | 38
Ordinal Logistic Regression With Partial Proportional Odds for Depression Prediction | 38
Micro and Macro Facial Expression Recognition Using Advanced Local Motion Patterns | 38
Leveraging Social Media for Real-Time Interpretable and Amendable Suicide Risk Prediction With Human-in-The-Loop | 38
A Dual-Branch Dynamic Graph Convolution Based Adaptive TransFormer Feature Fusion Network for EEG Emotion Recognition | 38
Datasets of Smartphone Modalities for Depression Assessment: A Scoping Review | 38
Public Opinion Crisis Management via Social Media Mining | 38
MTADA: A Multi-Task Adversarial Domain Adaptation Network for EEG-Based Cross-Subject Emotion Recognition | 38
A Psychologically Inspired Fuzzy Cognitive Deep Learning Framework to Predict Crowd Behavior | 37
Theory of Mind Abilities Predict Robot's Gaze Effects on Object Preference | 37
Integrating Deep Facial Priors Into Landmarks for Privacy Preserving Multimodal Depression Recognition | 37
Survey of Deep Representation Learning for Speech Emotion Recognition | 37
Lexicon-Based Sentiment Convolutional Neural Networks for Online Review Analysis | 37
Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition | 36
Versatile Audio-Visual Learning for Emotion Recognition | 36
Geometric Graph Representation With Learnable Graph Structure and Adaptive AU Constraint for Micro-Expression Recognition | 36
Emotion Distribution Learning Based on Peripheral Physiological Signals | 36
Boosting Micro-Expression Recognition via Self-Expression Reconstruction and Memory Contrastive Learning | 35
Self-Supervised ECG Representation Learning for Emotion Recognition | 35
Stimulus-Response Pattern: The Core of Robust Cross-Stimulus Facial Depression Recognition | 35
From What You See to What We Smell: Linking Human Emotions to Bio-Markers in Breath | 35
Leveraging the Dynamics of Non-Verbal Behaviors For Social Attitude Modeling | 35
A Spontaneous Driver Emotion Facial Expression (DEFE) Dataset for Intelligent Vehicles: Emotions Triggered by Video-Audio Clips in Driving Scenarios | 35
Affective Touch via Haptic Interfaces: A Sequential Indentation Approach | 35
Fake News, Real Emotions: Emotion Analysis of COVID-19 Infodemic in Weibo | 35
Modeling Multimodal Depression Diagnosis from the Perspective of Local Depressive Representation | 34
I Enjoy Writing and Playing, Do You?: A Personalized and Emotion Grounded Dialogue Agent Using Generative Adversarial Network | 34
Capturing Dynamic Fear Experiences in Naturalistic Contexts: an Ecologically Valid fMRI Signature Integrating Brain Activation and Connectivity | 34
The Pixels and Sounds of Emotion: General-Purpose Representations of Arousal in Games | 33
Exploring Emotion Expression Recognition in Older Adults Interacting With a Virtual Coach | 33
Distant Handshakes: Conveying Social Intentions Through Multi-Modal Soft Haptic Gloves | 33
The Role of Preprocessing for Word Representation Learning in Affective Tasks | 33
Psychophysiological Reactions to Persuasive Messages Deploying Persuasion Principles | 33
CorMulT: A Semi-Supervised Modality Correlation-Aware Multimodal Transformer for Sentiment Analysis | 33
Progressive Multi-Source Domain Adaptation for Personalized Facial Expression Recognition | 33
Does Visual Self-Supervision Improve Learning of Speech Representations for Emotion Recognition? | 33
A Survey of Textual Emotion Recognition and Its Challenges | 31
Towards Emotion-Aware Agents for Improved User Satisfaction and Partner Perception in Negotiation Dialogues | 31
Comparative Analysis of Physiological and Speech Signals for State Anxiety Detection in University Students in STEM | 31
Emotion Recognition From Few-Channel EEG Signals by Integrating Deep Feature Aggregation and Transfer Learning | 31
Automatic Emotion Recognition in Clinical Scenario: A Systematic Review of Methods | 31
Efficient Multimodal Transformer With Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis | 30
Short and Long Range Relation Based Spatio-Temporal Transformer for Micro-Expression Recognition | 30
Behavioral and Physiological Signals-Based Deep Multimodal Approach for Mobile Emotion Recognition | 30
Incorporating Forthcoming Events and Personality Traits in Social Media Based Stress Prediction | 30
Semantic and Emotional Dual Channel for Emotion Recognition in Conversation | 30
An Effective 3D Text Recurrent Voting Generator for Metaverse | 30
SCARE: A Novel Framework to Enhance Chinese Harmful Memes Detection | 29
A Residual Multi-Scale Convolutional Neural Network With Transformers for Speech Emotion Recognition | 29
Emotion Dependent Domain Adaptation for Speech Driven Affective Facial Feature Synthesis | 29
Facial Expression Animation by Landmark Guided Residual Module | 29
Affective Dynamics and Cognition During Game-Based Learning | 28
Mutual Information Based Fusion Model (MIBFM): Mild Depression Recognition Using EEG and Pupil Area Signals | 28
Empathetic Response Generation Through Multi-Modality | 28
Hierarchical Knowledge Stripping for Multimodal Sentiment Analysis | 28
Emotions Like Human: Self-Supervised Emotion Label Augmentation for Emotion Recognition in Conversation | 28
Step-wise Prompting Meets Uncertainty-Aware Dynamic Fusion for Robust EEG-Visual Emotion Recognition | 28
LineConGraphs: Line Conversation Graphs for Effective Emotion Recognition Using Graph Neural Networks | 28
Multi-Modal Sarcasm Detection and Humor Classification in Code-Mixed Conversations | 28
DECEPTIcON: Bridging Gaps in In-the-Wild Deception Research | 28
Social Image–Text Sentiment Classification With Cross-Modal Consistency and Knowledge Distillation | 28
Multi-Order Networks for Action Unit Detection | 28
Learning to Rank Onset-Occurring-Offset Representations for Micro-Expression Recognition | 28
Emotion Intensity and its Control for Emotional Voice Conversion | 28
Unconstrained Facial Expression Recognition With No-Reference De-Elements Learning | 27
Collecting Mementos: A Multimodal Dataset for Context-Sensitive Modeling of Affect and Memory Processing in Responses to Videos | 27
VyaktitvaNirdharan: Multimodal Assessment of Personality and Trait Emotional Intelligence | 27
Recognizing, Fast and Slow: Complex Emotion Recognition With Facial Expression Detection and Remote Physiological Measurement | 27
Effects of Physiological Signals in Different Types of Multimodal Sentiment Estimation | 27
A Multi-Level Alignment and Cross-Modal Unified Semantic Graph Refinement Network for Conversational Emotion Recognition | 26
Interview-based Depression Detection Using LLM-based Text Restatement and Emotion Lexicon | 26
Persuasion-Induced Physiology as Predictor of Persuasion Effectiveness | 26
Detection and Identification of Choking Under Pressure in College Tennis Based Upon Physiological Parameters, Performance Patterns, and Game Statistics | 26
Deep Learning Techniques for Text-based Emotional Response Generation: A Systematic Review | 26
AT2GRU: A Human Emotion Recognition Model With Mitigated Device Heterogeneity | 26
Examining Emotion Perception Agreement in Live Music Performance | 26
Multi-Scale Hyperbolic Contrastive Learning for Cross-Subject EEG Emotion Recognition | 26
Deep Learning for Micro-Expression Recognition: A Survey | 26
Towards Participant-Independent Stress Detection Using Instrumented Peripherals | 26
Mind AI's Mind: A Clinically Aligned Explainable AI Pipeline for Depression Diagnosis via Large Language Models | 25
Multi-Stage Graph Fusion Networks for Major Depressive Disorder Diagnosis | 25
Autonomic Modulations to Cardiac Dynamics in Response to Affective Touch: Differences Between Social Touch and Self-Touch | 25
ENGAGE-DEM: A Model of Engagement of People With Dementia | 25
CSE-GResNet: A Simple and Highly Efficient Network for Facial Expression Recognition | 25
SynSem-ASTE: An Enhanced Multi-Encoder Network for Aspect Sentiment Triplet Extraction With Syntax and Semantics | 25
STREL - Naturalistic Dataset and Methods for Studying Mental Stress and Relaxation Patterns in Critical Leading Roles | 25
MDN: A Deep Maximization-Differentiation Network for Spatio-Temporal Depression Detection | 24
Effective Connectivity Based EEG Revealing the Inhibitory Deficits for Distracting Stimuli in Major Depression Disorders | 24
Affective-ROPTester: Capability and Bias Analysis of LLMs in Predicting Retinopathy of Prematurity | 24
Detecting Mental Disorders in Social Media Through Emotional Patterns - The Case of Anorexia and Depression | 23
Deep Multi-Task Multi-Label CNN for Effective Facial Attribute Classification | 23
Neuro or Symbolic? Fine-Tuned Transformer With Unsupervised LDA Topic Clustering for Text Sentiment Analysis | 23
Conveying Emotions Through Device-Initiated Touch | 23
LibEER: A Comprehensive Benchmark and Algorithm Library for EEG-Based Emotion Recognition | 23
Empathetic Conversational Systems: A Review of Current Advances, Gaps, and Opportunities | 22
Mental Stress Assessment in the Workplace: A Review | 22
Exploring Multivariate Dynamics of Emotions Through Time-Varying Self-Assessed Arousal and Valence Ratings | 22
EEG-Based Emotion Recognition via Channel-Wise Attention and Self Attention | 22
Beyond Overfitting: Doubly Adaptive Dropout for Generalizable AU Detection | 22
Guest Editorial Neurosymbolic AI for Sentiment Analysis | 22
Unsupervised Time-Aware Sampling Network With Deep Reinforcement Learning for EEG-Based Emotion Recognition | 22
AL-HCL: Active Learning and Hierarchical Contrastive Learning for Multimodal Sentiment Analysis with Fusion Guidance | 21
Enhancing EEG-Based Decision-Making Performance Prediction by Maximizing Mutual Information Between Emotion and Decision-Relevant Features | 21
Facial Expression Recognition With Vision Transformer Using Fused Shifted Windows | 21
From Translation to Generative LLMs: Classification of Code-Mixed Affective Tasks | 21
ER-Chat: A Text-to-Text Open-Domain Dialogue Framework for Emotion Regulation | 21
Investigating Cardiovascular Activation of Young Adults in Routine Driving | 21
Towards Multimodal Prediction of Spontaneous Humor: A Novel Dataset and First Results | 20
Aspect-Opinion Correlation Aware and Knowledge-Expansion Few Shot Cross-Domain Sentiment Classification | 20
Enhancing Emotional Congruence in Sensory Substitution | 20
Editorial | 20
EEG-Based Emotion Recognition via Neural Architecture Search | 20
From the Lab to the Wild: Affect Modeling Via Privileged Information | 20
Analyzing the Visual Road Scene for Driver Stress Estimation | 20
A Reinforcement Learning Based Two-Stage Model for Emotion Cause Pair Extraction | 20
Improving Multi-Label Facial Expression Recognition With Consistent and Distinct Attentions | 20
SeeNet: A Soft Emotion Expert and Data Augmentation Method to Enhance Speech Emotion Recognition | 20
Quantitative Personality Predictions From a Brief EEG Recording | 19
Cross-Task and Cross-Participant Classification of Cognitive Load in an Emergency Simulation Game | 19
Investigating the Effects of Sleep Conditions on Emotion Responses with EEG Signals and Eye Movements | 19
Meta-Based Self-Training and Re-Weighting for Aspect-Based Sentiment Analysis | 19
Eye Action Units as Combinations of Discrete Eye Behaviors for Wearable Mental State Analysis | 19
Neurofeedback Training With an Electroencephalogram-Based Brain-Computer Interface Enhances Emotion Regulation | 19
RVISA: Reasoning and Verification for Implicit Sentiment Analysis | 19
Facial Expression Recognition in the Wild Using Multi-Level Features and Attention Mechanisms | 19
Editorial: Special Issue on Unobtrusive Physiological Measurement Methods for Affective Applications | 18
DBATES: Dataset for Discerning Benefits of Audio, Textual, and Facial Expression Features in Competitive Debate Speeches | 18
Graph-Based Facial Affect Analysis: A Review | 18
Aspect-Based Sentiment Analysis with New Target Representation and Dependency Attention | 18
Immersion Measurement in Watching Videos Using Eye-tracking Data | 18
The ForDigitStress Dataset: A Multi-Modal Dataset for Automatic Stress Recognition | 18
Chronic Stress Recognition Based on Time-Slot Analysis of Ambulatory Electrocardiogram and Tri-Axial Acceleration | 18
Modeling Vocal Entrainment in Conversational Speech Using Deep Unsupervised Learning | 18
MPFNet: A Multi-Prior Fusion Network With a Progressive Training Strategy for Micro-Expression Recognition | 18
Quantifying Emotional Similarity in Speech | 18
A Multi-Stage Visual Perception Approach for Image Emotion Analysis | 18
Deep Temporal Analysis for Non-Acted Body Affect Recognition | 18