IEEE Transactions on Affective Computing

Papers
(The median citation count of IEEE Transactions on Affective Computing is 6. The table below lists the papers above that threshold, based on CrossRef citation counts [max. 250 papers], restricted to publications from the past four years, i.e., from 2021-08-01 to 2025-08-01.)
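Below is a minimal sketch of how a list like this could be assembled from the public CrossRef REST API. The journal ISSN, the threshold of 6, the date window, and the 250-paper cap are taken from the note above; the request parameters follow CrossRef's documented conventions but should be read as an illustrative assumption, not the exact procedure used to build this page.

```python
# Sketch: fetch works of one journal from CrossRef, keep those cited more than
# the stated median, sort by citation count, and cap the list at 250 entries.
# Assumptions: ISSN 1949-3045 for IEEE Transactions on Affective Computing,
# threshold and date window as stated in the note above.
import requests

ISSN = "1949-3045"
THRESHOLD = 6                      # journal-wide median citation count, per the note
WINDOW = ("2021-08-01", "2025-08-01")
MAX_PAPERS = 250

def fetch_works():
    """Page through all journal works published inside the date window."""
    url = f"https://api.crossref.org/journals/{ISSN}/works"
    params = {
        "filter": f"from-pub-date:{WINDOW[0]},until-pub-date:{WINDOW[1]}",
        "select": "title,is-referenced-by-count",
        "rows": 1000,
        "cursor": "*",             # CrossRef deep-paging cursor
    }
    while True:
        message = requests.get(url, params=params, timeout=30).json()["message"]
        items = message["items"]
        if not items:
            break
        yield from items
        params["cursor"] = message["next-cursor"]

def top_cited():
    """Keep papers above THRESHOLD, sorted by citations, capped at MAX_PAPERS."""
    rows = [
        (work["title"][0], work["is-referenced-by-count"])
        for work in fetch_works()
        if work.get("title") and work["is-referenced-by-count"] > THRESHOLD
    ]
    rows.sort(key=lambda row: row[1], reverse=True)
    return rows[:MAX_PAPERS]

if __name__ == "__main__":
    for title, citations in top_cited():
        print(f"{title} | {citations}")
```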
Article | Citations
Major Depressive Disorder Detection Using Graph Domain Adaptation With Global Message-Passing Based on EEG Signals | 901
From EEG to Eye Movements: Cross-modal Emotion Recognition Using Constrained Adversarial Network with Dual Attention | 489
Progressive Masking Oriented Self-Taught Learning for Occluded Facial Expression Recognition | 458
A High-Quality Landmarked Infrared Eye Video Dataset (IREye4Task): Eye Behaviors, Insights and Benchmarks for Wearable Mental State Analysis | 318
Mouse-cursor Tracking: Simple Scoring Algorithms That Make It Work | 271
Effects of Computerized Emotional Training on Children with High Functioning Autism | 217
ATTSF-Net: Attention-based Similarity Fusion Network for Audio-Visual Emotion Recognition | 216
Mechanoreceptive Aβ Primary Afferents Discriminate Naturalistic Social Touch Inputs at a Functionally Relevant Time Scale | 214
Sparse Emotion Dictionary and CWT Spectrogram Fusion With Multi-Head Self-Attention for Depression Recognition in Parkinson's Disease Patients | 213
Perceived Conversation Quality in Spontaneous Interactions | 188
Does Gamified Breath-Biofeedback Promote Adherence, Relaxation, and Skill Transfer in the Wild? | 183
Automatic Estimation of Action Unit Intensities and Inference of Emotional Appraisals | 169
Classifying Suicide-Related Content and Emotions on Twitter Using Graph Convolutional Neural Networks | 157
The Acoustically Emotion-Aware Conversational Agent With Speech Emotion Recognition and Empathetic Responses | 153
Towards Contrastive Context-Aware Conversational Emotion Recognition | 147
Fusing of Electroencephalogram and Eye Movement With Group Sparse Canonical Correlation Analysis for Anxiety Detection | 140
Vision Transformer With Attentive Pooling for Robust Facial Expression Recognition | 128
Facial Expression Recognition Using a Temporal Ensemble of Multi-Level Convolutional Neural Networks | 118
Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter | 116
ECPEC: Emotion-Cause Pair Extraction in Conversations | 112
A Classification Framework for Depressive Episode Using R-R Intervals From Smartwatch | 98
An Efficient Framework for Constructing Speech Emotion Corpus Based on Integrated Active Learning Strategies | 97
The Deep Method: Towards Computational Modeling of the Social Emotion Shame Driven by Theory, Introspection, and Social Signals | 97
When is a Haptic Message Like an Inside Joke? Digitally Mediated Emotive Communication Builds on Shared History | 96
Facial Image-Based Automatic Assessment of Equine Pain | 93
Continuous Emotion Ambiguity Prediction: Modeling With Beta Distributions | 90
Enhancing Cross-Dataset EEG Emotion Recognition: A Novel Approach with Emotional EEG Style Transfer Network | 88
Emotion Arousal Assessment Based on Multimodal Physiological Signals for Game Users | 85
Subjective Fear in Virtual Reality: A Linear Mixed-Effects Analysis of Skin Conductance | 83
AM-ConvBLS: Adaptive Manifold Convolutional Broad Learning System for Cross-Session and Cross-Subject Emotion Recognition | 83
Are 3D Face Shapes Expressive Enough for Recognising Continuous Emotions and Action Unit Intensities? | 82
Automatic Detection of Reflective Thinking in Mathematical Problem Solving Based on Unconstrained Bodily Exploration | 82
Hierarchical Multiscale Recurrent Neural Networks for Detecting Suicide Notes | 82
A Multi-Componential Approach to Emotion Recognition and the Effect of Personality | 81
Smart Affect Monitoring With Wearables in the Wild: An Unobtrusive Mood-Aware Emotion Recognition System | 79
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations | 78
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress | 78
EEG Microstates and fNIRS Metrics Reveal the Spatiotemporal Joint Neural Processing Features of Human Emotions | 77
The Recognition of Multiple Anxiety Levels Based on Electroencephalograph | 76
Learning Users Inner Thoughts and Emotion Changes for Social Media Based Suicide Risk Detection | 75
Transformer-Based Self-Supervised Multimodal Representation Learning for Wearable Emotion Recognition | 72
SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals | 71
Review on Psychological Stress Detection Using Biosignals | 70
GHA: A Gated Hierarchical Attention Mechanism for the Detection of Abusive Language in Social Media | 69
Improved Video Emotion Recognition With Alignment of CNN and Human Brain Representations | 68
SMIN: Semi-Supervised Multi-Modal Interaction Network for Conversational Emotion Recognition | 67
Fusion and Discrimination: A Multimodal Graph Contrastive Learning Framework for Multimodal Sarcasm Detection | 67
LGSNet: A Two-Stream Network for Micro- and Macro-Expression Spotting With Background Modeling | 66
Non-Invasive Measurement of Trust in Group Interactions | 65
Leveraging the Deep Learning Paradigm for Continuous Affect Estimation from Facial Expressions | 64
A Micro-Expression Recognition Network Based on Attention Mechanism and Motion Magnification | 64
Towards Cyberbullying Detection: Building, Benchmarking and Longitudinal Analysis of Aggressiveness and Conflicts/Attacks Datasets from Twitter | 62
Effects of Algorithmic Transparency on User Experience and Physiological Responses in Affect-Aware Task Adaptation | 60
An Enroll-to-Verify Approach for Cross-Task Unseen Emotion Class Recognition | 59
Age Against the Machine: How Age Relates to Listeners' Ability to Recognize Emotions in Robots' Semantic-Free Utterances | 58
MECA: Manipulation With Emotional Intensity-Aware Contrastive Learning and Attention-Based Discriminative Learning | 57
Computation of Sensory-Affective Relationships Depending on Material Categories of Pictorial Stimuli | 57
SEED-VII: A Multimodal Dataset of Six Basic Emotions With Continuous Labels for Emotion Recognition | 57
Dynamic Confidence-Aware Multi-Modal Emotion Recognition | 56
TFAGL: A Novel Agent Graph Learning Method Using Time-Frequency EEG For Major Depressive Disorder Detection | 56
Rethinking Emotion Annotations in the Era of Large Language Models | 55
Multi-Party Conversation Modeling for Emotion Recognition | 54
miMamba: EEG-based Emotion Recognition with Multi-scale Inverted Mamba Models | 54
AGILE: Attribute-Guided Identity Independent Learning for Facial Expression Recognition | 51
Annotate Smarter, not Harder: Using Active Learning to Reduce Emotional Annotation Effort | 51
Dynamical Causal Graph Neural Network for EEG Emotion Recognition | 51
Methodology to Assess Quality, Presence, Empathy, Attitude, and Attention in 360-degree Videos for Immersive Communications | 50
Exploring Complexity of Facial Dynamics in Autism Spectrum Disorder | 50
Partial Label Learning for Emotion Recognition from EEG | 49
An Analysis of Physiological and Psychological Responses in Virtual Reality and Flat Screen Gaming | 48
Long Short-Term Memory Network Based Unobtrusive Workload Monitoring With Consumer Grade Smartwatches | 48
Objective Class-Based Micro-Expression Recognition Under Partial Occlusion Via Region-Inspired Relation Reasoning Network | 47
Prediction of Depression Severity Based on the Prosodic and Semantic Features With Bidirectional LSTM and Time Distributed CNN | 47
Guest Editorial: Special Issue on Affective Speech and Language Synthesis, Generation, and Conversion | 47
Nonverbal Leadership in Joint Full-Body Improvisation | 47
Modeling Category Semantic and Sentiment Knowledge for Aspect-Level Sentiment Analysis | 46
Multimodal Deception Detection Using Real-Life Trial Data | 46
Facial Depression Recognition by Deep Joint Label Distribution and Metric Learning | 46
Few-Shot Learning in Emotion Recognition of Spontaneous Speech Using a Siamese Neural Network With Adaptive Sample Pair Formation | 44
Hierarchical Shared Encoder With Task-Specific Transformer Layer Selection for Emotion-Cause Pair Extraction | 44
Multi-Label and Multimodal Classifier for Affective States Recognition in Virtual Rehabilitation | 44
Group Synchrony for Emotion Recognition Using Physiological Signals | 44
Dynamic Micro-Expression Recognition Using Knowledge Distillation | 43
Domain-Incremental Continual Learning for Mitigating Bias in Facial Expression and Action Unit Recognition | 42
A New Perspective on Stress Detection: An Automated Approach for Detecting Eustress and Distress | 42
Improving Cross-Corpus Speech Emotion Recognition with Adversarial Discriminative Domain Generalization (ADDoG) | 42
From Regional to Global Brain: A Novel Hierarchical Spatial-Temporal Neural Network Model for EEG Emotion Recognition | 41
State-Specific and Supraordinal Components of Facial Response to Pain | 41
Deep Adaptation of Adult-Child Facial Expressions by Fusing Landmark Features | 40
Theory of Mind Abilities Predict Robot's Gaze Effects on Object Preference | 40
Aspect-Based Sentiment Quantification | 40
Ordinal Logistic Regression With Partial Proportional Odds for Depression Prediction | 39
Survey of Deep Representation Learning for Speech Emotion Recognition | 39
Self-Supervised ECG Representation Learning for Emotion Recognition | 38
Micro and Macro Facial Expression Recognition Using Advanced Local Motion Patterns | 38
Contradicted by the Brain: Predicting Individual and Group Preferences via Brain-Computer Interfacing | 37
Versatile Audio-Visual Learning for Emotion Recognition | 37
Integrating Deep Facial Priors Into Landmarks for Privacy Preserving Multimodal Depression Recognition | 36
Lexicon-Based Sentiment Convolutional Neural Networks for Online Review Analysis | 36
Datasets of Smartphone Modalities for Depression Assessment: A Scoping Review | 35
Leveraging Social Media for Real-Time Interpretable and Amendable Suicide Risk Prediction With Human-in-The-Loop | 35
Geometric Graph Representation With Learnable Graph Structure and Adaptive AU Constraint for Micro-Expression Recognition | 35
A Psychologically Inspired Fuzzy Cognitive Deep Learning Framework to Predict Crowd Behavior | 35
Public Opinion Crisis Management Via Social Media Mining | 35
Emotion Expression in Human Body Posture and Movement: A Survey on Intelligible Motion Factors, Quantification and Validation | 35
Estimating Affective Taste Experience Using Combined Implicit Behavioral and Neurophysiological Measures | 35
EEG Feature Selection via Global Redundancy Minimization for Emotion Recognition | 35
Boosting Micro-Expression Recognition via Self-Expression Reconstruction and Memory Contrastive Learning | 35
STAA-Net: A Sparse and Transferable Adversarial Attack for Speech Emotion Recognition | 35
EmoTake: Exploring Drivers’ Emotion for Takeover Behavior Prediction | 34
Text-Based Fine-Grained Emotion Prediction | 34
A Spontaneous Driver Emotion Facial Expression (DEFE) Dataset for Intelligent Vehicles: Emotions Triggered by Video-Audio Clips in Driving Scenarios | 34
Hierarchical Encoding and Fusion of Brain Functions for Depression Subtype Classification | 34
FERMixNet: An Occlusion Robust Facial Expression Recognition Model With Facial Mixing Augmentation and Mid-Level Representation Learning | 33
Emotion Distribution Learning Based on Peripheral Physiological Signals | 33
Classification of Interbeat Interval Time-Series Using Attention Entropy | 33
The Rhythm of Flow: Detecting Facial Expressions of Flow Experiences Using CNNs | 33
A Dual-Branch Dynamic Graph Convolution Based Adaptive TransFormer Feature Fusion Network for EEG Emotion Recognition | 32
Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition | 32
Semantic and Emotional Dual Channel for Emotion Recognition in Conversation | 31
Leveraging the Dynamics of Non-Verbal Behaviors For Social Attitude Modeling | 31
I Enjoy Writing and Playing, Do You?: A Personalized and Emotion Grounded Dialogue Agent Using Generative Adversarial Network | 31
Psychophysiological Reactions to Persuasive Messages Deploying Persuasion Principles | 31
From What You See to What We Smell: Linking Human Emotions to Bio-Markers in Breath | 31
Distant Handshakes: Conveying Social Intentions Through Multi-Modal Soft Haptic Gloves | 31
Fake News, Real Emotions: Emotion Analysis of COVID-19 Infodemic in Weibo | 30
The Role of Preprocessing for Word Representation Learning in Affective Tasks | 30
Exploring Emotion Expression Recognition in Older Adults Interacting with a Virtual Coach | 30
CorMulT: a Semi-supervised Modality Correlation-aware Multimodal Transformer for Sentiment Analysis | 30
Affective Touch via Haptic Interfaces: A Sequential Indentation Approach | 30
Emotion Recognition From Few-Channel EEG Signals by Integrating Deep Feature Aggregation and Transfer Learning | 29
Does Visual Self-Supervision Improve Learning of Speech Representations for Emotion Recognition? | 29
The Pixels and Sounds of Emotion: General-Purpose Representations of Arousal in Games | 29
Stimulus-Response Pattern: The Core of Robust Cross-Stimulus Facial Depression Recognition | 29
Efficient Multimodal Transformer With Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis | 28
Towards Emotion-Aware Agents for Improved User Satisfaction and Partner Perception in Negotiation Dialogues | 28
A Survey of Textual Emotion Recognition and Its Challenges | 28
Automatic Emotion Recognition in Clinical Scenario: A Systematic Review of Methods | 28
Behavioral and Physiological Signals-Based Deep Multimodal Approach for Mobile Emotion Recognition | 28
Short and Long Range Relation Based Spatio-Temporal Transformer for Micro-Expression Recognition | 28
LineConGraphs: Line Conversation Graphs for Effective Emotion Recognition using Graph Neural Networks | 27
Multi-Order Networks for Action Unit Detection | 27
Effects of Physiological Signals in Different Types of Multimodal Sentiment Estimation | 27
An Effective 3D Text Recurrent Voting Generator for Metaverse | 27
Deep Facial Action Unit Recognition and Intensity Estimation from Partially Labelled Data | 27
Learning to Rank Onset-Occurring-Offset Representations for Micro-Expression Recognition | 26
A Residual Multi-Scale Convolutional Neural Network With Transformers for Speech Emotion Recognition | 26
Emotion Dependent Domain Adaptation for Speech Driven Affective Facial Feature Synthesis | 26
Facial Expression Animation by Landmark Guided Residual Module | 26
Hierarchical Knowledge Stripping for Multimodal Sentiment Analysis | 26
Collecting Mementos: A Multimodal Dataset for Context-Sensitive Modeling of Affect and Memory Processing in Responses to Videos | 26
SCARE: A Novel Framework to Enhance Chinese Harmful Memes Detection | 25
Deep Learning for Micro-Expression Recognition: A Survey | 25
Mutual Information Based Fusion Model (MIBFM): Mild Depression Recognition Using EEG and Pupil Area Signals | 25
Social Image–Text Sentiment Classification With Cross-Modal Consistency and Knowledge Distillation | 25
Affective Dynamics and Cognition During Game-Based Learning | 25
Emotion Intensity and its Control for Emotional Voice Conversion | 25
Multi-Modal Sarcasm Detection and Humor Classification in Code-Mixed Conversations | 25
DECEPTIcON: Bridging Gaps in In-the-Wild Deception Research | 25
Recognizing, Fast and Slow: Complex Emotion Recognition With Facial Expression Detection and Remote Physiological Measurement | 25
VyaktitvaNirdharan: Multimodal Assessment of Personality and Trait Emotional Intelligence | 25
Incorporating Forthcoming Events and Personality Traits in Social Media Based Stress Prediction | 25
Unconstrained Facial Expression Recognition With No-Reference De-Elements Learning | 25
Neuro or Symbolic? Fine-Tuned Transformer With Unsupervised LDA Topic Clustering for Text Sentiment Analysis | 24
Beyond Overfitting: Doubly Adaptive Dropout for Generalizable AU Detection | 24
Towards Participant-Independent Stress Detection Using Instrumented Peripherals | 24
Multi-Scale Hyperbolic Contrastive Learning for Cross-Subject EEG Emotion Recognition | 24
Persuasion-Induced Physiology as Predictor of Persuasion Effectiveness | 24
A Multi-Level Alignment and Cross-Modal Unified Semantic Graph Refinement Network for Conversational Emotion Recognition | 24
Detection and Identification of Choking Under Pressure in College Tennis Based Upon Physiological Parameters, Performance Patterns, and Game Statistics | 23
Unsupervised Time-Aware Sampling Network With Deep Reinforcement Learning for EEG-Based Emotion Recognition | 23
Examining Emotion Perception Agreement in Live Music Performance | 23
SynSem-ASTE: An Enhanced Multi-Encoder Network for Aspect Sentiment Triplet Extraction With Syntax and Semantics | 23
Deep Multi-Task Multi-Label CNN for Effective Facial Attribute Classification | 22
Detecting Mental Disorders in Social Media Through Emotional Patterns - The Case of Anorexia and Depression | 22
Conveying Emotions Through Device-Initiated Touch | 22
Effective Connectivity Based EEG Revealing the Inhibitory Deficits for Distracting Stimuli in Major Depression Disorders | 22
Multi-Stage Graph Fusion Networks for Major Depressive Disorder Diagnosis | 21
ENGAGE-DEM: A Model of Engagement of People With Dementia | 21
Mental Stress Assessment in the Workplace: A Review | 21
Improving Multi-Label Facial Expression Recognition With Consistent and Distinct Attentions | 21
CSE-GResNet: A Simple and Highly Efficient Network for Facial Expression Recognition | 21
Empathetic Conversational Systems: A Review of Current Advances, Gaps, and Opportunities | 21
EEG-Based Emotion Recognition via Channel-Wise Attention and Self Attention | 21
Guest Editorial Neurosymbolic AI for Sentiment Analysis | 21
Analyzing the Visual Road Scene for Driver Stress Estimation | 21
Autonomic modulations to cardiac dynamics in response to affective touch: Differences between social touch and self-touch | 21
AT2GRU: A Human Emotion Recognition Model With Mitigated Device Heterogeneity | 21
MDN: A Deep Maximization-Differentiation Network for Spatio-Temporal Depression Detection | 21
Facial Expression Recognition With Vision Transformer Using Fused Shifted Windows | 21
Exploring Multivariate Dynamics of Emotions Through Time-Varying Self-Assessed Arousal and Valence Ratings | 21
Meta-Based Self-Training and Re-Weighting for Aspect-Based Sentiment Analysis | 20
Neurofeedback Training With an Electroencephalogram-Based Brain-Computer Interface Enhances Emotion Regulation | 20
SeeNet: A Soft Emotion Expert and Data Augmentation Method to Enhance Speech Emotion Recognition | 20
Enhancing EEG-Based Decision-Making Performance Prediction by Maximizing Mutual Information Between Emotion and Decision-Relevant Features | 20
Aspect-Opinion Correlation Aware and Knowledge-Expansion Few Shot Cross-Domain Sentiment Classification | 20
From Translation to Generative LLMs: Classification of Code-Mixed Affective Tasks | 20
From the Lab to the Wild: Affect Modeling Via Privileged Information | 19
A Reinforcement Learning Based Two-Stage Model for Emotion Cause Pair Extraction | 19
Editorial | 19
ER-Chat: A Text-to-Text Open-Domain Dialogue Framework for Emotion Regulation | 19
Investigating Cardiovascular Activation of Young Adults in Routine Driving | 19
Towards Multimodal Prediction of Spontaneous Humor: A Novel Dataset and First Results | 19
Quantitative Personality Predictions From a Brief EEG Recording | 19
Cross-Task and Cross-Participant Classification of Cognitive Load in an Emergency Simulation Game | 19
Investigating the Effects of Sleep Conditions on Emotion Responses With EEG Signals and Eye Movements | 19
RVISA: Reasoning and Verification for Implicit Sentiment Analysis | 18
Immersion Measurement in Watching Videos Using Eye-tracking Data | 18
EmoNet: A Transfer Learning Framework for Multi-Corpus Speech Emotion Recognition | 18
Facial Expression Recognition in the Wild Using Multi-Level Features and Attention Mechanisms | 18
EEG-Based Emotion Recognition via Neural Architecture Search | 18
Quantifying Emotional Similarity in Speech | 18
Editorial: Special Issue on Unobtrusive Physiological Measurement Methods for Affective Applications | 18
Avatar-Based Feedback in Job Interview Training Impacts Action Identities and Anxiety | 18
Analyzing Continuous-Time and Sentence-Level Annotations for Speech Emotion Recognition | 18
An Emotion Recognition Method for Game Evaluation Based on Electroencephalogram | 18
Can Large Language Models Assess Personality From Asynchronous Video Interviews? A Comprehensive Evaluation of Validity, Reliability, Fairness, and Rating Patterns | 17
Modeling Vocal Entrainment in Conversational Speech Using Deep Unsupervised Learning | 17
Unsupervised Multimodal Learning for Dependency-Free Personality Recognition | 17
Disentangled Variational Autoencoder for Emotion Recognition in Conversations | 17
A Media-Guided Attentive Graphical Network for Personality Recognition Using Physiology | 17
DBATES: Dataset for Discerning Benefits of Audio, Textual, and Facial Expression Features in Competitive Debate Speeches | 16
Conformal Depression Prediction | 16
Aspect-Based Sentiment Analysis with New Target Representation and Dependency Attention | 16
A Multi-Stage Visual Perception Approach for Image Emotion Analysis | 16
The ForDigitStress Dataset: A Multi-Modal Dataset for Automatic Stress Recognition | 16
Towards A Robust Group-level Emotion Recognition via Uncertainty-Aware Learning | 16
Chronic Stress Recognition Based on Time-Slot Analysis of Ambulatory Electrocardiogram and Tri-Axial Acceleration | 16
Encoding Syntactic Information into Transformers for Aspect-Based Sentiment Triplet Extraction | 16
A Multimodal Non-Intrusive Stress Monitoring From the Pleasure-Arousal Emotional Dimensions | 16
Deep Temporal Analysis for Non-Acted Body Affect Recognition | 16
Multimodal Hierarchical Attention Neural Network: Looking for Candidates Behaviour Which Impact Recruiter's Decision | 16
From Extraction to Generation: Multimodal Emotion-Cause Pair Generation in Conversations | 16
Eye Action Units as Combinations of Discrete Eye Behaviors for Wearable Mental State Analysis | 16
Context-Aware Dynamic Word Embeddings for Aspect Term Extraction | 16
Graph-Based Facial Affect Analysis: A Review | 15
Silicon Coppélia and the Formalization of the Affective Process | 15
Deep Siamese Neural Networks for Facial Expression Recognition in the Wild | 15
Fine-Grained Interpretability for EEG Emotion Recognition: Concat-Aided Grad-CAM and Systematic Brain Functional Network | 15
Machiavellian Robots and Their Theory of Mind | 15
Beyond Mobile Apps: A Survey of Technologies for Mental Well-Being | 15
Human Emotion Recognition With Relational Region-Level Analysis | 15
EEG-Based Subject-Independent Emotion Recognition Using Gated Recurrent Unit and Minimum Class Confusion | 15
Tracking Dynamic Flow: Decoding Flow Fluctuations Through Performance in a Fine Motor Control Task | 14
Facial Action Unit Detection Using Attention and Relation Learning | 14
Altered Brain Dynamics and Their Ability for Major Depression Detection Using EEG Microstates Analysis | 14
Integrating Visual Context Into Language Models for Situated Social Conversation Starters | 14
A Systematic Review of Experimental Protocols: Towards a Uniform Framework in Virtual Reality Affective Research | 14
Mobile Virtual Assistant for Multi-Modal Depression-Level Stratification | 14
Modeling Multiple Temporal Scales of Full-Body Movements for Emotion Classification | 14