Exploring Emotion Processing in the Human Brain through Positive and Negative Affect-inducing GIFs

1 Department of Clinical Psychology, Central Institute of Mental Health, Medical Faculty Mannheim, University of Heidelberg, Mannheim, Germany
2 Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Medical Faculty Mannheim, University of Heidelberg, Mannheim, Germany
3 Department of Addiction Behaviour and Addiction Medicine, Central Institute of Mental Health, Medical Faculty Mannheim, University of Heidelberg, Mannheim, Germany
4 Institute of Psychology, Faculty of Behavioural and Cultural Studies, University of Heidelberg, Heidelberg, Germany
5 Heidelberg Academy of Sciences and Humanities, Heidelberg, Germany
6 Institute of Cognitive Science, Osnabrück University, Osnabrück, Germany
Psychology & Brain 2024


We aim to establish a machine learning algorithm capable of
classifying positive and negative affect markers in human brain signals

Poster Abstract

Emotions are thought to influence activity in human brain areas that control decisions, direct attention, and motivate behavior in the world around us. Indeed, affective neuroscience has long attempted to explore the underlying mechanisms of emotion processing, which are interlocked with perception, cognition, motivation, and action in the brain.[1] However, the organization of the anatomical and functional neural networks that together form our emotion-processing architecture is yet to be fully elucidated.
This study aimed to investigate this question by measuring the positive and negative affect elicited by a set of short video clips (GIFs), previously validated and classified by thousands of participants into 27 distinct emotion categories[2], using magnetoencephalography (MEG). Healthy participants are placed in the MEG scanner and perform a behavioral psychophysics task in which a total of 144 positive or negative affect-inducing GIFs are shown in randomized order and subsequently rated in each trial on 5-point scales of valence and arousal. As a next step (currently in progress), the obtained MEG signals will be analyzed with machine learning algorithms to extract neural markers of positive and negative affect processing.
These preliminary findings could improve our understanding of the neural networks involved in emotion processing and facilitate the development of novel translational approaches to affective disorders such as major depression and bipolar disorder, as well as methods to detect affect processing in the absence of behavioral input, for example during sleep or resting-state memory consolidation.

Experiment





An example of one experimental trial

In each trial, a red fixation cross first appears at the center of one of four quadrants on the screen to draw the attention of the participant to that part of the screen.
One of the 144 GIFs then plays, and the participant pushes a button on a controller to indicate the moment they felt an emotion.
The participant subsequently rates how positive or negative the emotion they felt was on a 5-point Valence scale, and how calm or excited the emotion they felt was on a 5-point Arousal scale.
Finally, the participant performs a flanker task in which four arrows appear at four possible locations (north, south, east, or west) in randomized order, and they press buttons on the controller to indicate the direction each arrow points.
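For illustration, the randomized trial structure described above can be sketched in Python. This is a hypothetical sketch: the phase names, the seeding, and the exact randomization scheme are assumptions for illustration, not taken from the study.

```python
import random

N_GIFS = 144
QUADRANTS = ["top_left", "top_right", "bottom_left", "bottom_right"]
# per-trial phases, in order (timings omitted; names assumed for illustration)
PHASES = ["fixation_cross", "gif_playback", "valence_rating",
          "arousal_rating", "flanker_task"]

def build_trial_schedule(seed=0):
    """Return all 144 trials in randomized order, each with a random quadrant."""
    rng = random.Random(seed)
    gif_ids = list(range(N_GIFS))
    rng.shuffle(gif_ids)
    return [{"gif": gif_id, "quadrant": rng.choice(QUADRANTS), "phases": PHASES}
            for gif_id in gif_ids]

schedule = build_trial_schedule(seed=1)
```

Each GIF appears exactly once, so the schedule doubles as a check that no stimulus is shown twice.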

Data Analysis


Preprocessing


The MEG data went through the following preprocessing steps:
  • Highpass, lowpass, and notch filtering
  • Event extraction and epoching
  • Automatic rejection of bad epochs
  • ICA with rejection of ECG- and EOG-related components

Additionally, as recent literature suggests that minimal preprocessing may suffice[3], a second version of the data was prepared with only high-pass-filtered epochs. Despite a wider confidence interval for the minimally preprocessed data, no difference in classification accuracy was found.
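The filtering and epoching steps can be sketched with scipy on synthetic data standing in for real MEG recordings. The cutoff frequencies, sampling rate, and epoch window below are illustrative assumptions; ICA and automated epoch rejection are typically handled by dedicated packages (e.g. MNE-Python and autoreject) and are omitted here.

```python
import numpy as np
from scipy import signal

def filter_data(data, sfreq, l_freq=1.0, h_freq=40.0, notch=50.0):
    """High-/low-pass (band-pass) and notch-filter data of shape (n_channels, n_samples)."""
    sos = signal.butter(4, [l_freq, h_freq], btype="bandpass", fs=sfreq, output="sos")
    b, a = signal.iirnotch(notch, Q=30.0, fs=sfreq)
    out = signal.sosfiltfilt(sos, data, axis=-1)   # zero-phase band-pass
    return signal.filtfilt(b, a, out, axis=-1)     # zero-phase notch

def epoch_data(data, events, sfreq, tmin=-0.2, tmax=0.8):
    """Cut continuous data into epochs around event sample indices."""
    start, stop = int(tmin * sfreq), int(tmax * sfreq)
    return np.stack([data[:, ev + start: ev + stop] for ev in events])

# synthetic continuous "recording": 10 Hz signal + 50 Hz line noise + DC drift
sfreq = 250.0
t = np.arange(int(10 * sfreq)) / sfreq
raw = (np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t) + 2.0)[np.newaxis, :]

filtered = filter_data(raw, sfreq)
events = [500, 1000, 1500]          # fake event sample indices
epochs = epoch_data(filtered, events, sfreq)
```

The high-pass removes the slow drift, the notch suppresses the line-noise component, and the epoching yields one array per event ready for classification.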


Classification steps


First, we establish a classification pipeline. To do so, we use an easy-to-test target, since decoding emotions would introduce more possible error sources. Here, we decode in which of the four screen quadrants the GIF was shown. As machine learning classifiers we use logistic regression and random forest, which yield similar results. The figure shows at which time points the GIF's position can be classified accurately.
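The time-resolved decoding idea can be illustrated with scikit-learn on synthetic epochs. This is a sketch rather than the study's actual pipeline: the epoch count, channel number, and the injected quadrant-dependent effect are all made up for demonstration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_epochs, n_channels, n_times = 200, 30, 50
X = rng.normal(size=(n_epochs, n_channels, n_times))
y = rng.integers(0, 4, size=n_epochs)        # quadrant label per epoch

# inject a quadrant-dependent signal from time point 20 onward
X[np.arange(n_epochs), y, 20:] += 2.0

# fit one classifier per time point to see WHEN position becomes decodable
scores = np.array([
    cross_val_score(LogisticRegression(max_iter=1000), X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])
```

Accuracy hovers around the 25% chance level before the injected effect and rises sharply afterwards; with real data, a random forest can be swapped in via `sklearn.ensemble.RandomForestClassifier`.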
Next, different features of the MEG signal were extracted to see how much they contribute to correct classification. When extracting the delta, theta, alpha, and beta frequency bands, classification based solely on these bands yields a peak accuracy of 33%.
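Band-power features of this kind can be computed with a Welch periodogram; a sketch on synthetic data follows. The band boundaries use common conventions, and the classifier-ready feature layout (one power value per channel and band) is an assumption rather than the study's exact feature set.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(epochs, sfreq):
    """epochs: (n_epochs, n_channels, n_times) -> (n_epochs, n_channels * n_bands)."""
    freqs, psd = welch(epochs, fs=sfreq, nperseg=min(256, epochs.shape[-1]), axis=-1)
    feats = [psd[..., (freqs >= lo) & (freqs < hi)].mean(axis=-1)
             for lo, hi in BANDS.values()]
    return np.concatenate(feats, axis=-1)

# synthetic epochs dominated by a 10 Hz (alpha-band) oscillation
sfreq = 250.0
t = np.arange(250) / sfreq
rng = np.random.default_rng(0)
epochs = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.normal(size=(20, 3, 250))

features = band_power_features(epochs, sfreq)   # shape (20 epochs, 3 channels x 4 bands)
```

Stacking the per-band powers into one flat vector per epoch makes them directly usable as input to the classifiers above.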
Once the pipeline has been tested and established, the target will be changed to affect-related variables such as valence and arousal. Additionally, different features will be tested to assess how much they contribute.

Steps the data goes through, from raw recording to classification



Discussion

In this project, we aim to find a classifier that can successfully decode emotions. The preliminary findings could improve our understanding of the neural networks involved in human emotion processing. This would open up opportunities to advance novel translational approaches against increasingly relevant affective disorders such as major depression and bipolar disorder. At the same time, it could support the development of methods that detect affect processing in the absence of behavioral input, for example during sleep or during resting-state memory consolidation.


Poster

Literature


[1] Brosch, T., Scherer, K., Grandjean, D., & Sander, D. (2013). The impact of emotion on perception, attention, memory, and decision-making. Swiss Medical Weekly, 143(1920), w13786. https://doi.org/10.4414/smw.2013.13786
[2] Cowen, A. S., & Keltner, D. (2017). Self-report captures 27 distinct categories of emotion bridged by continuous gradients. Proceedings of the National Academy of Sciences, 114(38), E7900 – E7909. https://doi.org/10.1073/pnas.1702247114
[3] Delorme, A. (2023). EEG is better left alone. Scientific Reports, 13(1). https://doi.org/10.1038/s41598-023-27528-0