Perspective-taking is associated with increased discriminability of affective states in the ventromedial prefrontal cortex

Recent work using multivariate-pattern analysis (MVPA) on functional magnetic resonance imaging (fMRI) data has found that distinct affective states produce correspondingly distinct patterns of neural activity in the cerebral cortex. However, it is unclear whether individual differences in the distinctiveness of neural patterns evoked by affective stimuli underlie empathic abilities such as perspective-taking (PT). Accordingly, we examined whether we could predict PT tendency from the classification of blood-oxygen-level-dependent (BOLD) fMRI activation patterns while participants (n=57) imagined themselves in affectively charged scenarios. We used an MVPA searchlight analysis to map where in the brain activity patterns permitted the classification of four affective states: happiness, sadness, fear and disgust. Classification accuracy was significantly above chance levels in most of the prefrontal cortex and in the posterior medial cortices. Furthermore, participants' self-reported PT was positively associated with classification accuracy in the ventromedial prefrontal cortex and insula. This finding has implications for understanding affective processing in the prefrontal cortex and for interpreting the cognitive significance of classifiable affective brain states. Our multivariate approach suggests that PT ability may rely on the grain of internally simulated affective representations rather than simply their global strength.

Key words: emotion; perspective-taking; multivariate-pattern analysis; ventromedial prefrontal cortex

Introduction

Contemporary neuroscience research has highlighted the complex relationship between neural activity and affective states. We define 'affective states' to include both 'emotions', which in our view comprise physiological and motoric responses to stimuli in the environment that are relevant to the homeostatic welfare of the organism, and 'feelings', the conscious perceptions of emotion-related changes in the body. Affective states appear to involve interactions between cortical and sub-cortical regions, as well as with the viscera (for review: Kober et al., 2008; Critchley, 2009; Tettamanti et al., 2012; Damasio and Carvalho, 2013; Smith and Lane, 2015; Vaccaro et al., 2020).

Recently, our understanding of emotion has progressed from considering solely the experience and mechanisms of individual experience, understanding that the consideration of others' minds plays a large role in one's own affective experience (for review: Decety and Meltzoff, 2011; Christov-Moore and Iacoboni, 2016; Lamm et al., 2016). While the core mechanisms of affect were once viewed and researched as a primarily private, intraindividual process, there is a growing consensus that the development of affect is inescapably linked to sociality: this places empathetic processes as more core to individual affect than they were originally considered (for review: Parkinson and Manstead, 2015; Fotopoulou and Tsakiris, 2017; Dukes et al., 2021).

Empathy is a multifaceted construct combining cognitive processes that allow us to understand the internal states of others and affective processes that allow us to share in the internal states of others. These include aversive reactions to others' distress [personal distress (PD)], concern for others' welfare [empathic concern (EC)], feeling and understanding the experiences of hypothetical or absent others (fantasizing) and taking others' perspectives [perspective-taking (PT)] (Davis, 1983). Research has found that systems involved in understanding one's own emotions are also involved in understanding the affective states of others (Ochsner et al., 2004). Empathizing with another's feelings recruits affective brain regions involved in representing one's own affective state (Singer et al., 2004; Lamm et al., 2016) and there is evidence that impaired affective experience (as in psychopathy) may limit empathic abilities (Blair et al., 2002). For example, participants administered an analgesic were impaired in their ability to recognize and respond to others' pain (Mischkowski et al., 2019). Furthermore, it has been found that placebo analgesia reduces both pain and empathy for pain (Rütgen et al., 2015), as well as both unpleasant touch and empathy for it through modulation of the insular cortex (Rütgen et al., 2021).

Received: 4 March 2021; Revised: 5 April 2022; Accepted: 16 May 2022
© The Author(s) 2022. Published by Oxford University Press. This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial License (https://creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited. For commercial re-use, please contact journals.permissions@oup.com

A. G. Vaccaro et al. 1083

One feature of affective experience that is relevant to both empathizing with others and representing one's own state is the extent of differentiation among affective states. Previous studies have shown that individuals who are more successful at judging the affective states of others experience more differentiated categories of affect (Erbas et al., 2016; Israelashvili et al., 2019).
Having, and being able to simulate, affective states that are categorically discernable may facilitate skills such as mentalizing, empathy and PT due to the perceived increase in clarity of what one is feeling and therefore what that feeling means functionally (Hill and Updegraff, 2012; Eckland et al., 2018; Thompson and Boden, 2019). Therefore, we hypothesize that increased neural differentiation of affective states may be associated with greater empathy.

Recent studies using multivariate-pattern analysis (MVPA) find that distinct affective states may be associated with specific patterns of neural activity within a network of brain regions (Scarantino, 2012; Celeghin et al., 2017; Nummenmaa and Saarimaki, 2019). MVPA studies have demonstrated that discrete, induced affective states can be accurately distinguished from each other (i.e. classified) using patterns of BOLD activation in functional magnetic resonance imaging (fMRI) data (Kassam et al., 2013; Kragel and LaBar, 2015; Saarimaki et al., 2016; Zhou et al., 2021). The most commonly studied of these states in MVPA studies are sadness, disgust, fear, happiness and, to a lesser extent, anger, all classically considered 'basic emotions' (Saarimaki et al., 2016; Celeghin et al., 2017). However, other more subtle affective states have also been studied, such as shame, envy, contempt, pride, guilt and longing (Kassam et al., 2013; Kragel and LaBar, 2015; Saarimaki et al., 2018), although these may be less easily classified than their more 'basic' cousins (Saarimaki et al., 2018).

Cortical regions found to contribute most to classification accuracy in MVPA studies tend to be consistent with those found in univariate analyses of affective processing. These regions include the medial prefrontal cortex (mPFC), inferior frontal gyrus, posterior medial cortex, insula and amygdalae (Peelen et al., 2010; Kim et al., 2015; Saarimaki et al., 2016, 2018; Sachs et al., 2018). A key region involved in both judging another's affective state and representing one's own affective state is the mPFC (Seitz et al., 2006). Specifically, the ventral areas of mPFC have been shown to play a selective role in affective PT as compared to cognitive PT (or general theory of mind; Hynes et al., 2006; Corradi-Dell'Acqua et al., 2014; Healey and Grossman, 2018).

Interestingly, MVPA results have further suggested that some of the mechanisms involved in representing one's own affective state overlap with the mechanisms for empathy. The insula has been shown to have shared neural representations for pain and empathy for pain (Zhou et al., 2020).

Empathy is often an implicit part of paradigms used to study emotion differentiation. In order to experimentally invoke affective states inside the fMRI scanner, it is common to present subjects with affect-provoking stimuli and also to engage subjects in voluntary mental simulation. For instance, Saarimaki et al. (2018) used narratives that describe the lead-up to an emotional event along with a guided imagery technique to evoke 14 different affective states.
It can be difficult or impractical to design stimuli that effectively induce genuine affect. Tasks often involve explicitly asking participants to imagine themselves in emotional scenarios based on visual or audio imagery. Paradigms such as this require participants to access their concepts of emotion. Interestingly, this naturally creates individual variability where some individuals can easily generate strong feelings from retrieving emotional concepts while others cannot. This difference in the ability or motivation to deliberately simulate affective states from concepts is similar to what has been proposed in the somatic marker hypothesis for the vmPFC, generating feelings 'as-if' one is in a scenario (Damasio, 1996). It is possible that this individual variability presents itself in the distinctiveness of the neural states evoked in response to different cues. In line with previous work on the overlap between empathy and the representation of one's own affective states, a trait-level measure of empathy may be associated with these differences.

In our study, participants underwent an affect induction paradigm in which they viewed pictures of situations invoking fear, happiness, sadness and disgust, alongside captions describing the scenario from a first-person perspective. It is likely that this type of paradigm evokes both emotions and feelings; thus, it allows us to investigate the correlates of affective states as a whole but not to differentiate between neural patterns for emotion and for feeling. We ran two sets of MVPA analyses on the evoked neural data to investigate the classification accuracy of emotions from patterns of fMRI activity. In the first, we examined which regions' activation was most informative for classifying the four evoked affective states. We hypothesized that the mPFC would have the highest classification accuracy. In the second analysis, we attempted to predict individual differences in empathic ability from the classification accuracy of participants' patterns of neural activation during emotion induction. We hypothesized that individual differences in empathic ability would be reflected in the distinctiveness of neural patterns of activation evoked by different emotions. Given the mPFC's prominence in MVPA studies of emotions, as well as its role in affective PT, we hypothesized that classification accuracy for emotions in this region would show the highest correspondence with empathic abilities. Note that our predictions concerned empathic ability in general; since we did not hypothesize which specific components of empathy would correlate with the distinctiveness of neural patterns, our analysis was exploratory with respect to the empathy sub-scales.

Methods

Healthy adult participants (n=57) were recruited as part of two different studies, all recruited through flyers from the University of Southern California and surrounding Los Angeles area. Thirty-six participants' data (18 female, age=24.21±8.68, range=18–52) were collected in the first study. In a second study, 21 more participants' data (11 female, age=22.67±6.45, range=18–42) were collected. Since the two studies used the same experimental paradigm and stimuli, with slight differences detailed below, the data were combined to increase statistical power. All participants were right-handed, had normal or corrected-to-normal vision and no history of neurological or psychiatric conditions. All participants gave informed consent in accordance with the institutional review board approval guidelines approved by the University of Southern California. Because the second study involved a more comprehensive battery of additional behavioral measures not used in the analyses of this paper, behavioral data for those participants were collected on a different day. For this reason, 2 of the 21 participants had not provided behavioral data before the university shutdown due to coronavirus disease (COVID-19), leaving us with 19 participants from this second study for analyses relating to the empathy measures (9 female, age=22.32±6.14, range=18–42): a total of 55 participants between the two studies for this second analysis.

1084 Social Cognitive and Affective Neuroscience, 2022, Vol. 17, No. 12

Interpersonal reactivity index

Behavioral measures of empathy were acquired through the Interpersonal Reactivity Index (Davis, 1983). This self-report measure consists of four seven-item sub-scales: (i) PT: the ability of the participant to take on the point of view of another individual (for example: 'Before criticizing somebody, I try to imagine how I would feel if I were in their place'), (ii) fantasy (FS): the tendency of the participant to identify with fictitious characters (for example: 'I really get involved with the feelings of characters in a novel'), (iii) EC: the presence of the participant's feeling of compassion or concern for others (for example: 'I am often quite touched by things I see happen') and (iv) PD: the presence of the participant's feeling of discomfort or anxiety for others (for example: 'When I see someone who badly needs help in an emergency, I go to pieces'; Davis, 1983). For each participant, each sub-scale score was assessed separately, resulting in four distinct scores per participant.
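The per-sub-scale scoring just described can be sketched as below. The item-to-sub-scale assignment in this sketch is a placeholder for illustration only; the published Davis (1983) scoring key (including its reverse-scored items) is not reproduced here.

```python
# Sketch of IRI sub-scale scoring: four seven-item sub-scales, each scored
# separately per participant. The item groupings below are illustrative
# placeholders, NOT the published Davis (1983) scoring key.

ILLUSTRATIVE_KEY = {
    "PT": [1, 2, 3, 4, 5, 6, 7],
    "FS": [8, 9, 10, 11, 12, 13, 14],
    "EC": [15, 16, 17, 18, 19, 20, 21],
    "PD": [22, 23, 24, 25, 26, 27, 28],
}

def score_iri(responses, key=ILLUSTRATIVE_KEY):
    """responses: dict mapping item number (1-28) to a 0-4 Likert rating.
    Returns one summed score per sub-scale."""
    scores = {}
    for subscale, items in key.items():
        if any(item not in responses for item in items):
            raise ValueError(f"missing responses for sub-scale {subscale}")
        scores[subscale] = sum(responses[item] for item in items)
    return scores

# Example: a participant answering '2' to every item gets 14 per sub-scale.
uniform = {item: 2 for item in range(1, 29)}
print(score_iri(uniform))  # {'PT': 14, 'FS': 14, 'EC': 14, 'PD': 14}
```

The four resulting scores are what the later correlation analysis treats as separate trait regressors.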
Stimuli

Stimuli were presented as one photo in the center of the screen with anecdotal descriptive text underneath each photo. Photos were first gathered from a subset of images in the International Affective Picture System (IAPS; Bradley and Lang, 2007) covering the affective categories of happiness, fear, sadness and disgust. Text sentences from the Affective Norms for English Text (ANET; Bradley and Lang, 2007) were also chosen in these four categories. Stimuli captions are written in the second person, telling the subject what they were experiencing (example: 'As you leave the concert, a drunk vomits all over your jacket, soaking it.'). Pictures from the IAPS were then matched with a corresponding piece of text from the ANET that described a situation associated with the picture. For example, a picture of a snarling dog was combined with the caption 'The dog strains forward, snarling and suddenly leaps out at you' (see Supplementary Table S1 for examples).

For pictures that did not have appropriately matching text from ANET or text that did not have appropriately matching images from the IAPS, text/images were written to fit or acquired from the web.
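The category-matched pairing constraint described above can be sketched as a small data structure. The stimulus IDs and captions below are invented placeholders, not actual IAPS or ANET materials.

```python
# Sketch of the photo-caption pairing constraint: each photo is matched with
# a caption from the same affect category. IDs and captions are invented
# placeholders, not actual IAPS/ANET items.

PHOTOS = [
    {"photo_id": "dog_01", "category": "fear"},
    {"photo_id": "party_01", "category": "happy"},
]
CAPTIONS = [
    {"caption_id": "c1", "category": "fear",
     "text": "The dog strains forward, snarling and suddenly leaps out at you."},
    {"caption_id": "c2", "category": "happy",
     "text": "Your friends cheer as you walk into the surprise party."},
]

def pair_stimuli(photos, captions):
    """Greedily pair each photo with an unused caption of the same category."""
    unused = list(captions)
    pairs = []
    for photo in photos:
        match = next(c for c in unused if c["category"] == photo["category"])
        unused.remove(match)
        pairs.append((photo["photo_id"], match["caption_id"], photo["category"]))
    return pairs

pairs = pair_stimuli(PHOTOS, CAPTIONS)
# Every pair shares a single affect category by construction.
assert all(cat in {"fear", "happy", "sad", "disgust"} for _, _, cat in pairs)
```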
These new images were rated for valence and arousal by 51 participants in an earlier study (Supplementary Tables S2 and S3). Subjects were also asked to indicate what category of affective state each photo/text combination corresponded to. For every stimulus selected for the study, the expected category (among happy, sad, fear, disgust and neutral) was the most commonly picked category by the subjects (see Supplementary Figure S1). In addition to these emotional stimuli, non-emotional/neutral stimuli were used as a control and a fixation cross was used as a rest. Since our goal was to predict emotional states with MVPA, the analysis of the neutral images is not presented here.

Functional neuroimaging: fMRI

fMRI design

Stimuli were presented in an event-related design using MATLAB's Psychtoolbox. In study 1, 60 stimuli (photo+text) were randomly presented during 4 functional runs (15 stimuli per run, 6 min per run). Each stimulus was presented for 12 s followed by a 12 s fixation cross in between each trial as a 'rest' period. In study 2, 45 stimuli were randomly presented during 3 functional runs. In both studies, participants were instructed to lie still, observe the displayed photograph, read the text and attempt to embody the described emotional situation as strongly as possible for each stimulus.

fMRI data acquisition

All scanning was completed on a 3T Siemens Prisma System Scanner at the USC Dornsife Cognitive Neuroimaging Center using a 32-channel head coil. Anatomical images were acquired with a T1-weighted magnetization-prepared rapid gradient-echo sequence (repetition time [TR]/echo time [TE]=2300/2.26 ms, 1-mm isotropic voxels, flip angle 9°). Functional images were acquired with a T2*-weighted gradient-echo sequence (TR/TE=2000/25 ms, 41 transverse 3-mm slices, flip angle 90°). A T2-weighted volume was acquired for blind review by an independent neuroradiologist, in compliance with the scanning center's policy and local Institutional Review Board guidelines. T2-weighted scans were not analyzed by the researchers for any purpose in this study.

fMRI analysis

Preprocessing and GLM

Data were first processed using the fMRI Expert Analysis Tool (FEAT), FSL's implementation of the General Linear Model (GLM; FMRIB Software Library, Smith et al., 2004), to generate voxel-wise z-statistic maps showing voxels that responded significantly to each emotion type for each participant. Those z-statistic maps were then used for the classification analysis. Data preprocessing was conducted in FSL using brain extraction, slice-time correction, motion correction using FMRIB's linear registration tool, spatial smoothing (5 mm) and high-pass temporal filtering (sigma=50 s). The functional data were registered to each participant's own anatomical image and the anatomical data were registered to the standard MNI brain (Montreal Neurological Institute) using FMRIB's non-linear registration tool (FNIRT; Jenkinson and Smith, 2001). The data were modeled with a separate regressor for each of the four emotions (happy, sad, fear and disgust), one for the neutral condition, the temporal derivatives of all task regressors and six motion parameters to account for residual motion effects. These same smoothed standard-space z-maps were used for both the emotion discrimination analysis and for the individual subject searchlights.

MVPA analysis

Emotion discrimination analysis

All MVPA analyses were conducted using PyMVPA (Hanke et al., 2009). A whole-brain searchlight analysis was conducted to identify regions whose activation patterns allowed us to classify the four emotions across all subjects' data. The input data to the classifier was a single 4D image file combining z-stat maps (normalized to MNI standard space using FNIRT) for each affective state, functional run and participant (resulting in a total of 828 concatenated images: 36 participants × 4 runs × 4 emotions for study 1 combined with 21 participants × 3 runs × 4 emotions for study 2). For every voxel in the brain, a sphere centered on that voxel (radius=5 voxels) was used to train and test a linear support vector machine (SVM) using leave-one-out cross-validation. In other words, in each iteration, the classifier was trained on all the participants' data except one and then tested on the remaining participant's data, leading to 55 cross-validation folds. The resulting average accuracy over all iterations, after leaving each participant out once, was mapped to the center voxel of the sphere, ultimately resulting in a cross-participant map of classification accuracies for every voxel. For the SVM regularization parameter C, we used the default in PyMVPA, which chooses this parameter automatically according to the norm of the data.
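A minimal sketch of this cross-participant searchlight follows. It is not the PyMVPA pipeline: a nearest-centroid classifier stands in for the linear SVM, array sizes are toy-scale, and the leave-one-participant-out loop is written out explicitly.

```python
import numpy as np

# Toy cross-participant searchlight: a nearest-centroid classifier stands in
# for the linear SVM used in the paper; shapes are deliberately tiny.

def sphere_offsets(radius):
    """Integer voxel offsets within a Euclidean sphere of the given radius."""
    r = range(-radius, radius + 1)
    return [(i, j, k) for i in r for j in r for k in r
            if i * i + j * j + k * k <= radius * radius]

def classify_loso(features, labels, groups):
    """Leave-one-group(participant)-out accuracy of a nearest-centroid rule."""
    features, labels, groups = map(np.asarray, (features, labels, groups))
    correct = 0
    for g in np.unique(groups):
        train, test = groups != g, groups == g
        classes = np.unique(labels[train])
        centroids = np.stack([features[train & (labels == c)].mean(axis=0)
                              for c in classes])
        for x, y in zip(features[test], labels[test]):
            pred = classes[np.argmin(((centroids - x) ** 2).sum(axis=1))]
            correct += pred == y
    return correct / len(labels)

def searchlight(volume, labels, groups, radius=1):
    """volume: (nx, ny, nz, n_samples) array of z-maps; returns an accuracy
    map where each voxel holds the accuracy of its surrounding sphere."""
    nx, ny, nz, _ = volume.shape
    offsets = sphere_offsets(radius)
    acc = np.zeros((nx, ny, nz))
    for x in range(nx):
        for y in range(ny):
            for z in range(nz):
                voxels = [(x + i, y + j, z + k) for i, j, k in offsets
                          if 0 <= x + i < nx and 0 <= y + j < ny and 0 <= z + k < nz]
                feats = np.stack([volume[v] for v in voxels], axis=1)
                acc[x, y, z] = classify_loso(feats, labels, groups)
    return acc
```

On synthetic data where only one voxel carries label information, the accuracy map peaks at that voxel and stays near the four-way chance level of 0.25 elsewhere, which is the logic behind interpreting above-chance searchlight accuracy as local affect information.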
Empathy correlation analysis

To correlate the scales of the Interpersonal Reactivity Index (IRI) with individual classification accuracy, we first computed whole-brain searchlights 'within' every individual participant's data. We ran searchlights on each individual subject's data in standard space. For each participant, a sphere (radius=3 voxels) centered on every voxel in the brain was used to iteratively assess classification accuracy throughout the brain. For each sphere, the SVM classifier was trained on all but one of the emotions' functional scanning runs and tested on the left-out run. The resulting accuracy values were mapped to the center voxel of the sphere and the resulting whole-brain voxel-wise accuracy maps were warped into standard space. This created an accuracy map for each participant where a voxel's value represented the classification accuracy of the three-voxel-radius sphere surrounding that voxel. We chose a smaller sphere size than the between-subjects analysis to increase our spatial resolution. Furthermore, these individual subject searchlights required significantly less computing power than the between-subjects analysis, so we were less restricted by computing limitations.

To identify relationships between these individual participant searchlight maps and individual differences in self-reported empathy, we used FSL's Randomise tool (with FWE correction). We created a series of regressors using participants' demeaned scores on each of the IRI's sub-scales (PT, EC, PD and FS). These regressors were then related to the searchlight accuracy values using Randomise's non-parametric permutation testing approach (Winkler et al., 2014). At each voxel, the accuracies across subjects are randomly associated with the regressor values (the empathy sub-scales) and a test statistic is computed. This permutation procedure is repeated 5000 times to generate a null distribution. One of the resulting maps is then a map of 1 minus the P-value for the association between the regressor and the classification accuracy, determined by comparison to the null distribution. These maps were corrected for family-wise error using FSL's threshold-free cluster enhancement algorithm.
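The permutation logic behind this accuracy-trait association can be sketched as below. This is a toy stand-in for FSL Randomise, using a per-voxel Pearson correlation as the test statistic and omitting threshold-free cluster enhancement and FWE correction.

```python
import numpy as np

# Toy Randomise-style inference: at each voxel, the association between
# subjects' searchlight accuracies and a demeaned trait score is compared
# against a null built by permuting the scores. No TFCE or FWE correction.

def permutation_pvals(acc_maps, scores, n_perm=2000, seed=0):
    """acc_maps: (n_subjects, n_voxels) accuracies; scores: (n_subjects,).
    Returns one two-sided permutation P-value per voxel."""
    rng = np.random.default_rng(seed)
    acc = np.asarray(acc_maps, float)
    s = np.asarray(scores, float)
    s = s - s.mean()                      # demeaned regressor, as in the text

    def stat(sc):
        a = acc - acc.mean(axis=0)
        num = a.T @ sc
        den = np.sqrt((a ** 2).sum(axis=0) * (sc ** 2).sum()) + 1e-12
        return num / den                  # Pearson r at every voxel at once

    observed = stat(s)
    exceed = np.zeros(acc.shape[1])
    for _ in range(n_perm):
        exceed += np.abs(stat(rng.permutation(s))) >= np.abs(observed)
    # +1 correction keeps P-values strictly positive (standard for
    # permutation tests).
    return (exceed + 1) / (n_perm + 1)
```

A voxel whose accuracies genuinely track the trait score yields a correlation far outside the permuted null, hence a small P-value; unrelated voxels fall inside the null distribution.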
We restricted our interpretation of the resulting maps to regions that significantly predicted emotions in the group searchlight analysis by masking these maps with the regions found to significantly distinguish affective states at the group level in the initial affect discrimination analysis. Therefore, the final resulting map shows voxels where both (i) classification was above chance across all subjects and (ii) classification varied significantly with the empathy scores.

Statistical thresholding

To determine an appropriate threshold for significance, we employed two parallel methods to correct for multiple comparisons in each of our analyses. For the affect discrimination analysis, we first used permutation testing to create a null distribution of classification accuracies within a simulated searchlight sphere by shuffling affect labels. We ran 151801 permutations of the classification within a single-sample searchlight sphere. This allowed us to create a distribution of classification values that might occur in a given searchlight sphere assuming the null hypothesis that the four affect conditions cannot be distinguished from patterns of activation. To account for multiple comparisons in the correlation analysis, we first performed a resel-wise Bonferroni correction: we determined the total number of five-voxel-radius independent spheres which could fit in the standard brain (∼725) and divided 0.05 by this number to determine our alpha (6.89×10⁻⁵) (Kaplan and Meyer, 2012). Therefore, a classification accuracy for which greater than or equal values appeared in our distribution less than 10 times would be considered significantly above chance. In our simulated null distribution, where there were four emotions and an expected chance accuracy of 0.25, a significant above-chance classification accuracy was determined to be >0.30592. The maximum value found in our null distribution was 0.311 (Supplementary Figure S2). For the second correction method in both analyses, we used a voxel-wise Bonferroni correction (0.05/n tests) and determined the corresponding accuracy value using the binomial distribution. With this method, we established that a Bonferroni-corrected significance of 0.05 would correspond to an accuracy threshold of 0.3815. To enhance the replicability of our findings and isolate the most informative regions, we opted in both analyses to use the more conservative threshold (permutation-corrected results can be found in the appendix: Supplementary Figure S3). This threshold is extremely conservative given that it is based on a full voxel-wise Bonferroni correction and is also greater than any of the classification results in our over 150000 permutations.

In post-hoc analyses, we wanted to determine if our results in the regions that significantly distinguished between the four emotions were driven by one emotion being uniquely distinguishable compared to the others. To do this, we ran pair-wise searchlight analyses classifying each of six possible pairs of emotions. If one specific emotion was driving our findings, only pairs with that one emotion would show a similar spatial pattern of significant classification to our initial four-way searchlight emotion discrimination analysis.
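The binomial side of this thresholding can be sketched as follows: pick the smallest accuracy whose one-sided binomial tail probability, at a chance level of 0.25, stays below the Bonferroni-corrected alpha. The trial count of 200 below is illustrative; the paper's exact per-test trial count is not restated here.

```python
from math import comb

# Sketch of binomial accuracy thresholding under a Bonferroni-corrected
# alpha. The trial count is illustrative, not the paper's exact value.

def binomial_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): exact one-sided tail probability."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

def accuracy_threshold(n_trials, chance, alpha):
    """Smallest accuracy k/n whose tail probability is <= alpha."""
    for k in range(n_trials + 1):
        if binomial_tail(k, n_trials, chance) <= alpha:
            return k / n_trials
    return 1.0

# Dividing 0.05 by ~725 independent searchlight spheres reproduces the
# resel-wise corrected alpha of ~6.89e-5 quoted in the text.
corrected_alpha = 0.05 / 725
threshold = accuracy_threshold(200, 0.25, corrected_alpha)
```

With the sphere count fixed, the only free quantity is the number of classification trials entering the binomial: more trials pull the significant-accuracy threshold closer to chance, which is why the per-test trial count matters when comparing thresholds across studies.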
ues in medial frontal and parietal areas, were mirrored in all This allowed us to create a distribution of classification values the post hoc pair-wise classification searchlights except happy vs that might occur in a given searchlight sphere assuming the sad, which did not reach a significant threshold of 0.68 accuracy null hypothesis that the four affect conditions cannot be dis- (see Supplementary Figures S4–S8). This suggests that our four- tinguished from patterns of activation. To account for multiple way classification results did not result from one emotion being 1086 Social Cognitive and Affective Neuroscience, 2022, Vol. 17, No. 12 often ‘black-box’ like scenarios. We demonstrate that an individ- ual trait may be related to differences in the distinguishability of neural patterns corresponding to affective states. This method provides a new way of exploring and theorizing why neural pat- terns can be distinguished in this way and demonstrates a new avenue of inference for testing how personal traits and cognitive functions might be associated with how different neural states are discernible from each other. In this case, we found that self- reported PT is related to the discernibility of the affective states our participants were asked to imagine. Our study has implica- tions for both the use of MVPA in general and our understanding of how empathetic traits relate to affect. Relating MVPA performance to individual Fig. 1. Brain regions in which the searchlight analysis significantly differences predicted the four affect categories. After Bonferonni correction for It has long been recognized that connecting an MVPA result every voxel in the brain, a corrected P-value of 0.05 corresponded to a directly to some measured aspect of the underlying behavior or classification accuracy threshold of ∼0.38. psychologybeing studiedisimportant (RaizadaandKriegeskorte, 2010; Naselarisetal., 2011). 
Classification accuracy and IRI measures

See Supplementary Table S4 and Supplementary Figure S9 for descriptive statistics and distributions of sub-scales. FS, PD and EC were not significantly related to classification accuracy. PT was significantly related to classification accuracy (P<0.05 with FWE correction) in a large area of vmPFC (x=−15, y=48, z=−7), as well as in bilateral, although predominantly left, insula (x=−39, y=−7, z=7; x=43, y=−5, z=−6; see Figures 2 and 3).

Fig. 2. Brain regions with significant classification accuracy where PT on the Interpersonal Reactivity Index significantly predicted classifier accuracy.

Discussion

MVPA has become a popular method in affective neuroscience over the past decade (Coutanche, 2013; Morawetz et al., 2016; Oosterwijk et al., 2017). While it has been demonstrated repeatedly that neural patterns associated with affective states can be distinguished from each other, the interpretation of what these patterns tell us about categories, and how they come to be, is contentious (for review: Coutanche, 2013; Kragel and LaBar, 2016; Clark-Polner et al., 2017; Gessell et al., 2021). In our study, we used a novel approach to increase our power of inference in these often 'black-box'-like scenarios. We demonstrate that an individual trait may be related to differences in the distinguishability of neural patterns corresponding to affective states. This method provides a new way of exploring and theorizing about why neural patterns can be distinguished in this way and demonstrates a new avenue of inference for testing how personal traits and cognitive functions might be associated with how different neural states are discernible from each other. In this case, we found that self-reported PT is related to the discernibility of the affective states our participants were asked to imagine. Our study has implications for both the use of MVPA in general and our understanding of how empathetic traits relate to affect.

Relating MVPA performance to individual differences

It has long been recognized that connecting an MVPA result directly to some measured aspect of the underlying behavior or psychology being studied is important (Raizada and Kriegeskorte, 2010; Naselaris et al., 2011). Otherwise, successfully decoded neural signals may reflect information that is either not used at all by the brain or is not relevant to aspects of the psychology in question (Kriegeskorte and Douglas, 2019; Ritchie et al., 2019). Nevertheless, it is uncommon for MVPA studies to relate classifier performance to individual differences. Using procedures similar to ours, Coutanche et al. (2011) found that the MVPA classification of visual stimuli correlated with Autism Spectrum Disorder symptom severity across subjects. Studies in the auditory domain have found that MVPA classification accuracy in a musical instrument identification task (Ogg et al., 2019) and a speaker identification task (Bonte et al., 2014; Aglieri et al., 2021) correlate with between-subject differences in accuracy on those tasks. Raizada et al. (2010) found that the performance of a classifier at distinguishing neural patterns related to the perception of different phonemes was related to subjects' performance in discriminating the same phonemes and that the separability of neural patterns during a numerical discrimination task was related to arithmetic performance. Similarly, Meyer et al. (2010) showed that classifier discrimination of imagined auditory stimuli was correlated with the reported vividness of imagination.

Yet, while linking MVPA performance to individual differences can make headway in showing the psychological relevance of classification performance, many of the limitations that exist in other studies attempting to relate individual differences to the neuroimaging analysis still remain. For example, to optimize studying individual differences, the states induced should produce enough variability across individuals while still maintaining high identifiability of stimulus representations (Finn et al., 2017). In terms of identifiability of neural patterns, MVPA tends to be more sensitive than the univariate analysis. However, it also tends to be less sensitive to between-subject variability in mean activation levels (Davis et al., 2014). It remains to be determined whether the variability across people in classifier performance is optimal for the study of individual differences. Furthermore, while the trait relevance of classifier performance is informative, interpretation must always be careful in that such relationships can always be mediated by additional, unmeasured variables.
Vaccaro et al. 1087 Fig. 3. Scatterplots of PT vs classification accuracy in the ventromedial prefrontal cortex and insula. studies: the posterior cingulate, insula, temporal gyrus, medial associated with PT showed little overlap with regions typically prefrontal and ventral medial prefrontal regions have all been implicated in univariate studies of PT: regions such as visual cor- implicated in previous MVPA studies of emotions and affective tex,temporal–parietaljunctionanddorsolateralprefrontalcortex context (Skerry and Saxe, 2014; Saarimaki et al., 2016; Oosterwijk (Decety and Grèzes, 2006; Bukowski, 2018; Healey and Grossman, et al., 2017; Bush et al., 2018; Paquette et al., 2018; Sachs et al., 2018). Accuracy in these areas did not correlate with our PT mea- 2018). sure. Additionally, even though it is conceivable that our visual ThevmPFChaslongbeenimplicatedinimplementingemotion and descriptive prompts would support a largely visual simula- and feeling (for review: Bechara et al., 1994, 1999; Roy et al., 2012; tion of the scene or that improved affective representation might Winecoff et al., 2013). A theoretical link between affect and the result from increased attention, neither primary visual regions vmPFC is provided by the ‘somatic marker hypothesis’ (Damasio, nor parietal and temporal visuospatial regions provided signifi- 1996). ThisviewpositsthatthevmPFCisakeyregionforincorpo- cant classification accuracy. Visual, temporal–parietal and dorsal rating emotion-related signals from the body, consciously expe- prefrontal regions may be important for the general process of rienced as feelings, into our cognitive decision-making process. PT but more related to the general effort involved in perform- According to the SMH, the vmPFC also permits us to bypass pure ing the task rather than its success. 
In a study where individuals bodily input in order to simulate these feelings even without the wereinstructedtotaketheperspectivethatanimageofapainful direct presence of a trigger (Poppa and Bechara, 2018). By this stimulus was occurring to either themselves or another person account, the vmPFC could serve to ‘simulate’ the affective expe- (van der Heiden et al., 2013), the main effects of the different PT rience being evoked (Keysers and Gazzola, 2007; Schacter et al., conditions were evident in regions such as supramarginal gyrus, 2017). Indeed, studies on affective PT vs cognitive PT have impli- temporal gyrus, frontal gyrus and ventrolateral PFC. However, cated the vmPFC (Sebastian et al., 2012; Healey and Grossman, whentheeffectsofgoodvsbadperspective-takerswereanalyzed, 2018). The vmPFC has also been implicated in affective simula- alltheseregionsexceptventrolateralPFCwerenotsignificantpre- tion in studies in which participants imagined affective contexts dictors, and instead the left insula, postcentral gyrus and vmPFC thatcouldhappentotheminthefuture(D’Argembeauetal.,2008) differed between the groups. Perhaps the insula and vmPFC are orthathavehappenedinthepast(Benoitetal.,2016,2019).Corre- specifically important for generating a fine-grained affective sim- spondingly,lesionstovmPFCappeartohamperboththeabilityto ulation, and this granularity is reflected in more distinct neural simulate future affective scenarios and imaginary ones (Bertossi patterns. et al., 2016, 2017). We note also that while the IRI includes items that ask partici- Importantly, we show here that the vmPFC’s role in affec- pantstojudgetheireaseordifficultyatempathizing(‘Isometimes tive PT is related not just to the level of the neural activity find it difficult to see things from the “other guy’s” point of view’.) invoked, but to the distinctiveness of its affect-related neural many of the questions are instead about the ‘tendency’ to take patterns. 
This could suggest that affective PT may relate to the someone else’s perspective (‘I sometimes try to understand my simulation of the affective experience that is more specific to friends better by imagining how things look from their perspec- the target emotion, reflected functionally in more easily clas- tive’.) or the ‘motivation’ to do so (‘I try to look at everybody’s sifiable representations of affective context. If simulated affec- side of a disagreement before I make a decision’.). As such, we tive states are more generalized and overlapping, as opposed to cannot distinguish which aspects of PT are directly related to detailed and nuanced, their associated neural patterns can be increased classifier accuracy. It is possible that participants who expected to be less differentiable. Perceived emotional actions, scorehigherontheIRI-PTaremotivatedtoengageinthetaskwith interoception and affective contexts of other individuals can all moreeffortandthattheincreasedpatternseparationisrelatedto be successfully classified using the vmPFC activity ( Oosterwijk increased effort in the task. Indeed, the IRI-PT is correlated with et al., 2017). Notably, areas where accuracy was significantly the Mind Reading Motivation scale, a measure specifically aimed 1088 Social Cognitive and Affective Neuroscience, 2022, Vol. 17, No. 12 at the motivation to think about other people’s minds (Carpenter Funding et al., 2016). The work was supported by a grant from the Templeton World Charity Foundation to A.D. and J.K. Future directions Our searchlight analysis found large regions of the brain, many Conflict of interest previously implicated in affect-related processing, that signifi- cantly classified affective states. Despite this, only the accuracy The authors declare no conflicts of interest. in the vmPFC and insula was related to PT. 
This leaves the ques- tion of what traits, cognitive processes, or noise factors could potentially explain individual differences in classification accu- Supplementary data racy in other regions. It is possible that other cognitive traits, such as the ability to interpret other’s affective intent or bodily Supplementary data are available at SCAN online. perception, mayexplainhowaccurateotherregionssuchastem- poral gyrus (Wicker et al., 2003) and the parietal lobule (Engelen et al., 2015) are at distinguishing affective states in MVPA stud- References ies. Furthermore, the affective states we asked participants to Aglieri, V., Cagna, B., Velly, L., Takerkart, S., Belin, P. (2021). imagine are among the most common, ‘basic emotions’. These fMRI-based identity classification accuracy in left temporal and affective experiences are most commonly described as categori- frontal regions predicts speaker recognition performance. Scien- calindailylifeandthereforemaybemoreeasilydiscriminated.It tific Reports , 11(1), 489. remainsunclearwhetherthesamemechanisms,PTandaffective Anderson, A., Han, D., Douglas, P.K., Bramen, J., Cohen, M.S. simulation in the vmPFC, would be as accurate in discriminat- (2012). Real-time functional MRI classification of brain states ing mixed feelings or if other neural regions and cognitive pro- using Markov-SVM hybrid models: peering inside the rt-fMRI cesses may be differentially important for non-typical affective black box. In: Langs, G., Rish, I., Grosse-Wentrup, M., Murphy, B., states. (editors). Machine Learning and Interpretation in Neuroimaging. New Ourstudyhighlightstheimportanceofinvestigatingtheneural York: Springer, 242–55. substrate of individual differences in PT overall, especially across Bechara, A., Damasio, A.R., Damasio, H., Anderson, S.W. (1994). 
domains.WhilesomebrainregionsmaybeinvolvedbroadlyinPT Insensitivitytofutureconsequencesfollowingdamagetohuman across participants, univariate approaches may miss more spe- prefrontal cortex. Cognition, 50(1–3), 7–15. cific regions that truly distinguish successful vs unsuccessful PT. Bechara, A., Damasio, H., Damasio, A.R., Lee, G.P. (1999). Dif- Inouraffectivesimulationtask,thesewerethevmPFCandinsula, ferent contributions of the human amygdala and ventromedial althoughtheselikelymaybedifferentforPTtasksinvolvingother prefrontal cortex to decision-making. The Journal of Neuroscience, domains. 19(13), 5473–81. Benoit, R.G., Davies, D.J., Anderson, M.C. (2016). Reducing future fears by suppressing the brain mechanisms underlying episodic Conclusion simulation.ProceedingsoftheNationalAcademyofSciences, 113(52), E8492–501. In our study, we found a relationship between trait PT ability and Benoit, R.G., Paulus, P.C., Schacter, D.L. (2019). Forming attitudes via the classification accuracy of an individual’s vmPFC and insu- neural activity supporting affective episodic simulations. Nature lar activity for distinguishing task-evoked affective states. The Communications, 10(1), 1–11. value of these findings is important because it shows that the Bertossi,E.,Aleo,F.,Braghittoni,D.,Ciaramelli,E.(2016).Stuckinthe discriminability of signal in these regions, which exhibit high here and now: construction of fictitious and future experiences classification accuracy among affective states overall, is associ- following ventromedial prefrontal damage. Neuropsychologia, 81, ated with a task-relevant personal trait, namely PT. More work 107–16. is needed, however, to explore what underlying functional prop- Bertossi, E., Candela, V., De Luca, F., Ciaramelli, E. (2017). Episodic erties underlie successful classification ( Anderson et al., 2012; future thinking following vmPFC damage: impaired event con- Carlson and Wardle, 2015). struction, maintenance, or narration? 
Neuropsychology, 31(3), Methodologically, we show that searchlight MVPA can be used to uncover the mediating traits and processes, which Blair, R., Mitchell, D., Richell, R., et al. (2002). Turning a deaf ear explain why a particular region of the brain contributes to to fear: impaired recognition of vocal affect in psychopathic classification accuracy. Connecting MVPA results directly to individuals. Journal of Abnormal Psychology, 111, 682–6. individual traits or behaviors greatly enhances their inter- Bonte, M., Hausfeld, L., Scharke, W., Valente, G., Formisano, E. pretability. These findings reflect the strength of both mul- (2014). Task-dependent decoding of speaker and vowel identity tivariate analysis and the study of individual differences: from auditory cortical response patterns. Journal of Neuroscience, both seek to gain information from what ‘differs between’ 34(13), 4548–57. participants, rather than averages and commonalities across Bradley, M.M., & Lang, P.J. (2007). The International Affective Picture participants. System(IAPS)inthestudyofemotionandattention.In:Coan,J.A, Allen,J.J.B,(editors).HandbookofEmotionElicitationandAssessment, London: Oxford University Press. pp. 29–46. Acknowledgements Bukowski, H. (2018). The neural correlates of visual perspective tak- The authors would like to thank the Brain and Creativity Insitute ing: a critical review. Current Behavioral Neuroscience Reports, 5(3), for its continued support on this project. 189–97. A. G. Vaccaro et al. 1089 Bush, K.A., Privratsky, A., Gardner, J., Zielinski, M.J., Kilts, C.D. (2018). Finn, E.S., Scheinost, D., Finn, D.M., Shen, X., Papademetris, X., Common functional brain states encode both perceived emo- Constable, R.T.(2017).Canbrainstatebemanipulatedtoempha- tion and the psychophysiological response to affective stimuli. size individual differences in functional connectivity? Neuroim- Scientific Reports , 8(1), 1–10. age, 160, 140–51. Carlson, T.A., Wardle, S.G. (2015). 
Sensible decoding. Neuroimage, Fotopoulou, A., Tsakiris, M. (2017). Mentalizing homeostasis: the 110, 217–8. social origins of interoceptive inference. Neuropsychoanalysis, Carpenter, J.M., Green, M.C., Vacharkulksemsuk, T. (2016). Beyond 19(1), 3–28. perspective-taking:mind-readingmotivation.MotivationandEmo- Gessell, B., Geib, B., De Brigard, F. (2021). Multivariate pattern anal- tion, 40(3), 358–74. ysis and the search for neural representations. Synthese, 199, Celeghin, A., Diano, M., Bagnis, A., Viola, M., Tamietto, M. 12869–89. (2017). Basic emotions in human neuroscience: neuroimaging Hanke, M., Halchenko, Y.O., Sederberg, P.B., Hanson, S.J., Haxby, and beyond. Frontiers in Psychology, 8, 1432. J.V., & Pollmann, S. (2009). PyMVPA: a Python toolbox for Christov-Moore, L., Iacoboni, M. (2016). Self-other resonance, its multivariate pattern analysis of fMRI data. Neuroinformatics, 7, control and prosocial inclinations: brain–behavior relationships. 37–53. Human Brain Mapping, 37(4), 1544–58. Healey, M.L., Grossman, M. (2018). Cognitive and affective Clark-Polner, E., Johnson, T.D., Barrett, L.F. (2017). Multivoxel pat- perspective-taking: evidence for shared and dissociable anatom- tern analysis does not provide evidence to support the existence ical substrates. Frontiers in Neurology, 9, 491. of basic emotions. Cerebral Cortex (New York, N.Y.: 1991), 27(3), Hill, C.L., Updegraff, J.A. (2012). Mindfulness and its relationship to 1944–8. emotional regulation. Emotion, 12(1), 81. Corradi-Dell’Acqua, C., Hofstetter, C., Vuilleumier, P. (2014). Cogni- Hynes, C.A., Baird, A.A., Grafton, S.T. (2006). Differential role of the tive and affective theory of mind share the same local patterns orbital frontal lobe in emotional versus cognitive perspective- ofactivityinposteriortemporalbutnotmedialprefrontalcortex. taking. Neuropsychologia, 44(3), 374–83. Social Cognitive and Affective Neuroscience, 9(8), 1175–84. Israelashvili, J., Oosterwijk, S., Sauter, D., Fischer, A. 
(2019). Knowing Coutanche, M.N. (2013). Distinguishing multi-voxel patterns and me,knowingyou:emotiondifferentiationinoneselfisassociated mean activation: why, how, and what does it tell us? Cognitive, with recognition of others’ emotions. Cognition & Emotion, 33(7), Affective & Behavioral Neuroscience, 13(3), 667–73. 1461–71. Coutanche, M.N., Thompson-Schill, S.L., Schultz, R.T. (2011). Multi- Jenkinson, M., Smith, S. (2001). A global optimisation method for voxel pattern analysis of fMRI data predicts clinical symptom robust affine registration of brain images. Medical Image Analysis, severity. Neuroimage, 57(1), 113–23. 5(2), 143–56. Critchley, H.D. (2009). Psychophysiology of neural, cognitive and Kaplan, J.T., Meyer, K. (2012). Multivariate pattern analysis reveals affective integration: fMRI and autonomic indicants. International common neural patterns across individuals during touch obser- Journal of Psychophysiology, 73(2), 88–94. vation. Neuroimage, 60(1), 204–12. Damasio, A., Carvalho, G.B.(2013).Thenatureoffeelings: evolution- Kassam, K.S., Markey, A.R., Cherkassky, V.L., Loewenstein, G., aryandneurobiologicalorigins.NatureReviewsNeuroscience,14(2), Just, M.A. (2013). Identifying emotions on the basis of neural 143–52. activation. PLoS One, 8(6), e66032. Damasio, A.R. (1996). The somatic marker hypothesis and the pos- Keysers, C., Gazzola, V. (2007). Integrating simulation and theory sible functions of the prefrontal cortex. Philosophical Transactions of mind: from self to social cognition. Trends in Cognitive Sciences, of the Royal Society of London. Series B, Biological Sciences, 351(1346), 11(5), 194–6. 1413–20. Kim, J., Schultz, J., Rohe, T., Wallraven, C., Lee, S.-W., Bülthoff, H.H. Davis, T., LaRocque, K.F., Mumford, J.A., Norman, K.A., Wagner, A.D., (2015). Abstract representations of associated emotions in the Poldrack, R.A. (2014). What do differences between multi-voxel human brain. Journal of Neuroscience, 35(14), 5655–63. 
and univariate analysis mean? How subject-, voxel-, and trial- Kober, H., Barrett, L.F., Joseph, J., Bliss-Moreau, E., Lindquist, K., level variance impact fMRI analysis. Neuroimage, 97, 271–83. Wager, T.D. (2008). Functional grouping and cortical–subcortical Decety, J., Grèzes, J. (2006). The power of simulation: imagining one’s interactions in emotion: a meta-analysis of neuroimaging stud- own and other’s behavior. Brain Research, 1079(1), 4–14. ies. Neuroimage, 42(2), 998–1031. Decety, J., Meltzoff, A.N. (2011). Empathy, imitation, and the social Kragel, P.A., LaBar, K.S. (2015). Multivariate neural biomarkers of brain. In: Coplin, A., Goldie, P., editors. Empathy: Philosophical and emotional states are categorically distinct. Social Cognitive and Psychological Perspectives. Oxford University Press, 58–81. Affective Neuroscience, 10(11), 1437–48. Dukes, D., Abrams, K., Adolphs, R., et al. (2021). The rise of affec- Kragel,P.A.,LaBar,K.S.(2016).Decodingthenatureofemotioninthe tivism. Nature Human Behaviour, 5, 816–20. brain. Trends in Cognitive Sciences, 20(6), 444–55. D’Argembeau, A., Xue, G., Lu, Z.-L., Van der Linden, M., Bechara, A. Kriegeskorte, N., Douglas, P.K. (2019). Interpreting encoding and (2008). Neural correlates of envisioning emotional events in the decoding models. Current Opinion in Neurobiology, 55, 167–79. near and far future. Neuroimage, 40(1), 398–407. Lamm, C., Bukowski, H., Silani, G. (2016). From shared to distinct Eckland, N.S., Leyro, T.M., Mendes, W.B., Thompson, R.J. (2018). self-other representations in empathy: evidence from neurotyp- A multi-method investigation of the association between emo- ical function and socio-cognitive disorders. Philosophical Trans- tional clarity and empathy. Emotion, 18(5), 638. actions of the Royal Society of London. Series B, Biological Sciences, Engelen, T., de Graaf, T.A., Sack, A.T., de Gelder, B. (2015). A causal 371(1686), 20150083. role for inferior parietal lobule in emotion body perception. 
Cor- Meyer, K., Kaplan, J.T., Essex, R., Webber, C., Damasio, H., tex, 73, 195–202. Damasio, A.(2010).Predictingvisualstimulionthebasisofactiv- Erbas, Y., Sels, L., Ceulemans, E., Kuppens, P. (2016). Feeling me, ity in auditory cortices. Nature Neuroscience, 13(6), 667–8. feeling you: the relation between emotion differentiation and Mischkowski, D., Crocker, J., Way, B.M. (2019). A social analgesic? empathic accuracy. Social Psychological and Personality Science, 7(3), Acetaminophen (Paracetamol) reduces positive empathy. Fron- 240–7. tiers in Psychology, 10, 538. 1090 Social Cognitive and Affective Neuroscience, 2022, Vol. 17, No. 12 Morawetz, C., Bode, S., Baudewig, J., Jacobs, A.M., Heekeren, H.R. Sachs, M.E., Habibi, A., Damasio, A., Kaplan, J.T. (2018). Decod- (2016).Neuralrepresentationofemotionregulationgoals.Human ing the neural signatures of emotions expressed through sound. Brain Mapping, 37(2), 600–20. Neuroimage, 174, 1–10. Naselaris, T., Kay, K.N., Nishimoto, S., Gallant, J.L. (2011). Encoding Scarantino, A. (2012). Functional specialization does not require a and decoding in fMRI. Neuroimage, 56(2), 400–10. one-to-onemappingbetweenbrainregionsandemotions.Behav- Nummenmaa, L., Saarimaki, H. (2019). Emotions as discrete pat- ioral and Brain Sciences, 35(3), 161. terns of systemic activity. Neuroscience Letters, 693, 3–8. Schacter, D.L., Benoit, R.G., Szpunar, K.K. (2017). Episodic future Ochsner,K.N.,Knierim,K.,Ludlow,D.H.,etal.(2004).Reflectingupon thinking:mechanismsandfunctions.CurrentOpinioninBehavioral feelings: anfMRIstudyofneuralsystemssupportingtheattribu- Sciences, 17, 41–50. tion of emotion to self and other. Journal of Cognitive Neuroscience, Sebastian, C.L., Fontaine, N.M., Bird, G., et al. (2012). Neural pro- 16(10), 1746–72. cessingassociatedwithcognitiveandaffectiveTheoryofMindin Ogg, M., Moraczewski, D., Kuchinsky, S.E., Slevc, L.R. (2019). Sepa- adolescents and adults. 
Social Cognitive and Affective Neuroscience, rable neural representations of sound sources: speaker identity 7(1), 53–63. and musical timbre. Neuroimage, 191, 116–26. Seitz, R.J., Nickel, J., Azari, N.P. (2006). Functional modularity of Oosterwijk, S., Snoek, L., Rotteveel, M., Barrett, L.F., Scholte, H.S. the medial prefrontal cortex: involvement in human empathy. (2017). Shared states: using MVPA to test neural overlap between Neuropsychology, 20(6), 743. self-focused emotion imagery and other-focused emotion under- Singer, T., Seymour, B., O’doherty, J., Kaube, H., Dolan, R.J., Frith, C.D. standing.SocialCognitiveandAffectiveNeuroscience,12(7),1025–35. (2004). Empathy for pain involves the affective but not sensory Paquette, S., Takerkart, S., Saget, S., Peretz, I., Belin, P. (2018). components of pain. Science, 303(5661), 1157–62. Cross-classification of musical and vocal emotions in the audi- Skerry, A.E., Saxe, R. (2014). A common neural code for per- tory cortex. Annals of the New York Academy of Sciences, 1423, ceived and inferred emotion. Journal of Neuroscience, 34(48), 329–37. 15997–6008. Parkinson, B., Manstead, A.S. (2015). Current emotion research in Smith, R., Lane, R.D. (2015). The neural basis of one’s own conscious social psychology: thinking about emotions and other people. and unconscious emotional states. Neuroscience and Biobehavioral Emotion Review, 7(4), 371–80. Reviews, 57, 1–29. Peelen, M.V., Atkinson, A.P., Vuilleumier, P. (2010). Supramodal rep- Smith, S.M., Jenkinson, M., Woolrich, M.W., et al. (2004). Advances resentationsofperceivedemotionsinthehumanbrain.Journalof in functional and structural MR image analysis and implemen- Neuroscience, 30(30), 10127–34. tation as FSL. NeuroImage, 23(S1): 208–19. Poppa, T., Bechara, A. (2018). The somatic marker hypothesis: Tettamanti,M.,Rognoni,E.,Cafiero,R.,Costa,T.,Galati,D.,Perani, D. revisiting the role of the ‘body-loop’ in decision-making. Current (2012). 
Distinct pathways of neural coupling for different basic Opinion in Behavioral Sciences, 19, 61–6. emotions. Neuroimage, 59(2), 1804–17. Raizada, R.D., Kriegeskorte, N. (2010). Pattern-information fMRI: Thompson, R.J., Boden, M.T. (2019). State emotional clarity and new questions which it opens up, and challenges which face it. attention to emotion: a naturalistic examination of their asso- International Journal of Imaging Systems Technology, 20, 31–41. ciations with each other, affect, and context. Cognition & Emotion, Raizada, R.D., Tsao, F.M., Liu, H.M., Holloway, I.D., Ansari, D., 33(7), 1514–22. Kuhl, P.K. (2010). Linking brain-wide multivoxel activation pat- Vaccaro, A.G., Kaplan, J.T., Damasio, A. (2020). Bittersweet: the neu- terns to behaviour: examples from language and math. Neuroim- roscienceofambivalentaffect.PerspectivesonPsychologicalScience, age, 51(1), 462–71. 15(5), 1187–99. Ritchie, J.B., Kaplan, D.M., Klein, C. (2019). Decoding the brain: neu- van der Heiden, L., Scherpiet, S., Konicar, L., Birbaumer, N., ralrepresentationandthelimitsofmultivariatepatternanalysis Veit, R. (2013). Inter-individual differences in successful perspec- in cognitive neuroscience. The British Journal for the Philosophy of tive taking during pain perception mediates emotional respon- Science, 70(2), 581–607. siveness in self and others: an fMRI study. Neuroimage, 65, Roy, M., Shohamy, D., Wager, T.D. (2012). Ventromedial prefrontal- 387–94. subcortical systems and the generation of affective meaning. Wicker, B., Perrett, D.I., Baron-Cohen, S., Decety, J. (2003). Being the Trends in Cognitive Sciences, 16(3), 147–56. target of another’s emotion: a PET study. Neuropsychologia, 41(2), Rütgen, M., Seidel, E.-M., Silani, G., et al. (2015). Placebo analge- 139–46. sia and its opioidergic regulation suggest that empathy for pain Winecoff, A., Clithero, J.A., Carter, R.M., Bergman, S.R., is grounded in self pain. Proceedings of the National Academy of Wang, L., Huettel, S.A. 
(2013). Ventromedial prefrontal cor- Sciences, 112(41), E5638–46. tex encodes emotional value. Journal of Neuroscience, 33(27), Rütgen, M., Wirth, E.-M., Riecanský, I., et al. (2021). Beyond sharing 11032–9. unpleasant affect—evidence for pain-specific opioidergic modu- Winkler, A.M., Ridgway, G.R., Webster, M.A., Smith, S.M., Nichols, lation of empathy for pain. Cerebral Cortex, 31(6), 2773–86. T.E. (2014). Permutation inference for the general linear model. Saarimaki, H., Ejtehadian, L.F., Glerean, E., et al. (2018). Distributed Neuroimage, 92, 381–97. affective space represents multiple emotion categories across Zhou,F.,Li,J.,Zhao,W.,etal.(2020).Empathicpainevokedbysensory the human brain. Social Cognitive and Affective Neuroscience, 13(5), andemotional-communicativecuessharecommonandprocess- 471–82. specific neural representations. Elife, 9, e56929. Saarimaki, H., Gotsopoulos, A., Jaaskelainen, I.P., et al. (2016). Dis- Zhou, F., Zhao, W., Qi, Z., et al. (2021). A distributed fMRI-based crete neural signatures of basic emotions. Cerebral Cortex, 26(6), signature for the subjective experience of fear. Nature Communi- 2563–73. cations, 12(1), 1–16. http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Social Cognitive and Affective Neuroscience Oxford University Press
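The article's core individual-differences logic — within-subject cross-validated classification of four affective states, with each participant's accuracy then related to a trait PT score — can be sketched in miniature. The sketch below is illustrative only: it uses synthetic data, and a simple leave-one-run-out nearest-centroid classifier stands in for the searchlight MVPA actually used in the study; every variable name and parameter here is hypothetical, not taken from the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SUBJECTS, N_RUNS, N_VOXELS = 20, 5, 50
EMOTIONS = ["happiness", "sadness", "fear", "disgust"]

def cv_accuracy(patterns):
    """Leave-one-run-out nearest-centroid classification accuracy.

    patterns: array of shape (runs, emotions, voxels) holding one
    activation pattern per emotion per run.
    """
    n_runs, n_emo, _ = patterns.shape
    correct = 0
    for test_run in range(n_runs):
        train = np.delete(patterns, test_run, axis=0)
        centroids = train.mean(axis=0)          # (emotions, voxels)
        for e in range(n_emo):
            dists = np.linalg.norm(centroids - patterns[test_run, e], axis=1)
            correct += int(np.argmin(dists) == e)
    return correct / (n_runs * n_emo)

# Simulate subjects whose pattern distinctiveness scales with a trait
# score (a stand-in for IRI-PT), so higher trait -> crisper patterns.
trait_pt = rng.uniform(0.0, 1.0, N_SUBJECTS)
signatures = rng.normal(0.0, 1.0, (len(EMOTIONS), N_VOXELS))

accuracies = np.array([
    cv_accuracy((0.1 + 0.9 * trait_pt[s]) * signatures
                + rng.normal(0.0, 1.0, (N_RUNS, len(EMOTIONS), N_VOXELS)))
    for s in range(N_SUBJECTS)
])

# Relate per-subject accuracy to the trait score.
r = np.corrcoef(trait_pt, accuracies)[0, 1]
print(f"chance={1 / len(EMOTIONS):.2f} mean_acc={accuracies.mean():.2f} r={r:.2f}")
```

In the real analysis, the per-class patterns would be searchlight-local voxel responses estimated for each emotion and run, and the accuracy–trait relationship would be assessed across the whole brain with permutation-based FWE correction rather than a single correlation coefficient.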


Perspective-taking is associated with increased discriminability of affective states in the ventromedial prefrontal cortex
A. G. Vaccaro et al.
Social Cognitive and Affective Neuroscience, 2022, Vol. 17, No. 12
Publisher: Oxford University Press
Copyright: © The Author(s) 2022. Published by Oxford University Press.
ISSN: 1749-5016
eISSN: 1749-5024
DOI: 10.1093/scan/nsac035

Abstract

Recent work using multivariate-pattern analysis (MVPA) on functional magnetic resonance imaging (fMRI) data has found that dis- tinct affective states produce correspondingly distinct patterns of neural activity in the cerebral cortex. However, it is unclear whether individual differences in the distinctiveness of neural patterns evoked by affective stimuli underlie empathic abilities such as perspective-taking (PT). Accordingly, we examined whether we could predict PT tendency from the classification of blood-oxygen- level-dependent (BOLD) fMRI activation patterns while participants (n=57) imagined themselves in affectively charged scenarios. We used an MVPA searchlight analysis to map where in the brain activity patterns permitted the classification of four affective states: happiness, sadness, fearanddisgust. Classificationaccuracywassignificantlyabovechancelevelsinmostofthe prefrontalcortexand in the posterior medial cortices. Furthermore, participants’ self-reported PT was positively associated with classification accuracy in the ventromedial prefrontal cortex and insula. This finding has implications for understanding affective processing in the prefrontal cortex and for interpreting the cognitive significance of classifiable affective brain states. Our multivariate approach suggests that PT ability may rely on the grain of internally simulated affective representations rather than simply the global strength. Key words: emotion; perspective-taking; multivariate-pattern analysis; ventromedial prefrontal cortex Manstead, 2015; Fotopoulou and Tsakiris, 2017; Dukes et al., Introduction 2021). Empathy is a multifaceted construct combining cogni- Contemporary neuroscience research has highlighted the com- tive processes that allow us to understand the internal states plex relationship between neural activity and affective states. 
of others and affective processes that allow us to share in the We define ‘affective states’ to include both ‘emotions’, which in internal states of others. These include aversive reactions to oth- ourviewcomprisephysiologicalandmotoricresponsestostimuli ers’ distress [personal distress (PD)], concern for others’ welfare in the environment that are relevant to the homeostatic wel- [empathic concern (EC)], feeling and understanding the expe- fare of the organism, and ‘feelings’, the conscious perceptions of riences of hypothetical or absent others (fantasizing) and tak- emotion-related changes in the body. Affective states appear to ing others’ perspectives [perspective-taking (PT)] (Davis, 1983). involve interactions between cortical and sub-cortical regions, as Researchhasfoundthatsystemsinvolvedinunderstandingone’s wellwiththeviscera(forreview:Koberetal.,2008;Critchley,2009; own emotions are also involved in understanding the affec- Tettamanti et al., 2012; Damasio and Carvalho, 2013; Smith and tive states of others (Ochsner et al., 2004). Empathizing with Lane, 2015; Vaccaro et al., 2020). another’s feelings recruits affective brain regions involved in rep- Recently, our understanding of emotion has progressed from resenting one’s own affective state (Singer et al., 2004; Lamm considering solely the experience and mechanisms of individ- et al., 2016) and there is evidence that impaired affective expe- ual experience, understanding that the consideration of other’s rience (as in psychopathy) may limit empathic abilities (Blair minds plays a large role in one’s own affective experience et al., 2002). For example, participants administered an analgesic (for review: Decety and Meltzoff, 2011; Christov-Moore and were impaired in their ability to recognize and respond to oth- Iacoboni, 2016; Lamm et al., 2016). While the core mechanisms ers’ pain (Mischkowski et al., 2019). 
Furthermore, it has been of affect were once viewed and researched as a primarily pri- found that placebo analgesia reduces both pain and empathy for vate, intraindividual process, there is a growing consensus that pain (Rütgen et al., 2015), as well as both unpleasant touch and the development of affect is inescapably linked to sociality: this empathy for it through modulation of the insular cortex (Rütgen places empathetic processes as more core to individual affect et al., 2021). than they were originally considered (for review: Parkinson and Received: 4 March 2021; Revised: 5 April 2022; Accepted: 16 May 2022 © The Author(s) 2022. Published by Oxford University Press. This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial License (https://creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited. For commercial re-use, please contact journals.permissions@oup.com A. G. Vaccaro et al. 1083 One feature of affective experience that is relevant to both the ability or motivation to deliberately simulate affective states empathizing with others and representing one’s own state is the fromconceptsissimilartowhathasbeenproposedinthesomatic extent of differentiation among affective states. Previous studies marker hypothesis for the vmPFC, generating feelings ‘as-if’ one have shown that individuals who are more successful at judg- is in a scenario (Damasio, 1996). It is possible that this individ- ing the affective states of others experience more differentiated ual variability presents itself in the distinctiveness of the neural categories of affect (Erbas et al., 2016; Israelashvili et al., 2019). states evoked in response to different cues. 
In line with previous Having, and being able to simulate, affective states that are cat- work on the overlap between empathy and the representation of egorically discernable may facilitate skills such as mentalizing, one’s own affective states, a trait level measure of empathy may empathy and PT due to the perceived increase in clarity of what be associated with these differences. one is feeling and therefore what that feeling means function- In our study, participants underwent an affect induction ally (Hill and Updegraff, 2012; Eckland et al., 2018; Thompson and paradigm in which they viewed pictures of situations invoking Boden, 2019). Therefore, we hypothesize that increased neural fear, happiness, sadness and disgust, alongside captions describ- differentiation of affective states may be associated with greater ing the scenario from a first-person perspective. It is likely that empathy. this type of paradigm evokes both emotions and feelings; thus, Recent studies using the multivariate-pattern analysis (MVPA) it allows us to investigate the correlates of affective states as a find that distinct affective states may be associated with spe- whole but not to differentiate between neural patterns for emo- cific patterns of neural activity within a network of brain regions tion and for feeling. We ran two sets of MVPA analyses on the (Scarantino, 2012; Celeghin et al., 2017; Nummenmaa and evoked neural data to investigate the classification accuracy of Saarimaki, 2019). MVPA studies have demonstrated that discrete, emotions from patterns of the fMRI activity. In the first, we induced affective states can be accurately distinguished from examined which regions’ activation was most informative for each other (i.e. classified) using patterns of BOLD activation in classifying the four evoked affectivestates. Wehypothesized that functionalmagneticresonanceimaging(fMRI)data(Kassametal., the mPFC would have the highest classification accuracy. 
In the second analysis, we attempted to predict individual differences in empathic ability from the classification accuracy of participants' patterns of neural activation during emotion induction. We hypothesized that individual differences in empathic ability would be reflected in the distinctiveness of neural patterns of activation evoked by different emotions. Given the mPFC's prominence in MVPA studies of emotions, as well as its role in affective PT, we hypothesized that classification accuracy for emotions in this region would show the highest correspondence with empathic abilities. Note that our predictions concerned empathic ability in general; since we did not hypothesize which specific components of empathy would correlate with the distinctiveness of neural patterns, our analysis was exploratory with respect to the empathy sub-scales.

Methods

2013; Kragel and LaBar, 2015; Saarimaki et al., 2016; Zhou et al., 2021). The most commonly studied of these states in MVPA studies are sadness, disgust, fear, happiness and, to a lesser extent, anger, all classically considered 'basic emotions' (Saarimaki et al., 2016; Celeghin et al., 2017). However, other more subtle affective states have also been studied, such as shame, envy, contempt, pride, guilt and longing (Kassam et al., 2013; Kragel and LaBar, 2015; Saarimaki et al., 2018), although these may be less easily classified than their more 'basic' cousins (Saarimaki et al., 2018).

Cortical regions found to contribute most to classification accuracy in MVPA studies tend to be consistent with those found in univariate analyses of affective processing. These regions include the medial prefrontal cortex (mPFC), inferior frontal gyrus, posterior medial cortex, insula and amygdalae (Peelen et al., 2010; Kim et al., 2015; Saarimaki et al., 2016, 2018; Sachs et al., 2018). A key region involved in both judging another's affective state and representing one's own affective state is the mPFC (Seitz et al., 2006).
Specifically, the ventral areas of mPFC have been shown to play a selective role in affective PT as compared to cognitive PT (or general theory of mind; Hynes et al., 2006; Corradi-Dell'Acqua et al., 2014; Healey and Grossman, 2018). Interestingly, MVPA results have further suggested that some of the mechanisms involved in representing one's own affective state overlap with the mechanisms for empathy. The insula has been shown to have shared neural representations for pain and empathy for pain (Zhou et al., 2020).

Healthy adult participants (n=57) were recruited as part of two different studies, all recruited through flyers from the University of Southern California and the surrounding Los Angeles area. Thirty-six participants' data (18 female, age=24.21±8.68, range=18–52) were collected in the first study. In a second study, 21 more participants' data (11 female, age=22.67±6.45, range=18–42) were collected. Since the two studies used the same experimental paradigm and stimuli, with slight differences detailed below, the data were combined to increase statistical power. All participants were right-handed, had normal or corrected-to-normal vision, and had no history of neurological or psychiatric conditions. All participants gave informed consent in accordance with the institutional review board approval guidelines approved by the University of Southern California. Because the second study involved a more comprehensive battery of additional behavioral measures not used in the analyses of this paper,

Empathy is often an implicit part of paradigms used to study emotion differentiation. In order to experimentally invoke affective states inside the fMRI scanner, it is common to present subjects with affect-provoking stimuli and also to engage subjects in voluntary mental simulation. For instance, Saarimaki et al. (2018) used narratives that describe the lead-up to an emotional event along with a guided imagery technique to evoke 14 different affective states.
It can be difficult or impractical to design stimuli that effectively induce genuine affect. Tasks often involve explicitly asking participants to imagine themselves in emotional scenarios based on visual or audio imagery. Paradigms such as this require participants to access their concepts of emotion. Interestingly, this naturally creates individual variability where some individuals can easily generate strong feelings from retrieving emotional concepts while others cannot. This difference in

behavioral data for those participants was collected on a different day. For this reason, 2 of the 21 participants did not provide behavioral data collected before the university shutdown due to coronavirus disease (COVID-19), leaving us with 19 participants from this second study for analyses relating to the empathy measures (9 female, age=22.32±6.14, range=18–42): a total of 55 participants between the two studies for this second analysis.

1084 Social Cognitive and Affective Neuroscience, 2022, Vol. 17, No. 12

Interpersonal reactivity index

Behavioral measures of empathy were acquired through the Interpersonal Reactivity Index (Davis, 1983). This self-report measure consists of four seven-item sub-scales: (i) perspective-taking (PT): the ability of the participant to take on the point of view of another individual (for example: 'Before criticizing somebody, I try to imagine how I would feel if I were in their place'), (ii) fantasy (FS): the tendency of the participant to identify with fictitious characters (for example: 'I really get involved with the feelings of characters in a novel'), (iii) empathic concern (EC): the presence of the participant's feeling of compassion or concern for others (for example: 'I am often quite touched by things I see happen') and (iv) personal distress (PD): the presence of the participant's feeling of discomfort or anxiety for others (for example: 'When I see someone who badly needs help in an emergency, I go to pieces'; Davis, 1983). For each participant, each sub-scale score was assessed separately, resulting in four distinct scores per participant.

the described emotional situation as strongly as possible for each stimulus.

fMRI data acquisition

All scanning was completed on a 3T Siemens Prisma System Scanner at the USC Dornsife Cognitive Neuroimaging Center using a 32-channel head coil.
Anatomical images were acquired with a T1-weighted magnetization-prepared rapid gradient-echo sequence (repetition time [TR]/echo time [TE]=2300/2.26 ms, 1-mm isotropic voxels, flip angle 9°). Functional images were acquired with a T2*-weighted gradient-echo sequence (TR/TE=2000/25 ms, 41 transverse 3-mm slices, flip angle 90°). A T2-weighted volume was acquired for blind review by an independent neuroradiologist, in compliance with the scanning center's policy and local Institutional Review Board guidelines. T2-weighted scans were not analyzed by the researchers for any purpose in this study.

Stimuli

Stimuli were presented as one photo in the center of the screen with anecdotal descriptive text underneath each photo. Photos were first gathered from a subset of images in the International Affective Pictures Set (IAPS; Bradley and Lang, 2007) covering the affective categories of happiness, fear, sadness and disgust. Text sentences from the Affective Norms for English Text (ANET; Bradley and Lang, 2007) were also chosen in these four categories. Stimuli captions are written in the second person, telling the subject what they were experiencing (example: 'As you leave the concert, a drunk vomits all over your jacket, soaking it.'). Pictures from the IAPS were then matched with a corresponding piece of text from the ANET that described a situation associated with the picture. For example, a picture of a snarling dog was combined with the caption 'The dog strains forward, snarling and suddenly leaps out at you' (see Supplementary Table S1 for examples). For pictures that did not have appropriately matching text from ANET, or text that did not have appropriately matching images from the IAPS, text/images were written to fit or acquired.

fMRI analysis

Preprocessing and GLM

Data were first processed using the FMRI Expert Analysis Tool, the FMRIB Software Library's (FSL's) implementation of the General Linear Model (GLM; Smith et al., 2004), to generate voxel-wise z-statistic maps showing voxels that responded significantly to each emotion type for each participant. Those z-statistic maps were then used for the classification analysis. Data preprocessing was conducted in FSL (Smith et al., 2004) using brain extraction, slice-time and motion correction using FMRIB's linear registration tool, spatial smoothing (5 mm) and high-pass temporal filtering (sigma=50 s). The functional data were registered to each participant's own anatomical image, and the anatomical data were registered to the standard MNI brain (Montreal Neurological Institute) using FMRIB's non-linear registration tool (FNIRT; Jenkinson and Smith, 2001). The data were modeled with a separate regressor for each of the four emotions (happy, sad, fear and disgust), one for the neutral condition, the temporal derivatives of all task regressors and six motion parameters to account
for residual motion effects. These same smoothed standard-space z-maps were used for both the emotion discrimination analysis and for the individual subject searchlights.

These new images were rated for valence and arousal by 51 participants in an earlier study (Supplementary Tables S2 and S3). Subjects were also asked to indicate what category of affective state each photo/text combination corresponded to. For every stimulus selected for the study, the expected category (among happy, sad, fear, disgust and neutral) was the most commonly picked category by the subjects (see Supplementary Figure S1). In addition to these emotional stimuli, non-emotional/neutral stimuli were used as a control and a fixation cross was used as a rest. Since our goal was to predict emotional states with MVPA, the analysis of the neutral images is not presented here.

Functional neuroimaging: fMRI

fMRI design

Stimuli were presented in an event-related design using MATLAB's Psychtoolbox. In study 1, 60 stimuli (photo+text) were randomly presented during 4 functional runs (15 stimuli per run, 6 min per run).

MVPA analysis

Emotion discrimination analysis

All MVPA analyses were conducted using PyMVPA (Hanke et al., 2009). A whole-brain searchlight analysis was conducted to identify regions whose activation patterns allowed us to classify the four emotions across all subjects' data. The input data to the classifier was a single 4D image file combining z-stat maps (normalized to MNI standard space using FNIRT) for each affective state, functional run and participant (a total of 828 concatenated images: 36 participants × 4 runs × 4 emotions for study 1 combined with 21 participants × 3 runs × 4 emotions for study 2). For every voxel in the brain, a sphere centered on that voxel (radius=5 voxels) was used to train and test a linear support vector machine (SVM) using a leave-one-out cross-validation.
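The cross-participant scheme just described (train a linear SVM on all participants but one, test on the held-out participant, within each sphere) can be sketched as follows. The study used PyMVPA; this illustrative stand-in uses scikit-learn, and the array sizes and random data are placeholders for the per-sphere z-statistic patterns, not the study's actual dimensions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
# Hypothetical sizes: the paper's group analysis used 57 participants and
# radius-5 spheres; smaller numbers keep this sketch fast.
n_participants, n_runs, n_emotions, n_voxels = 10, 4, 4, 100

# One pattern per (participant, run, emotion): label = emotion,
# group = participant, features = z-stat values within one sphere.
X = rng.normal(size=(n_participants * n_runs * n_emotions, n_voxels))
y = np.tile(np.arange(n_emotions), n_participants * n_runs)
groups = np.repeat(np.arange(n_participants), n_runs * n_emotions)

# Leave-one-participant-out: train on all groups but one, test on the
# held-out group, until every participant has been left out once.
scores = cross_val_score(SVC(kernel="linear"), X, y,
                         groups=groups, cv=LeaveOneGroupOut())
accuracy = scores.mean()  # ~0.25 (four-way chance) on random data
```

The mean accuracy over folds would then be written back to the sphere's center voxel; sweeping the sphere over every voxel yields the whole-brain accuracy map.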
In other words, in each iteration, the classifier was trained on all the participants' data except one and then tested on the remaining participant's data, leading to 55 cross-validation folds. The resulting average accuracy over all iterations, after leaving each participant out once, was mapped to the center voxel of the sphere, ultimately resulting in a cross-participant map of classification accuracies for every voxel. For the SVM regularization parameter C, we used the default in PyMVPA, which chooses this parameter automatically according to the norm of the data.

Each stimulus was presented for 12 s followed by a 12 s fixation cross in between each trial as a 'rest' period. In study 2, 45 stimuli were randomly presented during 3 functional runs. In both studies, participants were instructed to lay still, observe the displayed photograph, read the text and attempt to embody

Empathy correlation analysis

To correlate the scales of the Interpersonal Reactivity Index (IRI) with individual classification accuracy, we first computed whole-brain searchlights 'within' every individual participant's data. We ran searchlights on each individual subject's data in standard space.

comparisons in the correlation analysis, we first performed a resel-wise Bonferroni correction: we determined the total number of five-voxel-radius independent spheres which could fit in the standard brain (∼725) and divided 0.05 by this number to determine our alpha (6.89 × 10⁻⁵) (Kaplan and Meyer, 2012). Therefore, a classification accuracy for which greater-than-or-equal values appeared in our distribution less than 10 times would be considered significantly above chance. In our simulated null distribution, where there were four emotions and an expected chance accuracy of 0.25, a significant above-chance classification accuracy was determined to be >0.30592. The maximum value found in our null distribution was 0.311 (Supplementary Figure
S2). For the second correction method in both analyses, we used a voxel-wise Bonferroni correction (0.05/n tests) and determined the corresponding accuracy value using the binomial distribution. With this method, we established that a Bonferroni-corrected significance of 0.05 would correspond to an accuracy threshold of 0.3815. To enhance the replicability of our findings and isolate the most informative regions, we opted in both analyses to use the more conservative threshold (permutation-corrected results can be found in the appendix: Supplementary Figure S3). This threshold is extremely conservative given that it is based on a full voxel-wise Bonferroni correction and is also greater than any of the classification results in our over 150000 permutations.

In post-hoc analyses, we wanted to determine if our results in

For each participant, a sphere (radius=3 voxels) centered on every voxel in the brain was used to iteratively assess classification accuracy throughout the brain. For each sphere, the SVM classifier was trained on all but one of the emotions' functional scanning runs and tested on the left-out run. The resulting accuracy values were mapped to the center voxel of the sphere, and the resulting whole-brain voxel-wise accuracy maps were warped into the standard space. This created an accuracy map for each participant where a voxel's value represented the classification accuracy of the three-voxel spheres surrounding that voxel. We chose a smaller sphere size than the between-subjects analysis to increase our spatial resolution. Furthermore, these individual subject searchlights required significantly less computing power than the between-subjects analysis, so we were less restricted by computing limitations.
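The two correction schemes described above can be made concrete with a short calculation. This is an illustrative sketch, not the paper's code: the trial count and voxel count passed to the binomial threshold are hypothetical placeholders, since the exact n entering each test is not restated here.

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def accuracy_threshold(n_trials, chance, alpha, n_tests):
    """Smallest accuracy whose tail probability under chance guessing
    survives a Bonferroni correction over n_tests comparisons."""
    corrected = alpha / n_tests
    for k in range(n_trials + 1):
        if binom_sf(k, n_trials, chance) <= corrected:
            return k / n_trials
    return 1.0

# Resel-wise correction: ~725 independent radius-5 spheres fit in the brain.
resel_alpha = 0.05 / 725  # ~6.9e-5, the alpha reported in the text

# Voxel-wise correction, with a hypothetical trial count and voxel count:
threshold = accuracy_threshold(n_trials=220, chance=0.25,
                               alpha=0.05, n_tests=200_000)
```

With four-way chance at 0.25, the Bonferroni-corrected threshold lands well above chance, which is why only strongly discriminating spheres survive it.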
To identify relationships between these individual participant searchlight maps and individual differences in self-reported empathy, we used FSL's Randomise tool (with FWE correction). We created a series of regressors using participants' demeaned scores on each of the IRI's sub-scales (PT, EC, PD and FS). These regressors were then related to the searchlight accuracy values using Randomise's nonparametric permutation testing approach (Winkler et al., 2014).

the regions that significantly distinguished between the four emotions were driven by one emotion being uniquely distinguishable compared to the others. To do this, we ran pair-wise searchlight analyses classifying each of six possible pairs of emotions. If one specific emotion was driving our findings, only pairs with that one emotion would show a similar spatial pattern of significant classification to our initial four-way searchlight emotion discrimination analysis.

Results

Maps of the statistical results for this study can be found on the Open Science Foundation website for this study, https://osf.io/8wcnz/.

Affect discrimination searchlight

At each voxel, the accuracies across subjects are randomly associated with the regressor values (the empathy sub-scales) and a test statistic is computed. This permutation procedure is repeated 5000 times to generate a null distribution. One of the resulting maps is then a map of 1 minus the P-value for the association between the regressor and the classification accuracy, determined by comparison to the null distribution. These maps were corrected for family-wise error using FSL's threshold-free cluster enhancement algorithm.
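A single-voxel version of this permutation scheme can be sketched in a few lines. Randomise computes a more sophisticated test statistic and handles family-wise error via threshold-free cluster enhancement; this stand-in simply permutes a correlation, with random data in place of the real empathy scores and accuracies.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subjects = 55
pt_scores = rng.normal(size=n_subjects)         # demeaned PT sub-scale (stand-in)
accuracies = rng.uniform(0.2, 0.5, n_subjects)  # searchlight accuracy at one voxel

def permutation_p(x, y, n_perm=5000):
    """One-tailed p-value: how often does shuffling x give an association
    at least as strong as the observed one?"""
    observed = np.corrcoef(x, y)[0, 1]
    null = np.array([np.corrcoef(rng.permutation(x), y)[0, 1]
                     for _ in range(n_perm)])
    return (np.sum(null >= observed) + 1) / (n_perm + 1)

p_value = permutation_p(pt_scores, accuracies)
```

Repeating this at every voxel yields the p-map described above, which would then be corrected for family-wise error before interpretation.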
We restricted our interpretation of the resulting maps to regions that significantly predicted emotions in the group searchlight analysis by masking these maps with the regions found to significantly distinguish affective states at the group level in the initial affect discrimination analysis. Therefore, the final resulting map shows voxels where both (i) classification was above chance across all subjects and (ii) classification varied significantly with the empathy scores.

Statistical thresholding

To determine an appropriate threshold for significance, we employed two parallel methods to correct for multiple comparisons in each of our analyses. For the affect discrimination analysis, we first used permutation testing to create a null distribution of classification accuracies within a simulated searchlight sphere by shuffling affect labels. We ran 151801 permutations of the classification within a single-sample searchlight sphere. This allowed us to create a distribution of classification values that might occur in a given searchlight sphere assuming the null hypothesis that the four affect conditions cannot be distinguished from patterns of activation. To account for multiple

Multiple regions significantly predicted affect classification, even using the more conservative (voxel-wise Bonferroni-corrected) threshold (compared with the four-way chance level of 0.25) (see Figure 1). These included vmPFC (x=−9, y=55, z=−16; accuracy=0.444), anterior prefrontal cortex (x=−9, y=55, z=12; accuracy=0.436), dorsomedial prefrontal cortex (x=14, y=36, z=46; accuracy=0.419), bilateral insula (left x=−40, y=−11, z=9; accuracy=0.441; right x=44, y=−6, z=8; accuracy=0.420), bilateral amygdala (left x=−28, y=−10, z=−12; accuracy=0.412; right x=26, y=−10, z=−12; accuracy=0.417), posterior cingulate cortex (x=−2, y=−52, z=14; accuracy=0.425), bilateral temporal gyrus (x=−38, y=−54, z=−8; accuracy=0.446; x=33, y=−43, z=−10; accuracy=0.448) and bilateral superior parietal lobule (left x=−32, y=−69, z=37; accuracy=0.428; right x=33, y=−63, z=30; accuracy=0.426). These spatial patterns, especially the high classification values in medial frontal and parietal areas, were mirrored in all the post hoc pair-wise classification searchlights except happy vs sad, which did not reach a significant threshold of 0.68 accuracy (see Supplementary Figures S4–S8). This suggests that our four-way classification results did not result from one emotion being

often 'black-box' like scenarios. We demonstrate that an individual trait may be related to differences in the distinguishability of neural patterns corresponding to affective states. This method provides a new way of exploring and theorizing why neural patterns can be distinguished in this way and demonstrates a new avenue of inference for testing how personal traits and cognitive functions might be associated with how different neural states are discernible from each other. In this case, we found that self-reported PT is related to the discernibility of the affective states our participants were asked to imagine. Our study has implications for both the use of MVPA in general and our understanding of how empathetic traits relate to affect.

Fig. 1. Brain regions in which the searchlight analysis significantly predicted the four affect categories. After Bonferroni correction for every voxel in the brain, a corrected P-value of 0.05 corresponded to a classification accuracy threshold of ∼0.38.

Relating MVPA performance to individual differences

It has long been recognized that connecting an MVPA result directly to some measured aspect of the underlying behavior or psychology being studied is important (Raizada and Kriegeskorte, 2010; Naselaris et al., 2011).
Otherwise, successfully decoded neural signals may reflect information that is either not used at all by the brain or is not relevant to aspects of the psychology in question (Kriegeskorte and Douglas, 2019; Ritchie et al., 2019). Nevertheless, it is uncommon for MVPA studies to relate classifier performance to individual differences. Using procedures similar to ours, Coutanche et al. (2011) found that the MVPA classification of visual stimuli correlated with Autism Spectrum Disorder symptom severity across subjects. Studies in the auditory domain have found that MVPA classification accuracy in a musical instrument identification task (Ogg et al., 2019) and a speaker identification task (Bonte et al., 2014; Aglieri et al., 2021) correlate with between-subject differences in accuracy on those tasks. Raizada et al. (2010) found that the performance of a classifier at distinguishing neural patterns related to the perception of different phonemes was related to subjects' performance in discriminating the same phonemes, and that the separability of neural patterns during a numerical discrimination task was related to arithmetic performance. Similarly, Meyer et al. (2010) showed that classifier discrimination of imagined auditory stimuli was correlated with the reported vividness of imagination.

distinguishable compared to the rest, or one pair of emotions being especially distinguishable compared to other combinations.

Fig. 2. Brain regions with significant classification accuracy where PT on the Interpersonal Reactivity Index significantly predicted classifier accuracy.

Classification accuracy and IRI measures

See Supplementary Table S4 and Supplementary Figure S9 for

Yet, while linking MVPA performance to individual differences can make headway in showing the psychological relevance of classification performance, many of the limitations that exist in other studies attempting to relate individual differences to the neuroimaging analysis still remain.
For example, to optimize studying individual differences, the states induced should produce enough variability across individuals while still maintaining high identifiability of stimulus representations (Finn et al., 2017). In terms of identifiability of neural patterns, MVPA tends to be more sensitive than the univariate analysis. However, it also tends to be less sensitive to between-subject variability in mean activation levels (Davis et al., 2014). It remains to be determined whether the variability across people in classifier performance is optimal for the study of individual differences. Furthermore, while the trait relevance of classifier performance is informative, interpretation must always be careful in that such relationships can always be mediated by additional, unmeasured variables.

descriptive statistics and distributions of sub-scales. FS, PD and EC were not significantly related to classification accuracy. PT was significantly related to classification accuracy (P<0.05 with FWE correction) in a large area of vmPFC (x=−15, y=48, z=−7), as well as in bilateral, although predominantly left, insula (x=−39, y=−7, z=7; x=43, y=−5, z=−6; see Figures 2 and 3).

Discussion

MVPA has become a popular method in affective neuroscience over the past decade (Coutanche, 2013; Morawetz et al., 2016; Oosterwijk et al., 2017). While it has been demonstrated repeatedly that neural patterns associated with affective states can be distinguished from each other, the interpretation of what these patterns tell us about categories, and how they come to be, is contentious (for review: Coutanche, 2013; Kragel and LaBar, 2016; Clark-Polner et al., 2017; Gessell et al., 2021). In our study, we used a novel approach to increase our power of inference in these

Decoding of affective states correlates with PT

We found multiple brain regions whose activity permitted us to robustly classify evoked affective states. Many of these brain regions have allowed successful classification of affect in prior
studies: the posterior cingulate, insula, temporal gyrus, medial prefrontal and ventral medial prefrontal regions have all been implicated in previous MVPA studies of emotions and affective context (Skerry and Saxe, 2014; Saarimaki et al., 2016; Oosterwijk et al., 2017; Bush et al., 2018; Paquette et al., 2018; Sachs et al., 2018).

Fig. 3. Scatterplots of PT vs classification accuracy in the ventromedial prefrontal cortex and insula.

The vmPFC has long been implicated in implementing emotion and feeling (for review: Bechara et al., 1994, 1999; Roy et al., 2012; Winecoff et al., 2013). A theoretical link between affect and the vmPFC is provided by the 'somatic marker hypothesis' (Damasio, 1996). This view posits that the vmPFC is a key region for incorporating emotion-related signals from the body, consciously experienced as feelings, into our cognitive decision-making process. According to the SMH, the vmPFC also permits us to bypass pure

associated with PT showed little overlap with regions typically implicated in univariate studies of PT: regions such as visual cortex, temporal–parietal junction and dorsolateral prefrontal cortex (Decety and Grèzes, 2006; Bukowski, 2018; Healey and Grossman, 2018). Accuracy in these areas did not correlate with our PT measure. Additionally, even though it is conceivable that our visual and descriptive prompts would support a largely visual simulation of the scene or that improved affective representation might result from increased attention, neither primary visual regions nor parietal and temporal visuospatial regions provided significant classification accuracy. Visual, temporal–parietal and dorsal prefrontal regions may be important for the general process of PT but more related to the general effort involved in performing the task rather than its success.
In a study where individuals were instructed to take the perspective that an image of a painful stimulus was occurring to either themselves or another person (van der Heiden et al., 2013), the main effects of the different PT conditions were evident in regions such as supramarginal gyrus, temporal gyrus, frontal gyrus and ventrolateral PFC. However, when the effects of good vs bad perspective-takers were analyzed, all these regions except ventrolateral PFC were not significant predictors, and instead the left insula, postcentral gyrus and vmPFC differed between the groups. Perhaps the insula and vmPFC are specifically important for generating a fine-grained affective simulation, and this granularity is reflected in more distinct neural patterns.

We note also that while the IRI includes items that ask participants to judge their ease or difficulty at empathizing ('I sometimes find it difficult to see things from the "other guy's" point of view'), many of the questions are instead about the 'tendency' to take

bodily input in order to simulate these feelings even without the direct presence of a trigger (Poppa and Bechara, 2018). By this account, the vmPFC could serve to 'simulate' the affective experience being evoked (Keysers and Gazzola, 2007; Schacter et al., 2017). Indeed, studies on affective PT vs cognitive PT have implicated the vmPFC (Sebastian et al., 2012; Healey and Grossman, 2018). The vmPFC has also been implicated in affective simulation in studies in which participants imagined affective contexts that could happen to them in the future (D'Argembeau et al., 2008) or that have happened in the past (Benoit et al., 2016, 2019). Correspondingly, lesions to vmPFC appear to hamper both the ability to simulate future affective scenarios and imaginary ones (Bertossi et al., 2016, 2017).

Importantly, we show here that the vmPFC's role in affective PT is related not just to the level of the neural activity invoked, but to the distinctiveness of its affect-related neural patterns.
This could suggest that affective PT may relate to the simulation of the affective experience that is more specific to the target emotion, reflected functionally in more easily classifiable representations of affective context. If simulated affective states are more generalized and overlapping, as opposed to detailed and nuanced, their associated neural patterns can be expected to be less differentiable. Perceived emotional actions, interoception and affective contexts of other individuals can all be successfully classified using the vmPFC activity (Oosterwijk et al., 2017). Notably, areas where accuracy was significantly

someone else's perspective ('I sometimes try to understand my friends better by imagining how things look from their perspective') or the 'motivation' to do so ('I try to look at everybody's side of a disagreement before I make a decision'). As such, we cannot distinguish which aspects of PT are directly related to increased classifier accuracy. It is possible that participants who score higher on the IRI-PT are motivated to engage in the task with more effort and that the increased pattern separation is related to increased effort in the task. Indeed, the IRI-PT is correlated with the Mind Reading Motivation scale, a measure specifically aimed at the motivation to think about other people's minds (Carpenter et al., 2016).

Funding

The work was supported by a grant from the Templeton World Charity Foundation to A.D. and J.K.

Conflict of interest

The authors declare no conflicts of interest.

Future directions

Our searchlight analysis found large regions of the brain, many previously implicated in affect-related processing, that significantly classified affective states. Despite this, only the accuracy in the vmPFC and insula was related to PT.
This leaves the question of what traits, cognitive processes, or noise factors could potentially explain individual differences in classification accuracy in other regions. It is possible that other cognitive traits, such as the ability to interpret others' affective intent or bodily perception, may explain how accurate other regions such as temporal gyrus (Wicker et al., 2003) and the parietal lobule (Engelen et al., 2015) are at distinguishing affective states in MVPA studies. Furthermore, the affective states we asked participants to imagine are among the most common, 'basic emotions'. These affective experiences are most commonly described as categorical in daily life and therefore may be more easily discriminated. It remains unclear whether the same mechanisms, PT and affective simulation in the vmPFC, would be as accurate in discriminating mixed feelings, or if other neural regions and cognitive processes may be differentially important for non-typical affective states.

Our study highlights the importance of investigating the neural substrate of individual differences in PT overall, especially across

Supplementary data

Supplementary data are available at SCAN online.

References

Aglieri, V., Cagna, B., Velly, L., Takerkart, S., Belin, P. (2021). fMRI-based identity classification accuracy in left temporal and frontal regions predicts speaker recognition performance. Scientific Reports, 11(1), 489.

Anderson, A., Han, D., Douglas, P.K., Bramen, J., Cohen, M.S. (2012). Real-time functional MRI classification of brain states using Markov-SVM hybrid models: peering inside the rt-fMRI black box. In: Langs, G., Rish, I., Grosse-Wentrup, M., Murphy, B. (editors). Machine Learning and Interpretation in Neuroimaging. New York: Springer, 242–55.

Bechara, A., Damasio, A.R., Damasio, H., Anderson, S.W. (1994).
domains. While some brain regions may be involved broadly in PT across participants, univariate approaches may miss more specific regions that truly distinguish successful vs unsuccessful PT. In our affective simulation task, these were the vmPFC and insula, although these likely may be different for PT tasks involving other domains.

Conclusion

In our study, we found a relationship between trait PT ability and the classification accuracy of an individual's vmPFC and insular activity for distinguishing task-evoked affective states. These findings are important because they show that the discriminability of signal in these regions, which exhibit high classification accuracy among affective states overall, is associated with a task-relevant personal trait, namely PT. More work is needed, however, to explore what underlying functional properties underlie successful classification (Anderson et al., 2012; Carlson and Wardle, 2015).

Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50(1–3), 7–15.

Bechara, A., Damasio, H., Damasio, A.R., Lee, G.P. (1999). Different contributions of the human amygdala and ventromedial prefrontal cortex to decision-making. The Journal of Neuroscience, 19(13), 5473–81.

Benoit, R.G., Davies, D.J., Anderson, M.C. (2016). Reducing future fears by suppressing the brain mechanisms underlying episodic simulation. Proceedings of the National Academy of Sciences, 113(52), E8492–501.

Benoit, R.G., Paulus, P.C., Schacter, D.L. (2019). Forming attitudes via neural activity supporting affective episodic simulations. Nature Communications, 10(1), 1–11.

Bertossi, E., Aleo, F., Braghittoni, D., Ciaramelli, E. (2016). Stuck in the here and now: construction of fictitious and future experiences following ventromedial prefrontal damage. Neuropsychologia, 81, 107–16.

Bertossi, E., Candela, V., De Luca, F., Ciaramelli, E. (2017). Episodic future thinking following vmPFC damage: impaired event construction, maintenance, or narration?
Neuropsychology, 31(3), Methodologically, we show that searchlight MVPA can be used to uncover the mediating traits and processes, which Blair, R., Mitchell, D., Richell, R., et al. (2002). Turning a deaf ear explain why a particular region of the brain contributes to to fear: impaired recognition of vocal affect in psychopathic classification accuracy. Connecting MVPA results directly to individuals. Journal of Abnormal Psychology, 111, 682–6. individual traits or behaviors greatly enhances their inter- Bonte, M., Hausfeld, L., Scharke, W., Valente, G., Formisano, E. pretability. These findings reflect the strength of both mul- (2014). Task-dependent decoding of speaker and vowel identity tivariate analysis and the study of individual differences: from auditory cortical response patterns. Journal of Neuroscience, both seek to gain information from what ‘differs between’ 34(13), 4548–57. participants, rather than averages and commonalities across Bradley, M.M., & Lang, P.J. (2007). The International Affective Picture participants. System(IAPS)inthestudyofemotionandattention.In:Coan,J.A, Allen,J.J.B,(editors).HandbookofEmotionElicitationandAssessment, London: Oxford University Press. pp. 29–46. Acknowledgements Bukowski, H. (2018). The neural correlates of visual perspective tak- The authors would like to thank the Brain and Creativity Insitute ing: a critical review. Current Behavioral Neuroscience Reports, 5(3), for its continued support on this project. 189–97. A. G. Vaccaro et al. 1089 Bush, K.A., Privratsky, A., Gardner, J., Zielinski, M.J., Kilts, C.D. (2018). Finn, E.S., Scheinost, D., Finn, D.M., Shen, X., Papademetris, X., Common functional brain states encode both perceived emo- Constable, R.T.(2017).Canbrainstatebemanipulatedtoempha- tion and the psychophysiological response to affective stimuli. size individual differences in functional connectivity? Neuroim- Scientific Reports , 8(1), 1–10. age, 160, 140–51. Carlson, T.A., Wardle, S.G. (2015). 
Sensible decoding. Neuroimage, Fotopoulou, A., Tsakiris, M. (2017). Mentalizing homeostasis: the 110, 217–8. social origins of interoceptive inference. Neuropsychoanalysis, Carpenter, J.M., Green, M.C., Vacharkulksemsuk, T. (2016). Beyond 19(1), 3–28. perspective-taking:mind-readingmotivation.MotivationandEmo- Gessell, B., Geib, B., De Brigard, F. (2021). Multivariate pattern anal- tion, 40(3), 358–74. ysis and the search for neural representations. Synthese, 199, Celeghin, A., Diano, M., Bagnis, A., Viola, M., Tamietto, M. 12869–89. (2017). Basic emotions in human neuroscience: neuroimaging Hanke, M., Halchenko, Y.O., Sederberg, P.B., Hanson, S.J., Haxby, and beyond. Frontiers in Psychology, 8, 1432. J.V., & Pollmann, S. (2009). PyMVPA: a Python toolbox for Christov-Moore, L., Iacoboni, M. (2016). Self-other resonance, its multivariate pattern analysis of fMRI data. Neuroinformatics, 7, control and prosocial inclinations: brain–behavior relationships. 37–53. Human Brain Mapping, 37(4), 1544–58. Healey, M.L., Grossman, M. (2018). Cognitive and affective Clark-Polner, E., Johnson, T.D., Barrett, L.F. (2017). Multivoxel pat- perspective-taking: evidence for shared and dissociable anatom- tern analysis does not provide evidence to support the existence ical substrates. Frontiers in Neurology, 9, 491. of basic emotions. Cerebral Cortex (New York, N.Y.: 1991), 27(3), Hill, C.L., Updegraff, J.A. (2012). Mindfulness and its relationship to 1944–8. emotional regulation. Emotion, 12(1), 81. Corradi-Dell’Acqua, C., Hofstetter, C., Vuilleumier, P. (2014). Cogni- Hynes, C.A., Baird, A.A., Grafton, S.T. (2006). Differential role of the tive and affective theory of mind share the same local patterns orbital frontal lobe in emotional versus cognitive perspective- ofactivityinposteriortemporalbutnotmedialprefrontalcortex. taking. Neuropsychologia, 44(3), 374–83. Social Cognitive and Affective Neuroscience, 9(8), 1175–84. Israelashvili, J., Oosterwijk, S., Sauter, D., Fischer, A. 
(2019). Knowing Coutanche, M.N. (2013). Distinguishing multi-voxel patterns and me,knowingyou:emotiondifferentiationinoneselfisassociated mean activation: why, how, and what does it tell us? Cognitive, with recognition of others’ emotions. Cognition & Emotion, 33(7), Affective & Behavioral Neuroscience, 13(3), 667–73. 1461–71. Coutanche, M.N., Thompson-Schill, S.L., Schultz, R.T. (2011). Multi- Jenkinson, M., Smith, S. (2001). A global optimisation method for voxel pattern analysis of fMRI data predicts clinical symptom robust affine registration of brain images. Medical Image Analysis, severity. Neuroimage, 57(1), 113–23. 5(2), 143–56. Critchley, H.D. (2009). Psychophysiology of neural, cognitive and Kaplan, J.T., Meyer, K. (2012). Multivariate pattern analysis reveals affective integration: fMRI and autonomic indicants. International common neural patterns across individuals during touch obser- Journal of Psychophysiology, 73(2), 88–94. vation. Neuroimage, 60(1), 204–12. Damasio, A., Carvalho, G.B.(2013).Thenatureoffeelings: evolution- Kassam, K.S., Markey, A.R., Cherkassky, V.L., Loewenstein, G., aryandneurobiologicalorigins.NatureReviewsNeuroscience,14(2), Just, M.A. (2013). Identifying emotions on the basis of neural 143–52. activation. PLoS One, 8(6), e66032. Damasio, A.R. (1996). The somatic marker hypothesis and the pos- Keysers, C., Gazzola, V. (2007). Integrating simulation and theory sible functions of the prefrontal cortex. Philosophical Transactions of mind: from self to social cognition. Trends in Cognitive Sciences, of the Royal Society of London. Series B, Biological Sciences, 351(1346), 11(5), 194–6. 1413–20. Kim, J., Schultz, J., Rohe, T., Wallraven, C., Lee, S.-W., Bülthoff, H.H. Davis, T., LaRocque, K.F., Mumford, J.A., Norman, K.A., Wagner, A.D., (2015). Abstract representations of associated emotions in the Poldrack, R.A. (2014). What do differences between multi-voxel human brain. Journal of Neuroscience, 35(14), 5655–63. 
and univariate analysis mean? How subject-, voxel-, and trial- Kober, H., Barrett, L.F., Joseph, J., Bliss-Moreau, E., Lindquist, K., level variance impact fMRI analysis. Neuroimage, 97, 271–83. Wager, T.D. (2008). Functional grouping and cortical–subcortical Decety, J., Grèzes, J. (2006). The power of simulation: imagining one’s interactions in emotion: a meta-analysis of neuroimaging stud- own and other’s behavior. Brain Research, 1079(1), 4–14. ies. Neuroimage, 42(2), 998–1031. Decety, J., Meltzoff, A.N. (2011). Empathy, imitation, and the social Kragel, P.A., LaBar, K.S. (2015). Multivariate neural biomarkers of brain. In: Coplin, A., Goldie, P., editors. Empathy: Philosophical and emotional states are categorically distinct. Social Cognitive and Psychological Perspectives. Oxford University Press, 58–81. Affective Neuroscience, 10(11), 1437–48. Dukes, D., Abrams, K., Adolphs, R., et al. (2021). The rise of affec- Kragel,P.A.,LaBar,K.S.(2016).Decodingthenatureofemotioninthe tivism. Nature Human Behaviour, 5, 816–20. brain. Trends in Cognitive Sciences, 20(6), 444–55. D’Argembeau, A., Xue, G., Lu, Z.-L., Van der Linden, M., Bechara, A. Kriegeskorte, N., Douglas, P.K. (2019). Interpreting encoding and (2008). Neural correlates of envisioning emotional events in the decoding models. Current Opinion in Neurobiology, 55, 167–79. near and far future. Neuroimage, 40(1), 398–407. Lamm, C., Bukowski, H., Silani, G. (2016). From shared to distinct Eckland, N.S., Leyro, T.M., Mendes, W.B., Thompson, R.J. (2018). self-other representations in empathy: evidence from neurotyp- A multi-method investigation of the association between emo- ical function and socio-cognitive disorders. Philosophical Trans- tional clarity and empathy. Emotion, 18(5), 638. actions of the Royal Society of London. Series B, Biological Sciences, Engelen, T., de Graaf, T.A., Sack, A.T., de Gelder, B. (2015). A causal 371(1686), 20150083. role for inferior parietal lobule in emotion body perception. 
Cor- Meyer, K., Kaplan, J.T., Essex, R., Webber, C., Damasio, H., tex, 73, 195–202. Damasio, A.(2010).Predictingvisualstimulionthebasisofactiv- Erbas, Y., Sels, L., Ceulemans, E., Kuppens, P. (2016). Feeling me, ity in auditory cortices. Nature Neuroscience, 13(6), 667–8. feeling you: the relation between emotion differentiation and Mischkowski, D., Crocker, J., Way, B.M. (2019). A social analgesic? empathic accuracy. Social Psychological and Personality Science, 7(3), Acetaminophen (Paracetamol) reduces positive empathy. Fron- 240–7. tiers in Psychology, 10, 538. 1090 Social Cognitive and Affective Neuroscience, 2022, Vol. 17, No. 12 Morawetz, C., Bode, S., Baudewig, J., Jacobs, A.M., Heekeren, H.R. Sachs, M.E., Habibi, A., Damasio, A., Kaplan, J.T. (2018). Decod- (2016).Neuralrepresentationofemotionregulationgoals.Human ing the neural signatures of emotions expressed through sound. Brain Mapping, 37(2), 600–20. Neuroimage, 174, 1–10. Naselaris, T., Kay, K.N., Nishimoto, S., Gallant, J.L. (2011). Encoding Scarantino, A. (2012). Functional specialization does not require a and decoding in fMRI. Neuroimage, 56(2), 400–10. one-to-onemappingbetweenbrainregionsandemotions.Behav- Nummenmaa, L., Saarimaki, H. (2019). Emotions as discrete pat- ioral and Brain Sciences, 35(3), 161. terns of systemic activity. Neuroscience Letters, 693, 3–8. Schacter, D.L., Benoit, R.G., Szpunar, K.K. (2017). Episodic future Ochsner,K.N.,Knierim,K.,Ludlow,D.H.,etal.(2004).Reflectingupon thinking:mechanismsandfunctions.CurrentOpinioninBehavioral feelings: anfMRIstudyofneuralsystemssupportingtheattribu- Sciences, 17, 41–50. tion of emotion to self and other. Journal of Cognitive Neuroscience, Sebastian, C.L., Fontaine, N.M., Bird, G., et al. (2012). Neural pro- 16(10), 1746–72. cessingassociatedwithcognitiveandaffectiveTheoryofMindin Ogg, M., Moraczewski, D., Kuchinsky, S.E., Slevc, L.R. (2019). Sepa- adolescents and adults. 
Social Cognitive and Affective Neuroscience, rable neural representations of sound sources: speaker identity 7(1), 53–63. and musical timbre. Neuroimage, 191, 116–26. Seitz, R.J., Nickel, J., Azari, N.P. (2006). Functional modularity of Oosterwijk, S., Snoek, L., Rotteveel, M., Barrett, L.F., Scholte, H.S. the medial prefrontal cortex: involvement in human empathy. (2017). Shared states: using MVPA to test neural overlap between Neuropsychology, 20(6), 743. self-focused emotion imagery and other-focused emotion under- Singer, T., Seymour, B., O’doherty, J., Kaube, H., Dolan, R.J., Frith, C.D. standing.SocialCognitiveandAffectiveNeuroscience,12(7),1025–35. (2004). Empathy for pain involves the affective but not sensory Paquette, S., Takerkart, S., Saget, S., Peretz, I., Belin, P. (2018). components of pain. Science, 303(5661), 1157–62. Cross-classification of musical and vocal emotions in the audi- Skerry, A.E., Saxe, R. (2014). A common neural code for per- tory cortex. Annals of the New York Academy of Sciences, 1423, ceived and inferred emotion. Journal of Neuroscience, 34(48), 329–37. 15997–6008. Parkinson, B., Manstead, A.S. (2015). Current emotion research in Smith, R., Lane, R.D. (2015). The neural basis of one’s own conscious social psychology: thinking about emotions and other people. and unconscious emotional states. Neuroscience and Biobehavioral Emotion Review, 7(4), 371–80. Reviews, 57, 1–29. Peelen, M.V., Atkinson, A.P., Vuilleumier, P. (2010). Supramodal rep- Smith, S.M., Jenkinson, M., Woolrich, M.W., et al. (2004). Advances resentationsofperceivedemotionsinthehumanbrain.Journalof in functional and structural MR image analysis and implemen- Neuroscience, 30(30), 10127–34. tation as FSL. NeuroImage, 23(S1): 208–19. Poppa, T., Bechara, A. (2018). The somatic marker hypothesis: Tettamanti,M.,Rognoni,E.,Cafiero,R.,Costa,T.,Galati,D.,Perani, D. revisiting the role of the ‘body-loop’ in decision-making. Current (2012). 
Distinct pathways of neural coupling for different basic Opinion in Behavioral Sciences, 19, 61–6. emotions. Neuroimage, 59(2), 1804–17. Raizada, R.D., Kriegeskorte, N. (2010). Pattern-information fMRI: Thompson, R.J., Boden, M.T. (2019). State emotional clarity and new questions which it opens up, and challenges which face it. attention to emotion: a naturalistic examination of their asso- International Journal of Imaging Systems Technology, 20, 31–41. ciations with each other, affect, and context. Cognition & Emotion, Raizada, R.D., Tsao, F.M., Liu, H.M., Holloway, I.D., Ansari, D., 33(7), 1514–22. Kuhl, P.K. (2010). Linking brain-wide multivoxel activation pat- Vaccaro, A.G., Kaplan, J.T., Damasio, A. (2020). Bittersweet: the neu- terns to behaviour: examples from language and math. Neuroim- roscienceofambivalentaffect.PerspectivesonPsychologicalScience, age, 51(1), 462–71. 15(5), 1187–99. Ritchie, J.B., Kaplan, D.M., Klein, C. (2019). Decoding the brain: neu- van der Heiden, L., Scherpiet, S., Konicar, L., Birbaumer, N., ralrepresentationandthelimitsofmultivariatepatternanalysis Veit, R. (2013). Inter-individual differences in successful perspec- in cognitive neuroscience. The British Journal for the Philosophy of tive taking during pain perception mediates emotional respon- Science, 70(2), 581–607. siveness in self and others: an fMRI study. Neuroimage, 65, Roy, M., Shohamy, D., Wager, T.D. (2012). Ventromedial prefrontal- 387–94. subcortical systems and the generation of affective meaning. Wicker, B., Perrett, D.I., Baron-Cohen, S., Decety, J. (2003). Being the Trends in Cognitive Sciences, 16(3), 147–56. target of another’s emotion: a PET study. Neuropsychologia, 41(2), Rütgen, M., Seidel, E.-M., Silani, G., et al. (2015). Placebo analge- 139–46. sia and its opioidergic regulation suggest that empathy for pain Winecoff, A., Clithero, J.A., Carter, R.M., Bergman, S.R., is grounded in self pain. Proceedings of the National Academy of Wang, L., Huettel, S.A. 
(2013). Ventromedial prefrontal cor- Sciences, 112(41), E5638–46. tex encodes emotional value. Journal of Neuroscience, 33(27), Rütgen, M., Wirth, E.-M., Riecanský, I., et al. (2021). Beyond sharing 11032–9. unpleasant affect—evidence for pain-specific opioidergic modu- Winkler, A.M., Ridgway, G.R., Webster, M.A., Smith, S.M., Nichols, lation of empathy for pain. Cerebral Cortex, 31(6), 2773–86. T.E. (2014). Permutation inference for the general linear model. Saarimaki, H., Ejtehadian, L.F., Glerean, E., et al. (2018). Distributed Neuroimage, 92, 381–97. affective space represents multiple emotion categories across Zhou,F.,Li,J.,Zhao,W.,etal.(2020).Empathicpainevokedbysensory the human brain. Social Cognitive and Affective Neuroscience, 13(5), andemotional-communicativecuessharecommonandprocess- 471–82. specific neural representations. Elife, 9, e56929. Saarimaki, H., Gotsopoulos, A., Jaaskelainen, I.P., et al. (2016). Dis- Zhou, F., Zhao, W., Qi, Z., et al. (2021). A distributed fMRI-based crete neural signatures of basic emotions. Cerebral Cortex, 26(6), signature for the subjective experience of fear. Nature Communi- 2563–73. cations, 12(1), 1–16.
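The individual-differences logic discussed in the Conclusion, estimating how well a classifier separates affective states from a participant's local activity patterns and then relating that per-participant accuracy to a trait score, can be sketched in code. The following is a minimal illustration on synthetic data: the single-ROI simplification, scikit-learn classifier, simulation parameters, and variable names are all assumptions of the example, not the authors' actual pipeline.

```python
# Illustrative sketch (synthetic data): relate per-participant MVPA
# classification accuracy in one region to a trait score, as in an
# individual-differences analysis. All parameters are hypothetical.
import numpy as np
from scipy.stats import spearmanr
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_subjects, n_trials, n_voxels, n_states = 20, 80, 50, 4

trait_pt = rng.normal(3.5, 0.6, n_subjects)  # hypothetical PT questionnaire scores
accuracies = []
for s in range(n_subjects):
    labels = np.tile(np.arange(n_states), n_trials // n_states)
    patterns = rng.normal(0.0, 1.0, (n_trials, n_voxels))
    # Build in the effect of interest: pattern separability scales with trait PT
    patterns[np.arange(n_trials), labels] += 0.3 * trait_pt[s]
    # 5-fold cross-validated 4-way classification accuracy for this subject
    acc = cross_val_score(LinearSVC(random_state=0), patterns, labels, cv=5).mean()
    accuracies.append(acc)

# Across subjects: does trait PT track decoding accuracy?
rho, p = spearmanr(trait_pt, accuracies)
print(f"mean accuracy {np.mean(accuracies):.2f} (chance 0.25), rho = {rho:.2f}")
```

In a real searchlight analysis, `patterns` would instead be trial-wise BOLD estimates within each searchlight sphere, the accuracy map would be computed per sphere, and the accuracy–trait association would require permutation-based correction for multiple comparisons.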

Journal: Social Cognitive and Affective Neuroscience (Oxford University Press)

Published: May 17, 2022
