dc.contributor.author: Roswandowitz, Claudia
dc.contributor.author: Swanborough, Huw
dc.contributor.author: Frühholz, Sascha
dc.date.accessioned: 2021-03-05T11:09:56Z
dc.date.available: 2020-12-23T08:33:16Z
dc.date.available: 2020-12-23T10:07:15Z
dc.date.available: 2021-03-05T11:08:54Z
dc.date.available: 2021-03-05T11:09:56Z
dc.date.issued: 2021-04-01
dc.identifier.other: 10.1002/hbm.25309 [en_US]
dc.identifier.uri: http://hdl.handle.net/20.500.11850/458293
dc.identifier.doi: 10.3929/ethz-b-000458293
dc.description.abstract: Voice signals are relevant for auditory communication and are thought to be processed in dedicated auditory cortex (AC) regions. While recent reports highlighted an additional role of the inferior frontal cortex (IFC), a detailed description of the integrated functioning of the AC–IFC network and its task relevance for voice processing is missing. Using neuroimaging, we tested sound categorization in which human participants focused either on the higher‐order vocal‐sound dimension (voice task) or on the feature‐based intensity dimension (loudness task) while listening to the same sound material. We found differential involvement of the AC and IFC depending on the task performed and on whether the voice dimension was task relevant. First, when comparing neural vocal‐sound processing in our task‐based design with previously reported passive‐listening designs, we observed highly similar cortical activations in the AC and IFC. Second, during task‐based vocal‐sound processing we observed voice‐sensitive responses in the AC and IFC, whereas intensity processing was restricted to distinct AC regions. Third, the IFC flexibly adapted to the vocal sounds' task relevance, being active only when the voice dimension was task relevant. Fourth and finally, connectivity modeling revealed that vocal signals, independent of their task relevance, provided significant input to bilateral AC. However, only when attention was on the voice dimension did we find significant modulations of auditory‐frontal connections. Our findings suggest that an integrated auditory‐frontal network is essential for behaviorally relevant vocal‐sound processing. The IFC seems to be an important hub of the extended voice network when representing higher‐order vocal objects and guiding goal‐directed behavior. [en_US]
dc.format: application/pdf [en_US]
dc.language.iso: en [en_US]
dc.publisher: Wiley [en_US]
dc.rights.uri: http://creativecommons.org/licenses/by-nc/4.0/
dc.subject: Auditory-frontal network [en_US]
dc.subject: DCM [en_US]
dc.subject: Decision-making [en_US]
dc.subject: fMRI [en_US]
dc.subject: Voice [en_US]
dc.title: Categorizing human vocal signals depends on an integrated auditory‐frontal cortical network [en_US]
dc.type: Journal Article
dc.rights.license: Creative Commons Attribution-NonCommercial 4.0 International
dc.date.published: 2020-12-08
ethz.journal.title: Human Brain Mapping
ethz.journal.volume: 42 [en_US]
ethz.journal.issue: 5 [en_US]
ethz.pages.start: 1503 [en_US]
ethz.pages.end: 1517 [en_US]
ethz.version.deposit: publishedVersion [en_US]
ethz.identifier.wos:
ethz.identifier.scopus:
ethz.publication.place: Hoboken, NJ [en_US]
ethz.publication.status: published [en_US]
ethz.date.deposited: 2020-12-23T08:33:20Z
ethz.source: WOS
ethz.eth: yes [en_US]
ethz.availability: Open access [en_US]
ethz.rosetta.installDate: 2021-03-05T11:09:08Z
ethz.rosetta.lastUpdated: 2022-03-29T05:38:30Z
ethz.rosetta.versionExported: true