Quantifying the adequacy of neural representations for a cross-language phonetic discrimination task: Prediction of individual differences

Rajeev D.S. Raizada, Feng Ming Tsao, Huei Mei Liu, Patricia K. Kuhl

Research output: Contribution to journal › Article

54 Citations (Scopus)

Abstract

In order for stimuli to be perceptually discriminable, their representations in the brain must be distinct. Investigating the task of discriminating the syllables /ra/ and /la/, we hypothesized that the more distinct a person's neural representations of those sounds were, the better their behavioral ability to discriminate them would be. Standard neuroimaging approaches are ill-suited to testing this hypothesis as they have problems differentiating between neural representations spatially intermingled within the same brain area. We therefore performed multi-voxel pattern-based analysis of the functional magnetic resonance imaging (fMRI) activity elicited by these syllables, in native speakers of English and Japanese. In right primary auditory cortex, the statistical separability of these fMRI patterns predicted subjects' behavioral ability to tell the sounds apart, not only across groups but also across individuals. This opens up a new approach for identifying neural representations and for quantifying their task suitability.
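The method summarized above is multi-voxel pattern analysis: a classifier (the keywords name an SVM) is trained on the fMRI voxel patterns evoked by /ra/ and /la/, and the statistical separability of those patterns is related to each subject's behavioral discrimination. As a rough, hypothetical illustration of that idea only (not the authors' published pipeline), the sketch below uses scikit-learn to compute cross-validated linear-SVM accuracy as a per-subject separability score on simulated voxel patterns and correlates it with a simulated behavioral measure; all data, parameter choices, and variable names here are assumptions for illustration.

# Hypothetical sketch of the general approach described in the abstract:
# cross-validated linear-SVM accuracy over multi-voxel patterns serves as a
# "statistical separability" score per subject, which is then correlated with
# behavioral discrimination ability. Data are simulated; this is not the
# authors' actual analysis pipeline.
import numpy as np
from scipy.stats import pearsonr
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def separability_score(patterns, labels):
    """Cross-validated SVM accuracy on voxel patterns (higher = more separable)."""
    clf = SVC(kernel="linear", C=1.0)
    return cross_val_score(clf, patterns, labels, cv=5).mean()

n_subjects, n_trials, n_voxels = 20, 80, 50
scores, behavior = [], []
for s in range(n_subjects):
    # Simulated voxel patterns for /ra/ trials (label 0) vs. /la/ trials (label 1),
    # with a subject-specific separation that also drives the simulated behavior.
    separation = rng.uniform(0.0, 1.0)
    labels = np.repeat([0, 1], n_trials // 2)
    patterns = rng.normal(size=(n_trials, n_voxels)) + separation * labels[:, None]
    scores.append(separability_score(patterns, labels))
    behavior.append(separation + rng.normal(scale=0.2))  # simulated discrimination score

r, p = pearsonr(scores, behavior)
print(f"Across-subject correlation between neural separability and behavior: r={r:.2f}, p={p:.3f}")

In this sketch, higher cross-validated accuracy stands in for "more distinct neural representations"; with real data one would instead use per-trial activity patterns from right primary auditory cortex and a behavioral discrimination measure, as in the study.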

Original language: English
Pages (from-to): 1-12
Number of pages: 12
Journal: Cerebral Cortex
Volume: 20
Issue number: 1
DOIs: 10.1093/cercor/bhp076
Publication status: Published - 2010 Jan 1

Fingerprint

  • Phonetics
  • Aptitude
  • Individuality
  • Language
  • Magnetic Resonance Imaging
  • Auditory Cortex
  • Brain
  • Population Groups
  • Neuroimaging

Keywords

  • Auditory cortex
  • Classifier
  • Nonnative language
  • Pattern-based fMRI
  • SVM

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Cellular and Molecular Neuroscience

Cite this

Quantifying the adequacy of neural representations for a cross-language phonetic discrimination task: Prediction of individual differences. / Raizada, Rajeev D.S.; Tsao, Feng Ming; Liu, Huei Mei; Kuhl, Patricia K.

In: Cerebral Cortex, Vol. 20, No. 1, 01.01.2010, p. 1-12.


@article{0ed887fba1c845c2a5846ca31c9e4e8b,
title = "Quantifying the adequacy of neural representations for a cross-language phonetic discrimination task: Prediction of individual differences",
abstract = "In order for stimuli to be perceptually discriminable, their representations in the brain must be distinct. Investigating the task of discriminating the syllables /ra/ and /la/, we hypothesized that the more distinct a person's neural representations of those sounds were, the better their behavioral ability to discriminate them would be. Standard neuroimaging approaches are ill-suited to testing this hypothesis as they have problems differentiating between neural representations spatially intermingled within the same brain area. We therefore performed multi-voxel pattern-based analysis of the functional magnetic resonance imaging (fMRI) activity elicited by these syllables, in native speakers of English and Japanese. In right primary auditory cortex, the statistical separability of these fMRI patterns predicted subjects' behavioral ability to tell the sounds apart, not only across groups but also across individuals. This opens up a new approach for identifying neural representations and for quantifying their task suitability.",
keywords = "Auditory cortex, Classifier, Nonnative language, Pattern-based fMRI, SVM",
author = "Raizada, {Rajeev D.S.} and Tsao, {Feng Ming} and Liu, {Huei Mei} and Kuhl, {Patricia K.}",
year = "2010",
month = "1",
day = "1",
doi = "10.1093/cercor/bhp076",
language = "English",
volume = "20",
pages = "1--12",
journal = "Cerebral Cortex",
issn = "1047-3211",
publisher = "Oxford University Press",
number = "1",
}
