Finishing Each Other’s Sentences: How the brain predicts speech

Much like predictive texting on your cell phone, our brains predict what’s coming next when someone is speaking. How do they make these predictions? A new PLOS Biology study reveals the mechanisms in the brain’s auditory cortex involved in processing speech and predicting upcoming words, and suggests that this process relies on mechanisms conserved across evolution.

Using an approach first developed for studying infant language learning, a team of neuroscientists led by Yuki Kikuchi and Chris Petkov of Newcastle University had humans and monkeys listen to sequences of spoken words from a made-up language. Both species were able to learn the predictive relationships between the spoken sounds in the sequences.

Neural recordings from the auditory cortex in the two species revealed how populations of neurons responded to the speech sounds and to the learned predictive relationships between them. The responses were remarkably similar across species, suggesting that the human auditory cortex processes speech using evolutionarily conserved mechanisms rather than ones that evolved uniquely in humans for speech or language.

“Being able to predict events is vital for so much of what we do every day,” Petkov notes. “Now that we know humans and monkeys share the ability to predict speech we can apply this knowledge to take forward research to improve our understanding of the human brain.”

Kikuchi elaborates, “In effect we have discovered the mechanisms for speech in your brain that work like predictive text on your mobile phone, anticipating what you are going to hear next. This could help us better understand what is happening when the brain fails to make fundamental predictions, such as in people with dementia or after a stroke.”

Building on these results, the team is working on projects to harness insights on predictive signals in the brain. These insights could aid development of new models to study how such signals go wrong in patients with stroke or dementia. The long-term goal is to identify strategies that yield more accurate prognoses and treatments for these patients.

Reference: Kikuchi Y, Attaheri A, Wilson B, Rhone AE, Nourski KV, Gander PE, et al. (2017) Sequence learning modulates neural responses and oscillatory coupling in human and monkey auditory cortex. PLoS Biol 15(4): e2000219. doi:10.1371/journal.pbio.2000219

Image Credit: Dr Y. Kikuchi et al. doi:10.1371/journal.pbio.2000219

Author

Tessa is an Editorial Media Associate at PLOS. She graduated from the University of California, Berkeley with degrees in Rhetoric and Music. She can be reached by email at tgregory@plos.org and on Twitter at @tessagregs.
