Understanding how brains learn speech requires starting from before babies have even uttered their first word, according to UBC experts.
Dr. Janet Werker, University Killam Professor and Canada Research Chair in psychology, and Dr. Alexis Black, an associate professor in the School of Audiology, research how infants acquire language. Black said a focus on early life is essential, as language acquisition requires us to “make the line all the way up [from babies] to adulthood.”
Babies’ interaction with their environment slowly helps develop their ability to babble and eventually speak coherent sentences.
Werker researches infants and their developing communication systems. Her work suggests the brain begins learning language before birth, with newborn brains responding differently to languages they were exposed to in the womb.
According to Werker, the responses differ in monolingual and multilingual babies. Newborns do not have any sophisticated communication method, but they do have a sucking reflex. Werker’s study found babies suck on pacifiers harder and faster when they hear the same languages they did before birth.
“Monolingual babies … will suck more to listen to the language that they’ve been hearing [in the womb] and bilingual exposed babies will listen more to both of the languages that they heard,” said Werker.
Werker noted speech perception is multi-sensory, saying “babies not only listen to but also watch your mouth when you’re talking, and encode and represent that.”
Word to the young
Black completed her PhD at UBC, where she focused on statistical learning, which is “the capacity that organisms … have for detecting structure in our environment.” Applying this to linguistics, she looked at how babies pick up sound patterns when listening to speech.
One of her studies exposed babies to an artificial language and recorded their brain waves using an electroencephalogram (EEG).
Neural oscillations initially occurred at every input sound but, after about two minutes, “the brain also started having slower oscillations at [multi-syllabic] rates.”
This indicated that babies were starting to find implicit patterns and shift from perceiving isolated syllables to recognizing words.
In the brain
Two major brain areas are implicated in language processing: Broca’s area is associated with speech production, while Wernicke’s area is responsible for language comprehension. Connections between these regions strengthen over time, helping children develop language acquisition abilities.
Black uses neuroimaging methods to understand what goes on in the brain as babies learn to communicate. She primarily uses EEG and functional near-infrared spectroscopy (fNIRS) to ask different questions about how the brain acquires language.
Werker explained that fNIRS is “the least invasive and the safest and easiest to use with infants.” fNIRS gives insight into brain activity by detecting the absorption of near-infrared light by hemoglobin in the blood.
These techniques help researchers untangle the complexity of human language.
“I want to understand what our biases and preferences are as a species as soon as we get started, and that’s why I study language, because that’s critical to humans,” said Werker.
This article is part of The Ubyssey's neuroscience supplement, Big Brain Time. Pick up our latest print issue on campus to read the full supplement.