Even newborns recognize complex acoustic patterns
Published: Friday, Oct 25th 2024, 09:30
Humans are born with highly developed auditory abilities. This was shown by linguists in the journal "PLOS Biology". According to the study, even newborns can recognize rule-governed sound sequences in which non-adjacent signals are linked - as in language. Such complex acoustic patterns likely activate language-relevant networks in the brain from an early age.
Various studies have shown "the impressive abilities" of babies in the auditory field, as a linguist from the University of Vienna, together with colleagues from Zurich and Tokyo, explained in the journal.
According to these studies, babies can learn sequences of adjacent sounds or syllables. From the age of five months, they are able to identify rules in more complex sound sequences. This is particularly important for language acquisition, because rules linking non-adjacent acoustic elements allow meaningful connections across a distance - the basis for complex, hierarchical sentence structures.
Jutta Mueller from the Institute of Linguistics at the University of Vienna cites, in a press release, the sentence "I know that you didn't do your homework" as an example. In the German original of this sentence, the pronoun "du" ("you") and the grammatical verb ending "-st" depend on each other even though they are not adjacent.
Recognize rules
There have been many indications that the ability to recognize such rules between non-adjacent acoustic signals - whether as a tone sequence or in the form of speech - is innate. However, decisive proof of this has so far been lacking.
Together with the team led by Simon Townsend from the University of Zurich and Yasuyo Minagawa from Keio University in Tokyo, Mueller used near-infrared spectroscopy to observe the brain activity of newborns and six-month-old babies while they listened to complex tone sequences. The one- to five-day-old children were played special tone sequences for six minutes.
In each sequence, the first tone was linked to the non-adjacent third tone. The babies then heard tone sequences with the same pattern but at a different pitch; these test sequences either followed the learned rule or violated it.
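The stimulus design described above can be sketched as a simple AxB grammar: the first tone (A) predicts the non-adjacent third tone (B), while the middle tone varies freely. The tone names, pairings, and violation scheme below are illustrative assumptions for the sketch, not the actual stimuli used in the study:

```python
import random

# Hypothetical AxB non-adjacent dependency grammar: each A-tone is
# paired with one B-tone; the middle tone carries no dependency.
A_TO_B = {"A1": "B1", "A2": "B2"}
MIDDLE_TONES = ["x1", "x2", "x3"]

def make_sequence(valid=True, rng=random):
    """Generate a three-tone sequence that follows or violates the rule."""
    a = rng.choice(list(A_TO_B))
    x = rng.choice(MIDDLE_TONES)
    b = A_TO_B[a]
    if not valid:
        # Violation: substitute the B-tone belonging to the other A-tone,
        # so only the non-adjacent dependency is broken.
        b = next(v for k, v in A_TO_B.items() if k != a)
    return (a, x, b)

def follows_rule(seq):
    """Check whether the first and third tones respect the pairing."""
    a, _, b = seq
    return A_TO_B[a] == b
```

Discriminating `follows_rule` outputs for valid versus violation sequences corresponds to the contrast the researchers measured in the newborns' brain responses.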
"After a short learning phase, the brain activity of the newborns showed significant differences between correct and incorrect tone sequences in the left area of the prefrontal cortex," Mueller explained to the APA. The brain can therefore recognize these differences and react to complex patterns such as speech right from the start.
The extent to which the prefrontal cortex located behind the forehead reacted to incorrect tone sequences was linked to the activation of networks in the left hemisphere of the brain. These brain regions, such as the supramarginal gyrus (SMG) or the superior temporal gyrus (STG), are also important for speech processing.
Language-relevant networks activated
The researchers conclude that complex tone sequences activate language-relevant networks from the very beginning, which stabilize and react more specifically over the course of the first months of life. "The networking of brain areas during the learning process indicates that early learning experiences could be crucial for the formation of networks that later support the processing of complex acoustic patterns," says Mueller.
The new findings on the importance of complex environmental stimuli for brain development are particularly relevant when important stimuli are missing or cannot be processed well, for example in premature babies. Since non-linguistic acoustic signals evidently also engage language-relevant networks in the brain, this would open up opportunities for supporting early language development, for example through music.
©Keystone/SDA