In contrast to the traditional view of newborns as passive creatures who mostly lie around and cry, a recent study published in Nature Human Behaviour shows that newborns begin soaking up and tuning into the specifics of the world around them within hours of birth, including the particular languages they will go on to speak.
Babies are known to start learning language by hearing speech while still in the womb, but there the sound reaches them muffled, as if heard underwater, so much of its detail is lost.
The study, whose international contributors included Gary Oppenheim and Guillaume Thierry of Bangor University's School of Human and Behavioural Sciences, worked with newborns starting within just minutes of their birth, using a combination of vowels played forward (that is, naturally) and played backward (a time-reversed version of the same sound).
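A "backward" stimulus in paradigms of this kind is typically just the same waveform reversed in time, which keeps the overall acoustic content while disrupting the natural temporal unfolding of speech. As a rough illustration only (not the study's actual stimulus-preparation code, and with hypothetical file names), a vowel recording could be time-reversed in a few lines of Python:

```python
# Illustrative sketch: time-reversing a vowel recording to create a
# "backward" stimulus, as commonly done in speech-perception experiments.
# File names are hypothetical; this is not the study's own code.
import soundfile as sf

# Load a (hypothetical) recording of a naturally spoken vowel.
vowel, sample_rate = sf.read("vowel_forward.wav")

# Reverse the samples along the time axis. This preserves the spectral
# content but destroys the natural temporal structure of the speech sound.
vowel_backward = vowel[::-1]

sf.write("vowel_backward.wav", vowel_backward, sample_rate)
```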
The researchers used optical imaging, a non-invasive form of neuroimaging, which involves shining tiny torches (i.e., flashlights) at the babies' scalps. Some of the light travels into the head and bounces back, and how much returns depends on what is going on inside the body, for example, how much oxygenated blood is in a given area of the brain.
To obtain accurate results, multiple torches were used, with their power and placement precisely controlled, along with highly sensitive light detectors that measure tiny changes in how much light bounces back.
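This is essentially the logic of functional near-infrared spectroscopy (fNIRS): changes in the amount of light reaching a detector are converted into changes in light attenuation, and readings at two or more wavelengths can then be combined (typically via the modified Beer-Lambert law) to estimate changes in oxygenated and deoxygenated blood. A minimal sketch of the attenuation step, using made-up detector values rather than anything from the study, might look like this:

```python
# Minimal sketch of how detected light intensities map onto a brain signal
# in optical imaging (fNIRS-style). All values are placeholders, not data
# or parameters from the study.
import numpy as np

def delta_optical_density(intensity, baseline_intensity):
    """Change in optical density: less light returning -> higher attenuation."""
    return -np.log10(intensity / baseline_intensity)

# Hypothetical detector readings over time for one torch-detector pair,
# at two near-infrared wavelengths (arbitrary units).
baseline = 1.00
intensity_760nm = np.array([1.00, 0.99, 0.97, 0.96])
intensity_850nm = np.array([1.00, 0.98, 0.95, 0.94])

d_od_760 = delta_optical_density(intensity_760nm, baseline)
d_od_850 = delta_optical_density(intensity_850nm, baseline)

# In practice, the two attenuation time courses are combined with the known
# absorption properties of oxy- and deoxy-haemoglobin (modified Beer-Lambert
# law) to estimate how oxygenated blood in that patch of cortex is changing.
print(d_od_760, d_od_850)
```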
Recordings of spoken vowels were played to the newborns, and their brains were monitored to see whether they responded differently when the same vowels were played backward versus forward. In the first test, the babies could not distinguish between forward and backward vowels, as the contrast is very subtle (even adults fail such a discrimination test 70% of the time).
After merely five hours of exposure to this contrast, optical imaging showed that the newborns’ brains started distinguishing between the two sounds. And after a further two hours, during which the newborns mostly slept, the exposure to the vowel contrast triggered a spurt of connectivity, with neurons talking to each other on a large scale, as if they had been inspired by the language sounds they heard.
Guillaume Thierry, professor of cognitive neuroscience, said, “Our research showed that a very subtle distinction—even for the adult ear—is enough to trigger a significant brain activity surge in the newborn’s brain, showing that early experiences have potentially major consequences for cognitive development.
“In other words, we should challenge the myth that babies are mostly unaware of their environment until after a few weeks, simply because they sleep a lot, and pay attention to what babies are exposed to from the moment when they are born.”
Gary Oppenheim, lecturer in psychology, added, “When my son was born, I was surprised to see that he was immediately alert, his eyes wide open and looking around to soak in information about his strange new environment (even though a newborn’s vision is known to be quite poor). The work that a newborn’s ears and auditory system are doing isn’t as obvious to the naked eye, but this spectacular result shows we have remarkable sensitivity to language information from the very moment we are born and we immediately set to work developing and refining it in response to our experiences in the world, even when we appear to be just sleeping.”
Bangor University