
Sound by Sound towards Words – LOLA Makes Language Learning Visible

Dr. Aude Noiray explains a mobile ultrasound device for testing babies. | Photo: Tobias Hopfgarten
Human language is a fairly complex thing - and so is the way it is spoken. Lungs, vocal cords, tongue, lips: many “tools” in our body must cooperate precisely so that, in the end, a simple word leaves our mouth. The linguist Dr. Aude Noiray is interested in this mechanism, but above all in how children acquire the motor, lexical, and phonological skills necessary to speak their native language fluently. To carry out this research, she founded the “Laboratory for Oral Language Acquisition”, or LOLA for short. Here, she uses state-of-the-art scientific methods to uncover how spoken language is acquired - a puzzle whose solution lies, quite literally, on the tip of our tongue and is yet so difficult to decipher.

In the beginning was the Word? That is still an empirical question. When babies start communicating with their caretakers, they may first produce simple vowel-like sounds. A simple coo that may even sound like screaming is sometimes the first step into oral communication: “Little by little, they then learn to coordinate their lungs with the speech articulators such as the jaw, lips, and tongue,” Noiray explains. “They are exploring the possibilities of their new tools. And they seem to have a lot of fun doing it!” Only when they have become familiar with these processes do babies begin to combine sounds with one another. Dadada, bababa - for parents, their children’s babbling is a source of great happiness, and it is often the first time they perceive that someone is practicing speaking. “It has been shown that this transition begins in many children, regardless of the languages they grow up with, around six to ten months of age,” Noiray says.

Looking into babies’ mouths

Aude Noiray has long been fascinated by the “attunement” of the motor system to language. She, however, faced two problems: Until a decade ago, it was difficult to examine what happens inside a child’s mouth. “We owe most of what we know about child articulation today to acoustic analyses or transcriptions of sound recordings,” she explains. To observe the oral cavity non-invasively, she combined established audio and video analyses with ultrasound imaging technology, which had hardly been used in linguistic research on young children and never on babies. Since no method existed for experimentally examining the vocal tract with ultrasound imaging, Noiray and her team designed one themselves: the Sonographic and Optical Linguo-Labial Articulatory Recording system, or SOLLAR for short. They mounted an ultrasound probe, the so-called transducer, which emits the ultrasound waves, on a spring-mounted frame. Participants place their chin on the probe so that it moves up and down with the lower jaw during speech. Positioned below the tongue, it can thus record an optimal image of tongue movement while children speak. The ultrasound data are displayed on a monitor, just as in medical examinations, and, of course, saved for later evaluation.

The device has already proven its value in various projects on language development in young children. “We were not sure whether we could examine three-year-olds,” says the researcher. “One of the biggest challenges was to capture their attention. We created a journey to the stars as a background story. We brainstormed to develop ideas, optimized the procedure - and it worked!”

Noiray and her team now go one step further - in fact, a step back. “When I wrote my first research proposal for a project with three-year-olds, colleagues told me, ‘This will never work.’ Today this has become completely normal. Thanks to the technical possibilities and the experience we have built up with young children, we are able to start at ever earlier stages of language development. And that starts in the first months of life.” In a current project, funded by the German Research Foundation (DFG), the LOLA researchers are investigating the phase during which the first dadada develops out of curious cooing and simple sounds turn into the first attempts at spoken language. “At the moment, we are primarily interested in what influences babies’ first speech attempts, and in when and how they process what is articulated by others.” The starting point of the investigation is an observation: During the first six months of life, children mainly look at the eyes of those around them. But research has shown that at some point, between the eighth and tenth month, they focus more frequently on people’s mouths, especially when the mouth is moving. “In fact, some babies look at the mouth even before they produce their first sound - as if they were anticipating and understanding, ‘This is an important place that can help me learn to speak myself,’” says Noiray.

Reading language from the mouth

The project aims to find out whether there is a connection between children’s shift of gaze from the eyes to the mouth area and their spoken language development. “It is quite conceivable that babies who start watching the mouth very early are more likely to start babbling earlier,” says Noiray. The main challenge in this study is to observe not just what happens in the mouth but also in the eyes. In addition to the audio, video, and ultrasound analyses, the team integrated eye tracking into the experiment to record eye movements. And again, the little test subjects prove to be a great challenge, as Noiray says. “Some children are distracted very quickly; others are so fascinated by the ultrasound screen or the transducer that they no longer look at the video being presented. Then the recordings are, of course, useless. We also can’t keep most of them at it for more than five minutes. That’s really short - and has forced us to adapt the whole procedure.” All babies are “warmed up” together with their parents in a 30-minute play phase before they are shown videos in which people begin to speak at some point. If the young participants allow it, their spoken communication is extensively documented - with a video camera, the ultrasound device, and a microphone. As soon as enough data have been collected, the researchers will evaluate them. “We are curious which connections we will find,” says Noiray. “Does the shift of gaze from the eyes to the mouth indicate the beginning of a new developmental stage of speech production? Or is it rather the other way around - do the children first start babbling, then discover the mouth of their interlocutor and prefer looking at the mouth instead of the eyes?”

So far, about 100 babies have been recorded, some even several times over a period of months. “A real stroke of luck,” Noiray says. “Because we can follow their language development in detail and relate it to the findings of our study.” This is also very important for potential applications of the research results, for example when treating language development disorders. “The more we know about how babies learn to speak, the better we can deal with delays or even disruptions to this development - perhaps even before they have a negative impact.”

The Researchers

Dr. Aude Noiray studied English Language, Literature, and Foreign Civilisation as well as Language Sciences at the Université Stendhal in Grenoble (France). Since 2012, she has been conducting research at the University of Potsdam - and is the group leader of LOLA, which was founded in January 2015.
Mail: anoiray@uni-potsdam.de

The Lab

The researchers at the Laboratory for Oral Language Acquisition (LOLA) study the development of language - from infancy to puberty. Research topics include the development of speech motor control; perceptual, phonological, and lexical development; the relationship between language planning and language production; and reading comprehension. In their experiments, they work with audio and video recordings, eye tracking, ultrasound imaging, and various speech and reading assessments.

https://www.uni-potsdam.de/lola/index.html

 

Dr. Aude Noiray gives a short introduction to the current research in LOLA in this video.

 

This text was published in the university magazine Portal Wissen - One 2020 “Energy”.