It penetrates our consciousness, makes us tap our feet, and sets the pulse of our words. Neither music nor language can do without it: rhythm. But how exactly do people perceive rhythm? What happens in the brain when we listen to music? Are melodic rhythms processed differently from linguistic ones? Dr. Alan Langus, Marie Curie Fellow at the Linguistics Department of the University of Potsdam, is looking for answers to these questions.
“Don't stop me now!” sings Freddie Mercury. We involuntarily follow the beat: bam, bam-bam, bam. There it is: the rhythm. We can feel it and tap it on the edge of the table. Most people can easily pick out the rhythm in a piece of music. It is just as present in speech, though perhaps less consciously so. Cognitive scientist Alan Langus is convinced that language and music have a lot in common when it comes to rhythm. He is interested in the extent to which speech and music rhythms are processed similarly in the brain. Does this happen in the same areas of the brain? What about our ability to synchronize our movements with rhythms? Do babies process rhythmic patterns differently than adults? Langus gets to the bottom of these and other questions in the project "Rhythmsync: Rhythm synchronization between music and spoken language", funded by the European Union.
For this purpose, over 100 young adults and 100 six- to seven-month-old babies took part in an eye-tracking experiment that Langus developed at the BabyLab led by Prof. Barbara Höhle. For this experiment, the cognitive scientist made use of a fascinating phenomenon: “I found that the pupils dilate and contract in sync with the rhythm being played. This enables us to measure how people perceive rhythm.” It is a groundbreaking discovery that opens up new research paths. In the experiment, however, the researcher did not play whole sentences or composed music to the test subjects. Instead, they heard so-called artificial stimuli: individual computer-generated syllables or notes of different lengths. During the experiment, the test subjects looked at a screen so that the eye tracker could focus on their pupils. At 120 images per second, it recorded every change. Since babies cannot keep still for very long, their sessions were usually limited to five to ten minutes; adults were tested for 40 minutes. As later became clear, the data was so robust that 20 minutes would have been enough, because the pupils always widened and narrowed exactly in time with the sounds played. Langus concludes that the brain synchronizes equally with speech and music, processing rhythms in both in the same way. He is sure: “It is irrelevant to the brain where the rhythm comes from. The brain always perceives it in the same way, regardless of the transmitted information or the source.” It is not yet clear, however, where exactly in the flow of speech or melody the rhythm sits. The researcher assumes that it is carried by vowels, i.e. that it begins shortly before a vowel; in singing, a note works like a vowel as a rhythmic anchor. This is indicated by the behavior of the pupils.
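The kind of synchrony measured here can be illustrated with a toy analysis. The sketch below is a hypothetical example, not the study's actual pipeline: it simulates a pupil-size trace sampled at 120 Hz (the eye tracker's rate mentioned above) that follows an assumed 2 Hz beat, and quantifies the coupling with a cross-correlation. All function names and parameters other than the sampling rate are assumptions.

```python
import numpy as np

FS = 120          # eye-tracker sampling rate in Hz, as in the article
DURATION = 10.0   # seconds of recording (hypothetical)
BEAT_HZ = 2.0     # stimulus rhythm: two beats per second (hypothetical)

t = np.arange(0, DURATION, 1 / FS)

# Stimulus envelope: a sinusoid at the beat frequency stands in for
# the train of computer-generated syllables or notes.
stimulus = np.sin(2 * np.pi * BEAT_HZ * t)

# Simulated pupil trace: follows the beat with a 50 ms lag plus noise.
rng = np.random.default_rng(0)
pupil = 0.8 * np.sin(2 * np.pi * BEAT_HZ * (t - 0.05)) \
        + 0.2 * rng.standard_normal(t.size)

def rhythm_synchrony(pupil, stimulus, fs):
    """Approximate correlation between pupil size and stimulus
    envelope, taken at the lag where their cross-correlation peaks."""
    p = (pupil - pupil.mean()) / pupil.std()
    s = (stimulus - stimulus.mean()) / stimulus.std()
    xcorr = np.correlate(p, s, mode="full") / p.size
    k = np.argmax(np.abs(xcorr))
    lag = (k - (p.size - 1)) / fs   # seconds; positive = pupil lags
    return np.abs(xcorr)[k], lag

score, lag = rhythm_synchrony(pupil, stimulus, FS)
print(f"synchrony score: {score:.2f}, best lag: {lag * 1000:.0f} ms")
```

With the simulated lagging pupil trace, the score comes out close to 1 and the recovered lag close to the built-in 50 ms, which is the sense in which pupil data can reveal entrainment to a rhythm.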
But why do the pupils change in sync with the rhythm? In the experiment, Langus discovered that they get bigger when the beat changes unexpectedly. It has been known for about 60 years that pupils dilate in response to fear or strong emotions. The dilation lets more light into the eye, enabling a better overview of the situation, explains the researcher. “The same physiological process can be observed with rhythmic changes. A different beat is perceived, and the brain tries to clarify this change through the eye.” It is a basic process that is already present in early childhood: the experiment showed no significant differences between babies and adults. Gender does not seem to have any discernible influence on the processing either. The mother tongue, however, plays an important role. “A special feature of the German language is that it uses word stress to distinguish words. It is therefore assumed that Germans recognize different rhythms more easily than, for example, the French or Hungarians, in whose languages stress does not play the same role,” Langus says.
Langus is convinced that rhythm has a big impact on human behavior. Conversation partners, for example, unconsciously imitate the rhythm of each other's words and try to maintain that rhythm as they take turns speaking, which signals harmony with the interlocutor. In his project “Rhythmsync”, Langus has already made many discoveries that are important for rhythm research. His research continues, however, because some exciting questions still need to be answered: Is the synchronization of rhythm and brain more precise in musical people than in less musical ones? Does it work the same way with natural stimuli as with artificial ones? What influence do rhythm patterns, tempo, music genre, and different ways of speaking have?
Dr. Alan Langus, born in Estonia, studied cognitive science and psychology in Bremen and Amsterdam. He earned his doctorate at the Scuola Internazionale Superiore di Studi Avanzati in Trieste, Italy. Since July 2016, he has been a postdoctoral researcher at the Linguistics Department of the University of Potsdam.
Rhythmsync: Rhythm synchronization between music and spoken language
Duration: April 2017 – March 2019
Funding: European Union (Horizon 2020)
This text was published in the university magazine Portal Wissen - One 2020 "Energy".