Everyone knows that a language you don’t speak well goes by like a speeding bullet. The words rush past while your ears and brain try to grab hold of something, anything, that sounds familiar. Some languages, like Spanish and Japanese, seem faster than others, like English and Mandarin. But those who work closely with video know that English films don’t speed up when dubbed into Spanish or slow down when dubbed into Mandarin.
To investigate this puzzle, researchers from the Université de Lyon recruited 59 male and female volunteers who were native speakers of one of seven common languages: English, French, German, Italian, Japanese, Mandarin and Spanish, plus one not-so-common one: Vietnamese. Each volunteer read 20 different texts aloud in his or her native language into a recorder. The only edits made to the recordings were to remove long silences.
Next, the researchers counted all of the syllables in each recording and analyzed how much meaning (information density) was packed into each of those syllables. After more number crunching, the researchers had two critical values for each language: the information density of its syllables and the average number of syllables spoken per second in ordinary speech. Vietnamese served as the reference language for the other seven, with its syllables assigned an arbitrary value of 1.
The data revealed a tradeoff: the more information-dense a language’s average syllable, the fewer syllables its speakers uttered per second, and therefore the slower the speech sounded. Despite those differences, by the end of a given stretch of time, all of the languages had conveyed more or less identical amounts of information.
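The tradeoff can be seen with a little arithmetic: multiplying information density per syllable by syllables per second gives an information rate, and that product comes out roughly the same across languages. Here is a minimal sketch in Python; the density and rate figures below are invented for illustration and are not the study’s measurements.

```python
# Illustrative sketch of the tradeoff: if the information rate is
# roughly constant, syllable rate should fall as density rises.
# All numbers below are made up for illustration only.

languages = {
    "dense-syllable language": {"density": 0.90, "syllables_per_sec": 5.5},
    "mid-density language":    {"density": 0.70, "syllables_per_sec": 7.0},
    "light-syllable language": {"density": 0.55, "syllables_per_sec": 9.0},
}

for name, v in languages.items():
    # Information per second = (information per syllable) * (syllables per second)
    info_rate = v["density"] * v["syllables_per_sec"]
    print(f"{name}: {info_rate:.2f} information units per second")
```

Note how the slowest speech (5.5 syllables per second) and the fastest (9.0) land on nearly the same information rate, which is the pattern the researchers report.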
This study points to something that unites us: the shared machinery of speech generation and processing. What its neurological basis might be is another question, and scientists are surely pondering that as well. As the researchers wrote, “A tradeoff is operating between a syllable-based average information density and the rate of transmission of syllables.”