Neuroscientists have previously established that the brain's left hemisphere dominates language interpretation, whether the language is tonal or atonal, spoken or written, signed or clicked. Diana Deutsch, a psychologist specializing in musical illusions, even demonstrated the moment at which speech becomes song (you can hear about her work in this episode of Radiolab). But whistled languages blur that line; Onur Güntürkün of Germany's Ruhr University Bochum wondered if perhaps the brain "hears" kuş dili as music. Or, to put it more scientifically, are whistled languages less reliant on the brain's left hemisphere?

Though whistled languages are rare, they have been around for centuries.

Güntürkün and his team tested 31 fluent Turkish whistlers by playing different spoken syllables into the left and right ears at the same time (a technique called dichotic listening). Then they performed the same experiment using whistled syllables. Here's Nijhuis:

When he gave them spoken Turkish, the participants usually understood the syllable played through the right speaker, suggesting that the left hemisphere was processing the sound. When he switched to whistled Turkish, however, the participants understood both syllables in roughly equal measure, suggesting that both hemispheres played significant roles in the early stages of comprehension.

The study suggests that the left hemisphere's dominance in language processing is not as absolute as we once thought, and that the form of a language can actually influence the architecture and functioning of the brain. This information could help scientists better understand language loss due to stroke and how it might be prevented, but the study also highlights the rarity and fragile state of fringe languages: sadly, kuş dili is dying out as cell phones take over.

Photo credit: Onur Güntürkün
