For about 10,000 residents of a small village in northeast Turkey, speaking is more like singing.
The locals call it kuş dili, or “bird language,” and it sounds a bit like birdsong. This ancient, melodious mode of communication is at the center of a new study published in Current Biology that could fundamentally alter what we know about language.
Here’s Michelle Nijhuis:
Whistled languages, although unusual, have been around for centuries. Herodotus described communities in Ethiopia whose residents “spoke like bats,” and reports of the whistled language that is still used in the Canary Islands date back more than six hundred years. Most of the forty-two examples that have been documented in recent times arose in places with steep terrain or dense forests—the Atlas Mountains, in northwestern Africa; the highlands of northern Laos; the Brazilian Amazon—where it might otherwise be hard to communicate at a distance. All are based on spoken languages: Kuşköy’s version, for example, adapts standard Turkish syllables into piercing tones that can be heard from more than half a mile away. The phrase “Do you have fresh bread?,” which in Turkish is “Taze ekmek var mı?,” becomes, in bird language, six separate whistles made with the tongue, teeth, and fingers.
Neuroscientists have previously established that the brain’s left hemisphere dominates language interpretation, whether the language is tonal or atonal, spoken or written, signed or clicked. Diana Deutsch, a psychologist specializing in musical illusions, even demonstrated the moment at which speech becomes song (you can hear about her work in this episode of Radiolab). But whistled languages blur that line; Onur Güntürkün of Germany’s Ruhr University Bochum wondered if perhaps the brain “hears” kuş dili as music. Or, to put it more scientifically, are whistled languages less reliant on the brain’s left hemisphere?

Güntürkün and his team tested 31 fluent Turkish whistlers by playing different spoken syllables into the left and right ears at the same time (a technique called dichotic listening). Then they performed the same experiment using whistled syllables. Here’s Nijhuis:
When he gave them spoken Turkish, the participants usually understood the syllable played through the right speaker, suggesting that the left hemisphere was processing the sound. When he switched to whistled Turkish, however, the participants understood both syllables in roughly equal measure, suggesting that both hemispheres played significant roles in the early stages of comprehension.
The study suggests that the left hemisphere’s dominance in language processing is not as absolute as we once thought, and that the form of a language can actually influence the architecture and functioning of the brain. This information could help scientists better understand language loss due to stroke and how it might be prevented. But the study also highlights the fragile state of whistled languages—sadly, kuş dili is dying out as cell phones take over.