
Human echolocators ‘see’ with sound. Here’s what that actually looks like.

September 8, 2017 at 5:07 PM EDT
Illustration of the acoustic pattern of mouth clicks for human echolocation, created by a computer model. Credit: Thaler et al., PLOS Computational Biology, 2017.

Human echolocators “see” their worlds through sound, and thanks to a new computer model, you can too.

Using math and painstaking experiments, a group from the UK modeled how the nuanced mouth clicks of human echolocators travel around a room. Their findings, reported in PLOS Computational Biology, could yield devices that make “smart” radar maps revealing the physical features of objects in the environment, like their texture or hardness, based on sound.

“One motivation was basic curiosity,” said Lore Thaler, a neuroscientist at Durham University. “We know very little about the mouth clicks people make when they echolocate. So we just wanted to know what they are like.”

Human echolocators, like bats, make clicking noises to create sound wave reflections and map their surroundings. You might think this talent requires a superhuman ear, but Thaler said most people, unbeknownst to them, dabble in echolocation all the time. Ambient sounds hint at the type of room you are in: a hard, cavernous gym with bleachers reverberates differently than a cozy bedroom with soft furnishings.
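
The geometry behind that skill is simple: an echo returns after sound travels to a surface and back, so the delay encodes distance. Here is a minimal sketch in Python (the speed of sound and the example delays are illustrative assumptions, not figures from the study):

```python
# Estimate distance to a reflecting surface from the round-trip delay
# of an echo. The speed of sound and the delays below are illustrative
# assumptions, not measurements from the study.

SPEED_OF_SOUND = 343.0  # meters per second in air at about 20 C

def distance_from_echo(delay_seconds: float) -> float:
    """The sound travels out and back, so halve the round trip."""
    return SPEED_OF_SOUND * delay_seconds / 2

# A wall about 2 meters away returns an echo after roughly 11.7 ms.
for delay_ms in (5.0, 11.7, 30.0):
    meters = distance_from_echo(delay_ms / 1000)
    print(f"{delay_ms:5.1f} ms echo -> about {meters:.2f} m away")
```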

To decipher how these mouth clicks paint a room, the team recruited three blind expert echolocators, who can deftly navigate unfamiliar places, whether hiking or riding bikes. With recording equipment at hand, these volunteers stood in the exact same spot in a soundproof room and clicked repeatedly. Over multiple sessions, the researchers moved the microphones around the room, recording the clicks from all angles.

Two years later, Thaler and her team had gathered enough recordings to make a 3D model that visualizes how the sound moves through the surrounding space.

The mouth clicks did not behave like regular speech. The staccato sounds moved mostly in one direction, rather than spreading in all directions like rippling waves. This focused pattern appears to strengthen the echolocators’ ability to pinpoint where things are.
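
To get a feel for what a directional source means, here is a toy sketch using a cardioid pattern, a textbook stand-in chosen for illustration rather than the click pattern the team actually measured:

```python
import numpy as np

# A cardioid gain pattern: full strength straight ahead, fading toward
# the sides, silent directly behind. The team's measured click pattern
# is more complex; this only illustrates "focused in one direction."

def cardioid_gain(angle_deg: float) -> float:
    return 0.5 * (1 + np.cos(np.radians(angle_deg)))

for angle in (0, 45, 90, 135, 180):
    print(f"{angle:3d} degrees off-axis -> relative gain {cardioid_gain(angle):.2f}")
```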

“Now we can create a virtual environment, like a visual computer game, but with acoustics,” Thaler said. These computer models could reduce the need for recruiting volunteers in future work, which is typically the toughest part of research on human echolocation, she said.

Some scientists are eager to get their hands on this model, such as Daniel Rowan, an audiology researcher at the University of Southampton. Rowan studies hearing impairment and the importance of hearing for blind people. He uses sound in his experiments, including a recent study on how listening can reveal the qualities of an object, like its shape or hardness.

But he often relies on synthetic sounds instead of human-made ones, which he said makes the results harder to trust. For example, Thaler’s recordings show that the sound waves of human echolocation clicks are much shorter than Rowan previously thought. Her model was also able to generate synthetic echolocator clicks that closely resemble what humans produce.
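
For a rough sense of how a synthetic click can be built, here is a minimal sketch of one common approach, an exponentially decaying sine burst; the frequency, duration, and decay values below are illustrative guesses, not the parameters of Thaler’s model:

```python
import numpy as np

# A short, click-like waveform built as an exponentially decaying sine
# burst. The frequency, duration, and decay constant are illustrative
# stand-ins, not the parameters of Thaler's model.

SAMPLE_RATE = 44_100   # samples per second (standard audio rate)
DURATION = 0.003       # 3 ms: mouth clicks are very brief
FREQUENCY = 3_000.0    # Hz, a rough guess at the click's dominant pitch
DECAY = 1_500.0        # per second; larger values die out faster

t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE
click = np.exp(-DECAY * t) * np.sin(2 * np.pi * FREQUENCY * t)

# Scale to the 16-bit range so the samples could be written to a WAV file.
pcm = np.int16(click / np.max(np.abs(click)) * 32767)
print(f"{len(pcm)} samples, {1000 * len(pcm) / SAMPLE_RATE:.1f} ms long")
```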

An echolocation click performed by a human volunteer.

A computer-generated click, based on Thaler’s model. The difference is barely audible.

For others, this model is just a start. “With only three volunteers, one doesn’t know how general this model is,” said Bo Schenkman, an experimental psychologist at the KTH Royal Institute of Technology in Stockholm. Schenkman added that longer sounds, such as a drawn-out “shhhh,” are also useful for echolocation and need to be included in the model.

This research could also be put to everyday use at airports or on ships.

Radar has a lot in common with human echolocation: it emits a pulse of radio waves and builds a picture from the echo. Galen Reich, an electrical engineer at the University of Birmingham’s Microwave Integrated Systems Laboratory who worked with Thaler, was also surprised by the length and focused direction of the clicks. Unlike current ships’ radars, which serve the single purpose of determining the distance to surrounding objects, systems based on human echolocation could identify texture and movement.

But Thaler wants to know as much as she can about how human echolocation works, because people can be trained to do it. She has even gone so far as to practice it herself.

“I’m actually not bad,” she said, estimating it would take 10 weeks of practicing blindfolded to really master the ability. “I’ve discussed this with my family and they aren’t too keen.”

Until then, her computer models should serve her quest to unravel a person’s capacity to see with sound, she said.
