Keeping track of endangered species—particularly their dwindling or growing numbers—is crucial to their conservation. But just how to count them has never been obvious or easy. Now, in a casual, over-the-fence chat, two neighbors, an astronomer and an ecologist, may have stumbled onto a solution.
The pair developed a system of drones and infrared cameras that pools techniques from both astronomy and ecology. Serge Wich, the ecologist, had been using infrared cameras to collect footage of orangutans at night, but he was bogged down analyzing the massive amounts of footage. To Steven Longmore, the astronomer, identifying hot, glowing objects in a crowded field was routine. There was software for that.
Software can sort through different stars because they have unique patterns of spectral lines—that is, lines of radiation corresponding to the stars’ makeup. Animals, it turns out, have signature spectra, too. So, together, the two scientists developed an algorithm that could identify animals by these signatures.
Now, they’re working on building a reference library of thousands of animals by filming different species in zoos and safari parks. The algorithm has also been trained on humans—the idea being to both count endangered species and identify poachers.
Wich and Longmore’s efforts mark a step up from the camera-equipped drones that have mostly replaced time- and money-intensive manual surveys, which are far from foolproof. Yet this new technology still has a few kinks to work out.
Here’s Joanna Klein, writing for the New York Times:
On a sunny, summer day in 2015, the team flew their drones over a farm to see if their machine-learning algorithms could locate the animals in infrared footage.
For the most part, they could.
But accuracy was compromised when drones flew too high, cows huddled together, or roads and rocks heated up in the sun. In a later test, the machines occasionally mistook hot rocks for students pretending to be poachers hiding in the bush.
Now, the team is honing their algorithm to perfect its ability to distinguish animals from features of the environment. They’re also training it to account for rain, humidity, and other atmospheric conditions that trip it up.
Though it’s a work in progress, the scientists are already working with conservationists, as well as with search-and-rescue groups that could use the technology to find people lost at sea or in bad weather. They expect to launch a fully automated prototype within two years.
Photo credit: Endangered Wildlife Trust/LJMU