What are animals saying? AI may help decode their languages – National Geographic

In May 2020, Pratyusha Sharma was painstakingly parsing data to prepare for a meeting with her research group at the Massachusetts Institute of Technology, hoping to find a pattern. It wasn’t just any dataset—this one had proved indecipherable. It contained hours, collected over a decade, of the creaking, rattling clicks that sperm whales use to communicate.

Sharma is part of Project CETI—Cetacean Translation Initiative—an effort to understand, in the group’s words, “what whales are saying.” The goal is straightforward but incredibly difficult: CETI wants to translate nonhuman communication. So Sharma and the rest of the team had enlisted their own nonhuman ally, a rudimentary artificial intelligence program, to help organize the audible clues in new ways and, they hoped, spark fresh insight.

CETI hopes that if humans can understand what’s going on inside whales’ minds—and, eventually, the minds of other animals—it might inspire greater conservation strides. The group has good reason to believe it could. In the 1960s, when researchers discovered that humpback whales sing to each other, for instance, their work led to the wildly successful Save the Whales campaign and the Marine Mammal Protection Act. Gravely threatened for decades, humpback and gray whale populations started to recover.   

CETI began serendipitously seven years ago at Harvard University’s Radcliffe Institute. David Gruber, a marine biologist and National Geographic Explorer, was in his office listening to recordings of sperm whales. Computer scientist Shafi Goldwasser happened to walk by, heard the curious clicking sounds, and came into Gruber’s office to ask what they were. The answer provided a new kind of motivation. A number of Goldwasser’s colleagues were already using machine learning to try to improve human language translations. Why not apply a similar approach to whales?

Since 2020, Project CETI has worked to understand nonhuman speech, starting with sperm whales. In Dominica, researchers use drones to tag whales and collect audio, which is then cataloged by AI.

Photograph by Jaime Rojo

A click, says Gruber, is almost like a one and a zero—or binary code, the most basic computer language. “If you’re trying to analyze a 20-minute humpback whale song that’s very spectrally complex, clicks are very nice.” They are easy for an AI model to ingest and, hopefully, analyze. 

Gruber contacted Shane Gero, another National Geographic Explorer, who had spent 13 years studying sperm whales in the Caribbean island nation of Dominica. “He had the most comprehensive database of whale sounds,” Gruber says. They did a pilot study to see if they could design a custom machine learning program capable of spotting familiar sequences within these interactions, much as humans have recognizable conversational patterns. The result was a new kind of processing model able to anticipate what some whales might communicate next. “It was getting, like, 99 percent accuracy, predicting the next click,” says Gruber.
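
Project CETI’s pilot model itself isn’t reproduced here, but the underlying idea of next-click prediction can be sketched in a few lines. The toy example below treats each coda as a sequence of short (“S”) and long (“L”) inter-click intervals and uses simple bigram counts to guess what comes next; the coda data and the bigram approach are illustrative stand-ins, not the team’s actual method or its accuracy.

```python
# Toy illustration of next-click prediction (NOT Project CETI's actual model).
# Each coda is a sequence of inter-click-interval classes ("S" short, "L" long);
# a bigram (Markov) model predicts the most likely next class.
from collections import Counter, defaultdict

# Hypothetical codas, invented for this sketch.
codas = [
    list("SSSL"), list("SSSL"), list("SSLL"),
    list("SSSL"), list("SLSL"), list("SSSL"),
]

# Count how often each symbol follows each preceding symbol.
transitions = defaultdict(Counter)
for coda in codas:
    for prev, nxt in zip(coda, coda[1:]):
        transitions[prev][nxt] += 1

def predict_next(prev_symbol):
    """Return the most frequent follower of prev_symbol, or None if unseen."""
    followers = transitions.get(prev_symbol)
    return followers.most_common(1)[0][0] if followers else None

# Evaluate: how often does the bigram model guess the next symbol correctly?
correct = total = 0
for coda in codas:
    for prev, nxt in zip(coda, coda[1:]):
        total += 1
        correct += (predict_next(prev) == nxt)

print(f"next-symbol accuracy on toy data: {correct / total:.0%}")
```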

The team then applied that model to Gero and his group’s ongoing study in Dominica, where scientists were tagging whales and recording their sounds and movements. When pods came to the surface, drones monitored their behavior. The data went back to the lab, where researchers hoped to associate distinct groups of clicks, called codas, with observable behaviors: evidence of whales actively listening to each other and responding with action.

By 2020, Gruber’s initial group had expanded to nearly 20 scientists spanning institutions from Harvard and MIT to Oxford and UC Berkeley and representing disparate disciplines: machine learning experts, marine biologists, a cryptographer, and, crucially, a linguist. Using AI, they discovered and cataloged thousands of different codas, drawn from thousands of hours of recordings.

Sharma was in her second semester of grad school at MIT, studying computer science and artificial intelligence, when she joined the project. She had an idea. Since the project began, scientists had represented individual codas the same way: as a series of dashes that stood for how many clicks a whale sounded per second. With her MIT colleagues’ help, Sharma turned the audio data into a new kind of visualization—one that, instead of bars on a horizontal axis, looked more like orchestral sheet music, illustrating multiple codas. Parallel lines of cascading dots presented vocalizations side by side. 
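
The sketch below gives a rough sense of that “sheet music” view, using made-up click times rather than CETI’s data: each row of dots is one coda and the horizontal axis is time, so differences in tempo between codas become visible at a glance.

```python
# Rough sketch of a side-by-side coda view (assumed data, not Project CETI's plots).
import matplotlib.pyplot as plt

# Hypothetical click times (seconds) for four codas exchanged by two whales.
codas = [
    [0.00, 0.20, 0.40, 0.62],
    [0.05, 0.24, 0.45, 0.70],
    [0.00, 0.18, 0.35, 0.52, 0.54],   # note the extra click at the end
    [0.02, 0.22, 0.43, 0.66],
]

fig, ax = plt.subplots(figsize=(6, 2.5))
for row, clicks in enumerate(codas):
    ax.scatter(clicks, [row] * len(clicks), s=20)   # one row of dots per coda

ax.set_xlabel("time within coda (s)")
ax.set_ylabel("coda #")
ax.set_yticks(range(len(codas)))
plt.tight_layout()
plt.show()
```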

This approach revealed subtle differences in the cadence of each coda, where the time between clicks slowed or quickened. Borrowing from classical music, the researchers dubbed these rubatos. The system also revealed instances where whales added an extra click to the end of a coda. This “ornamentation,” as the researchers called it, seemed to carry meaning. 
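
In code, spotting those two features comes down to measuring the gaps between clicks. The sketch below does this for hypothetical click times; the tolerance threshold and the assumption of a four-click coda are illustrative choices, not values from the CETI analysis.

```python
# Illustrative detection of rubato (tempo drift) and ornamentation (extra click).
import numpy as np

def describe_coda(click_times, expected_clicks=4, tempo_tolerance=0.05):
    clicks = np.asarray(click_times)
    base = clicks[:expected_clicks]        # the "core" coda
    icis = np.diff(base)                   # inter-click intervals of the core
    tempo_drift = icis[-1] - icis[0]       # positive = slowing, negative = quickening
    return {
        "inter_click_intervals": icis.round(3).tolist(),
        "rubato": bool(abs(tempo_drift) > tempo_tolerance),
        "ornamented": bool(len(clicks) > expected_clicks),
    }

print(describe_coda([0.00, 0.20, 0.40, 0.62]))        # steady tempo, no extras
print(describe_coda([0.00, 0.15, 0.33, 0.55]))        # tempo slows: rubato
print(describe_coda([0.00, 0.18, 0.35, 0.52, 0.54]))  # extra final click: ornamentation
```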

Subtle variations in rhythm, tempo, ornamentation, and rubatos reminded Sharma and her colleagues of phonemes, the fragments of sound that humans combine and recombine into words. It’s possible these codas are the basis of a complex language. Most of these nuances had not been distinguishable until now.

Before joining CETI, Sharma had considered earning a Ph.D. in robotics. She had never studied animals. She had never even seen a whale. “One of the cool things about Project CETI,” says Jacob Andreas, a natural language processing expert at MIT and one of the project’s researchers, “was getting a bunch of people who really think of themselves as computer scientists actually involved in this project of understanding animal communication.”

This kind of work isn’t limited to whales. Across the natural world, scientists and researchers are increasingly turning to artificial intelligence for help understanding the interior lives of animals, as well as the habitats that sustain them—oceans, forests, even commercial farms. Still mysterious in many ways, AI is already enabling very human connection with other living things—and, perhaps, a new way of thinking about the planet’s future. 

Since 2020, Project CETI has brought together experts from varying disciplines and institutions, like these researchers at Harvard’s Science and Engineering Complex, to analyze groups of whale clicks called codas.

Photograph by Spencer Lowell

Suresh Neethirajan works at the cutting edge of another kind of computer-enabled animal interaction. A professor of computer science and agriculture at Dalhousie University in Nova Scotia, Canada, he studies how farmers can use real-time monitoring to interpret what different behaviors really mean.

Neethirajan grew up with livestock on a dairy farm in south India. His parents considered their cows partners in the endeavor—the humans were in charge, but they didn’t have a livelihood without the creatures and their milk. And so, when the cows stopped producing, they weren’t sent to slaughterhouses. They lived out their days on the farm as a thank-you for their service. It’s a “social-economic belief system,” Neethirajan says. 

Neethirajan, who doesn’t eat meat, began studying the inner lives of farm animals about a decade ago—chickens, cows, horses, sheep, and pigs. As a “classically trained agricultural engineer and partially trained animal scientist,” he says, he wondered how he might use technology to improve their quality of life. 

First, he had to collect data. He monitored body temperature, cortisol levels, hormones, and respiration and heart rates with biosensors as well as blood, stool, and hair samples. Then he paired that with audio and video footage, and added context, like an animal receiving food (positive) or hearing a startling noise (negative). The goal: to understand what it looks like when an animal is comfortable or unwell. 
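
A minimal sketch of that pairing step, with invented numbers: sensor readings go in alongside labels derived from context, and a simple classifier learns to separate “comfortable” from “stressed.” The feature choices and the logistic-regression model are assumptions for illustration, not Neethirajan’s actual pipeline.

```python
# Toy welfare classifier: sensor features paired with context-derived labels.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-animal readings: [heart rate (bpm), cortisol (ng/mL), respiration rate]
X = np.array([
    [62, 10, 28],   # calm cow, just fed (positive context)
    [60,  9, 26],
    [95, 34, 55],   # startled by a loud noise (negative context)
    [90, 30, 50],
    [64, 12, 30],
    [98, 36, 58],
])
y = np.array([0, 0, 1, 1, 0, 1])   # 0 = comfortable, 1 = stressed

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict([[70, 15, 32]]))   # expected to classify as comfortable (0) on this toy data
```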

Seven years ago, Neethirajan began processing his data with AI, including a deep learning model that performs facial recognition and gait analysis in livestock. Like Sharma and the MIT team, he uses natural language processing tools to understand animal vocalization. His analysis can pinpoint the specific squawk that chickens make before they leave a room. Now, he explains, video footage of a barn full of 5,000 chickens can be put into the model, and within a few minutes it can identify the five birds most likely to be sick. 
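
That last step, surfacing a handful of at-risk birds from a barn of thousands, amounts to ranking animals by a model’s risk score. In the sketch below the scoring model is a stand-in function with fake, deterministic output; only the ranking logic is the point.

```python
# Rank a barn of birds by a (stand-in) sickness score and flag the top five.
import heapq
import random

def sickness_score(bird_id):
    """Stand-in for a trained model's probability that this bird is unwell."""
    random.seed(bird_id)               # deterministic fake score for the demo
    return random.random()

bird_ids = range(5000)                 # a barn of 5,000 chickens, as in the text
scores = {bird: sickness_score(bird) for bird in bird_ids}

# The five birds the farmer should check first.
for bird in heapq.nlargest(5, scores, key=scores.get):
    print(f"bird {bird:4d}: predicted risk {scores[bird]:.2f}")
```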

Neethirajan’s work is an argument for why animal welfare is valuable even in an industry that doesn’t always prioritize the well-being of its living product. Identifying disease early prevents both suffering and financial losses. And there is research that shows a happier animal is a more productive animal; cows that live in a positive environment give more and better milk. Farms can act on this information. “They are thinking beings,” Neethirajan says of animals. “They have their own likes and dislikes.” He found chickens make fewer noises of distress when their habitats are cleaned with greater frequency, because they can breathe more easily—so why not increase the cleaning schedule? 

“As human beings,” he says, “we exist with the plant kingdom, animal kingdom, and [other] humans, and our population is growing enormously … How do we peacefully coexist? How do we create harmony?” For this new band of AI-assisted researchers, these are no longer merely existential questions, and they have led to an even grander experiment.

Jörg Müller heads conservation at the Bavarian Forest National Park, Germany’s oldest national park, and teaches forest ecology at the University of Würzburg. But his research takes him halfway around the world, to South America, where he’s developing a new kind of AI-enhanced “stethoscope” for monitoring tropical ecosystems that were cleared for agriculture. It’s relatively easy to survey the regrowth of forest canopy with satellites and remote sensing. It’s much harder to know how long it takes to recover native biodiversity—the flourishing creatures and plants living below the canopy. Müller works with statisticians, entomologists, ornithologists, and local communities in Ecuador to identify signals that may show whether revitalization efforts are working. 

In 2021, Müller contacted Zuzana Buřivalová, an assistant professor at the University of Wisconsin-Madison, whom he calls a “young rising star in sound ecology.” She and her colleagues had developed a method of using bioacoustics to estimate how many different species were living in the woods, based on the noises they made.

Müller and his team applied Buřivalová’s approach by placing recording devices at dozens of sites around an area of about 50,000 acres of Ecuador’s Chocó forest. And for two straight weeks, they gathered data: nearly 2,000 hours of soundscape. 

Birds are among the best indicators of a tropical ecosystem’s overall vitality; if they have recovered, according to Müller, then other species up and down the food chain, from jaguars to insects, have too. So Müller got bird experts Juan Freile, the author of Birds of Ecuador, and Rudy Gelis to identify bird vocalizations from the audio; they documented more than 300 species. He then ran the audio against an existing AI model that had been trained to recognize 75 species, all of which it “heard” in the soundscape. It could also approximate their abundance. Müller found these results promising because they demonstrated that a fully trained AI could be as effective as human experts—and much quicker—and help scientists monitor progress of forest recovery. Based on expert and AI identification, Müller predicts it takes about 55 years for rainforest land once cleared for farming to recover its native biodiversity.
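
The tallying behind those numbers can be sketched simply: once an audio classifier has labeled each recording window with the species it hears, counting detections gives both a species list and a rough proxy for abundance. The clip labels and species names below are invented examples, not results from the Chocó recordings.

```python
# Tally species detections from per-clip classifier output (illustrative data).
from collections import Counter

# Hypothetical output: species detected in each one-minute recording window.
detections_per_clip = [
    ["choco toucan", "club-winged manakin"],
    ["club-winged manakin"],
    ["choco toucan", "banded ground-cuckoo", "club-winged manakin"],
    ["choco toucan"],
]

counts = Counter(species for clip in detections_per_clip for species in clip)

print(f"species detected: {len(counts)}")
for species, n in counts.most_common():
    print(f"  {species}: detected in {n} of {len(detections_per_clip)} clips")
```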

Project CETI is at a different kind of inflection point, with researchers still in the information-gathering stage. Scientists know what whales look and sound like as they communicate, but not what any of that means yet. “We’re like baby whales beginning to learn,” says Gruber, “one little piece at a time.”

Though there could be parallels to human language, that doesn’t mean the structure of whale speech mirrors ours: We have completely different ecological and evolutionary needs. Perhaps, Andreas explains, whales are simply saying something like, Hey, could you scoot over to the left? His CETI colleagues have seen navigational patterns in which a group of whales swims down the side of Dominica, then suddenly turns and moves to the open sea, even when the whales aren’t all in sight of one another.

Sharma hopes that one day we will understand how language is passed down from mother to calf. Like humans, whales are not born with language but pick it up socially. Andreas dreams of learning more about the social structure of pods, which are matrilineal. When he and Gero were out on a boat watching sperm whales a few summers ago, Gero pointed to a pair and told Andreas: “They’re the two old ladies of the clan, and they’re best friends with each other, and they hang out all the time.” Andreas wonders if whales have calls that function similarly to names for each other, as elephants and dolphins do, and whether they’re able to refer to whales that aren’t present.

There remains much to do and learn. 

Though Gruber’s greatest hope is that insight into sperm whales will lead to an improvement in their lives, he’s still struck by one moment at Harvard. Someone said: If we could translate what sperm whales are saying, we could use this kind of platform to communicate with extraterrestrials. That’s not a new idea—it appears in an old Star Trek movie.

The difference is that researchers are actually making progress now. If artificial intelligence can help us access parts of the natural world we don’t yet understand, who knows what universes it could open up someday.
