AI and Other Emerging Technologies Are Expanding Conservation Studies


[CLIP: Theme music]

Rachel Feltman: For Scientific American’s Science Quickly, this is Rachel Feltman. You’re listening to the second episode of our Friday Fascination miniseries “The New Conservationists.” Today we’re heading into the field, under the sea and to the savanna, with researchers who are using artificial intelligence to change the way we understand—and protect—animals and their ecosystems.

Our guide, once again, is Ashleigh Papp, an animal scientist turned storyteller. She’s here to explain why we’re turning to machine learning to process nature’s complexity and how it’s extending the reach of what our eyes can see and our ears can hear.




[CLIP: Waves crashing on a beach]

Ashleigh Papp: After college I spent time as a field researcher in Costa Rica working with endangered sea turtles. It was a lot of hard work—basically hours of walking up and down remote beaches hoping to spot a turtle nesting in the sand.

[CLIP: Footsteps on sand]

[CLIP: “None of My Business,” by Arthur Benson]

Papp: The fundamental idea behind field research rests on two questions: How many animals are there, and where do they live? If we know the answers, we can learn a lot about a species—or in some cases an entire ecosystem. And even though they’re simple questions in theory, answering them can be incredibly time-consuming and expensive.

Matthew McKown: The natural world is super complicated.

Papp: That’s Matthew McKown. He started a company called Conservation Metrics, which uses technology such as AI automation to decode nature.

McKown: There are endless numbers of factors that influence the way animals are behaving, the fluctuations in population, et cetera. So natural systems have a lot of statistical noise, a lot of unexplained factors that are contributing to what you see on any given day.

Papp: Matthew struggled for years to find the signal hidden in that noise until he realized that AI could help, basically by greatly expanding the kind of observational work that I used to do by hand in Costa Rica. He needed a digital surveillance network that could watch and listen for sea turtles or tree frogs or parrotfish—really whatever species you’re interested in conserving.

McKown: What technology allows us to do is really increase the scale of our observation—so increase the number of places that you can be watching or measuring aspects of animal behavior and the amount of time you can do it. As opposed to just having one person in one place in a day, you can have 50 sites that are monitored for the full year. And that really helps us get a better understanding of the actual, true signal of what’s happening in these communities.

Papp: Think: cameras mounted on a tree in the jungle to capture monkeys swinging through …

[CLIP: Tree branches and leaves rustling]

Papp: … or a hydrophone dropped into the water to record audio of whales swimming by.

[CLIP: Whales vocalizing]

Papp: But all those new observations, in turn, create a new problem.

McKown: Then you start to generate huge amounts of information, huge—many tens of thousands of hours of recordings.

Papp: So instead of having one carefully curated notebook of field observations, you now have terabytes of passively recorded video and audio files.

McKown: It becomes impossible to try and actually find the things you’re interested in amongst the thousands of hours of recordings.

Papp: Enter machine learning.

McKown: If you have 80,000 hours’ worth of data and the thing that you’re looking for is only in there for half an hour, computers can really help reduce the time it takes to find those things you’re interested in.

[CLIP: “It Doesn’t End Here (Instrumental),” by Nehemiah Pratt]

Papp: So Matthew got into building and refining computer software that cuts through the noise and pulls out the interesting stuff—because then the science can happen.
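To make that concrete, here's a minimal sketch in Python of what a first-pass audio detector can look like. It is not Conservation Metrics' actual pipeline: the frequency band, threshold and file name are invented placeholders standing in for a trained model's judgment.

```python
# A minimal sketch of reducing long recordings to candidate events.
# The band limits and threshold below are hypothetical stand-ins for
# the score a trained classifier would produce.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def find_candidate_events(wav_path, f_lo=200.0, f_hi=2000.0, threshold=5.0):
    """Return timestamps (seconds) where energy in a target band spikes."""
    sr, audio = wavfile.read(wav_path)
    if audio.ndim > 1:                      # mix stereo down to mono
        audio = audio.mean(axis=1)
    freqs, times, sxx = spectrogram(audio, fs=sr, nperseg=1024)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    band_energy = sxx[band].sum(axis=0)
    # Flag windows whose band energy far exceeds the recording's median;
    # a real system would score windows with a trained model instead.
    hits = band_energy > threshold * np.median(band_energy)
    return times[hits]

# events = find_candidate_events("reef_recording.wav")  # hypothetical file
```

The payoff of even a crude detector like this is the one McKown describes: tens of thousands of hours of audio collapse to a short list of timestamps worth a human's attention.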

His company started off monitoring seabirds, a notoriously difficult group of animals to study. From there Conservation Metrics branched out into songbirds, bats and insects, and more recently the team took the technology for a swim.

McKown: We’re super interested in coral reef communities.

Papp: Coral reefs are powerhouses under the sea; about 25 percent of all marine species are found in, on or around reefs, and the biodiversity of these ecosystems rivals that of rainforests on land. It's estimated that one billion people rely on reefs for food, income and protection. And those reefs are in danger. Climate change, unsustainable fishing and pollution are their top threats, which can cause one of the reefs' main energy producers, tiny algae called zooxanthellae, to flee in search of better living conditions.

McKown: There’s a lot of really interesting science going on around the world at a bunch of university labs that have shown that healthy reefs have a unique sound and that that sound disappears as reefs get degraded.

Papp: For those of us who haven’t had the chance to snorkel or scuba dive near a reef, I had to ask Matthew: What does a healthy coral reef sound like?

McKown: Coral reef sounds [laughs] are really a little alien; there's a lot of popping and clicking on a lot of tropical reefs, which is related to several species of shrimp that make clicks with their appendages.

[CLIP: Popping and clicking noises of shrimp]

McKown: And then there’s a lot of grunting [laughs] with these fish—they’re doing a lot of grunting. And, you know, it sounds kind of like off-gassing sometimes and like [makes noise], like a bubble coming up to the surface.

[CLIP: Fish grunting]

Papp: It turns out all these pops, clicks, grunts and bloops convey a lot of information. Right now the Conservation Metrics team is working with a few universities in the U.S. and a coral restoration group in Mooréa, an island in French Polynesia that’s not far from Tahiti, to translate the noises.

The field researchers on the island drop hydrophones into the water, press record and then ship the sound files off for processing.

McKown: We’re building these detection and classification models, these things that computers use to strip through the audio recordings and find the things that you’re interested in. And it’s crazy because we don’t know what’s making sounds, so we just have all these sort of unknown sound categories [laughs]. We just give them a generic name. And then we just track that sound, even though we don’t know what’s producing it.
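In code, the "generic name" step McKown describes might look something like the sketch below: cluster unidentified clips by acoustic similarity, then track each cluster under a made-up label. This is an illustration under assumed tools (librosa for features, scikit-learn for clustering), not the team's actual models, and the label scheme is hypothetical.

```python
# A rough sketch, assuming a pile of short clips that detectors have
# already pulled out of the recordings. We don't know what made each
# sound, so we group clips by acoustic similarity and give each group
# a generic name we can track over time.
import librosa
import numpy as np
from sklearn.cluster import KMeans

def label_unknown_sounds(clip_paths, n_types=10):
    """Group unidentified sound clips into generically named categories."""
    features = []
    for path in clip_paths:
        y, sr = librosa.load(path, sr=None)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
        features.append(mfcc.mean(axis=1))   # one summary vector per clip
    clusters = KMeans(n_clusters=n_types, n_init=10).fit_predict(np.array(features))
    return {path: f"sound_type_{c:02d}" for path, c in zip(clip_paths, clusters)}
```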

Papp: That decoded and organized soundscape is providing valuable new information for coral research in Mooréa, which has been ongoing for decades. Researchers are combining recorded audio and video with over 20 years’ worth of observational data—the physical notes of what a researcher sees when they swim in the same area every day.

McKown: We’re really interested in combining across ways of making observations on reefs. So there’s traditional scuba divers, where they swim a line and they count all the fish they see. They’ve been doing that for a long time. And at the same locations we’re now going to start to add an acoustic sensor. And then we’re adding a video camera with the computer vision side of that.

And all of these ways are different ways of, you know, trying to make observations about how a community is doing. And so we’re really interested in: How do you look across these lines of observation to learn more? Because each of these methods has its biases, has its downsides, has its blind spots. And they cover different areas. They cover different time periods. They cover different taxa, different species.

Papp: Matthew hopes that better understanding the soundscape in places like Mooréa will also help us learn how the noises of a healthy reef impact the animals that live there.

McKown: Animals are making decisions based on what the soundscape is. So there's a lot of experimental evidence that shows that certain species' larvae are floating around the oceans and they will drop out of suspension when they hear a healthy reef or a healthy oyster bed, et cetera. Like, they're making decisions based on the sound field.

[CLIP: Popping and clicking noises of shrimp]

Papp: So some researchers are using AI to better understand the soundscapes of ecosystems. But what about using AI to literally see how ecosystems work?

Tanya Berger-Wolf: There are millions and millions of images out there of animals.

Papp: That’s Tanya Berger-Wolf, a computational ecologist. She’s a professor at the Ohio State University and leads the Imageomics Institute.

Berger-Wolf: So bringing all these images together automatically with the modern computer vision and machine-learning approaches, we can find all the ones that contain animals, find where the animals are in those pictures, put a bounding box around each one and identify not only species, but down to individual animal.
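The first stage of that pipeline, finding the animals and drawing boxes around them, is within reach of off-the-shelf tools. Below is a rough sketch using a torchvision detector pretrained on the COCO dataset, which happens to include zebras and giraffes among its classes; identifying individual animals, the harder step Berger-Wolf's team tackles, is beyond this snippet, and the file name is hypothetical.

```python
# A minimal sketch of the detection step: find animals in a photo and
# return bounding boxes for the confident hits.
import torch
import torchvision
from PIL import Image
from torchvision.transforms.functional import to_tensor

# General-purpose detector pretrained on COCO.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect_animals(image_path, score_threshold=0.8):
    """Return bounding boxes and class labels for confident detections."""
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]  # dict of 'boxes', 'labels', 'scores'
    keep = output["scores"] > score_threshold
    return output["boxes"][keep], output["labels"][keep]

# boxes, labels = detect_animals("tourist_photo.jpg")  # hypothetical file
```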

[CLIP: “Let There Be Rain,” by Silver Maple]

Papp: Consulting firm Rise Above Research estimates that almost two trillion photos will be snapped this year. And for many people sharing photos of wildlife and outdoor adventures is a popular pastime. (Anything for the ’gram, right?) Well, it turns out these social media posts can be useful for animal conservation science.

Berger-Wolf: Anything striped, spotted, wrinkled, notched, even using the shape of a whale’s fluke or the dorsal fin of a dolphin—these are all unique identifiers. And so then, with information on when and where the image was taken, you can really now, finally, start using images as the source of all kinds of information about animals: tracking them, counting them, determining their range and even, yes, their social network.

Papp: Tanya spent years studying mostly mathematics. Only when she met her now-husband, ecologist Mosheh Wolf, did she start considering how math could help with conservation work.

Berger-Wolf: And for many years, while doing my very theoretical computer science Ph.D.—still pretty much math—I was involved in many conversations with him and his friends and colleagues where I would walk away with the feeling, “Oh, there’s gotta be a better way of answering that question,” which was an ecological question.

Papp: She chose a postdoc position in ecology and evolutionary biology and eventually started working with a group studying the social behavior of zebras in Africa. Zebras are super social animals, and they often hang out in big groups called a zeal or a dazzle.

To understand each individual animal’s behavior, the group needed to be able to quickly identify one zebra from another amid a dazzle of black and white stripes. Tanya, at that time, had been working from behind a computer screen, wanting to bring a computational and algorithmic approach to the work. But after a few years on the team—and a lot of prodding from her colleagues—she went to go see the animals herself and had a big realization …

Berger-Wolf: I finally went and saw my data. And one of the things that immediately became very clear [was] that all the assumptions that I was making about my algorithms and the way I approached the problem were completely off. It also became very, very clear that I did not understand my data.

Papp: To generate the data, one of Tanya’s colleagues would go out every day and take photos of the zebras. Then the colleague would use a computer program to very carefully match, pixel by pixel, the zebra’s stripes to recognize each animal and note who was standing next to whom at what time of day and any other particulars about the animals’ behavior.

But Tanya, seeing the zebras in the wild and then watching the very manual process of matching the stripes, thought to herself …

Berger-Wolf: This is nuts. Five minutes later, I’m like, “This is insane. This is taking too long. It’s gotta be two clicks; come on!”

[CLIP: “Lead,” by Farrell Wooten]

Papp: So she posed a friendly bet to her colleagues, which they gladly accepted. She then went back to her graduate students and explained their task.

Berger-Wolf: I’m like, “Look, I just bet my reputation that we can recognize individual zebras from photographs in two clicks.”

Papp: And she and her students did it. They developed a computer program that could recognize a zebra by its stripes in two quick steps. And soon enough the effort picked up steam and the algorithm was expanded to other species, allowing valuable information—like spots on a giraffe or notches on a shark’s fin—to be extracted from images. The program could be used to do everything from determining population size all the way down to tracking individual animals.
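For flavor, here is one way the core matching idea can be sketched with OpenCV's ORB features: count how many local keypoints in one flank photo find a clearly best match in the other. This illustrates the general technique of pattern matching, not Wildbook's actual algorithm, and the file names and score threshold are hypothetical.

```python
# An illustrative sketch: score how well the stripe patterns in two
# photos correspond by counting strong local-feature matches.
import cv2

def stripe_match_score(path_a, path_b, ratio=0.75):
    """Count strong local-feature matches between two flank photos."""
    orb = cv2.ORB_create(nfeatures=2000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    gray_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    gray_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)
    _, desc_a = orb.detectAndCompute(gray_a, None)
    _, desc_b = orb.detectAndCompute(gray_b, None)
    # Lowe's ratio test: keep matches clearly better than their runner-up.
    pairs = matcher.knnMatch(desc_a, desc_b, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return len(good)

# Hypothetical usage: a high score suggests the same individual.
# same_animal = stripe_match_score("flank_1.jpg", "flank_2.jpg") > 50
```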

Today the company that Tanya started with this effort—originally called Wildbook, which later became part of Wild Me and is now part of Conservation X Labs—has more than 50 species in its databases. Images come from researchers, autonomous vehicles, camera traps and even tourists, and the list of contributors continues to grow each year.

For computational ecologists, however, this project is just the beginning.

Berger-Wolf: If I show you pairs of photographs of zebras and ask you, “Are these two more similar to each other than these two?” No way, and no amount of training will help you.

But the same algorithm that identifies the zebras also quantifies the similarity between stripe patterns and allows us, for the first time ever, to compare the stripe similarity to genetic similarity and to start understanding the mechanism behind the stripe-pattern development. Is it hereditary? Can zebras tell each other apart using stripes, or do they not use it at all as a function?
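One standard way to probe that heredity question is a Mantel-style permutation test: if stripes are inherited, pairwise stripe distances should correlate with pairwise genetic distances across a set of animals. The sketch below assumes both distance matrices already exist; it's a generic statistical recipe, not the Imageomics Institute's method.

```python
# A sketch of a Mantel-style permutation test: do zebras with similar
# stripes also tend to be close genetic relatives? Both inputs are
# hypothetical square, symmetric pairwise-distance matrices.
import numpy as np

def mantel_test(stripe_dist, genetic_dist, n_perm=9999, rng=None):
    """Correlate two distance matrices; p-value via identity permutations."""
    rng = rng or np.random.default_rng(0)
    iu = np.triu_indices_from(stripe_dist, k=1)   # unique pairs only
    observed = np.corrcoef(stripe_dist[iu], genetic_dist[iu])[0, 1]
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(len(genetic_dist))
        shuffled = genetic_dist[np.ix_(p, p)]     # scramble animal identities
        if np.corrcoef(stripe_dist[iu], shuffled[iu])[0, 1] >= observed:
            count += 1
    return observed, (count + 1) / (n_perm + 1)
```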

Papp: And those questions, now more easily answered thanks to machine learning and AI, can help expedite conservation work around the world.

[CLIP: Rainforest sounds]

Berger-Wolf: Biodiversity has a data problem. More than 10 percent of the world's [eukaryotic] species are threatened with extinction. That is a shockingly large number. But the problem is that we really don't know exactly what we're losing and how fast.

Papp: Today’s climate is changing because of us. Our technology has emitted a lot of the greenhouse gases that are warming the planet. But animals are paying the price. Around the world their habitats are changing too quickly for them to adapt, and species are disappearing altogether.

And by the way, every time we ask ChatGPT a question, it uses an estimated 2.9 watt-hours of electricity. That's roughly the energy a 60-watt incandescent lightbulb burns through in about three minutes.

While ChatGPT’s electricity usage may not sound like a lot, keep in mind how many people are using AI every second of every day. This means AI is also contributing to the problems that animals and our environment face. So if we’re going to keep using AI, we should employ it to help find some of the solutions, too, right?

[CLIP: “Anchor (Instrumental),” by Stephanie Schneiderman]

Papp: We’ve been looking at and listening to animals since time immemorial. But now, finally, we’re harnessing some of our ecosystem-disrupting technology to figure out what creatures are left in the natural world and how they’re coping. Equipped with this information we can make more informed conservation decisions.

Animals face more challenges than ever before. But with the help of technology—and perhaps a whole lot of wildlife selfies—they might just stand a chance.

Feltman: That’s it for today’s show. Join us again next time, when we’ll meet two conservationists who don’t fit the historic mold for who does this work. (Spoiler alert: conservation has a diversity problem.)

[CLIP: Theme music]

Feltman: Science Quickly is produced by me, Rachel Feltman, along with Fonda Mwangi, Kelso Harper, Madison Goldberg and Jeff DelViscio. This episode was reported and co-hosted by Ashleigh Papp. Shayna Posses and Aaron Shattuck fact-check our show. Our theme music was composed by Dominic Smith. Subscribe to Scientific American for more up-to-date and in-depth science news.

For Scientific American, this is Rachel Feltman. See you next time!


