
AI tries to understand animal ‘speech’

Next-generation AI systems are being developed to help humans better understand the sounds of the animal kingdom.

According to the WSJ, one purpose of using AI to parse animal “speech” is to build specialized field systems, such as those that detect and track whale calls and warn ships so they can avoid collisions.

“It’s great to use AI, especially deep learning, to study animal language and learn about their intelligence,” said Oren Etzioni, an AI expert at the Allen Institute. “This could help conserve animals, as well as close the intellectual gap between humans and other species.”





Photo: WSJ

Animal communication researchers are now using a branch of AI called self-supervised learning, which has recently proven effective at processing human language. According to Aran Mooney, an expert at the Woods Hole Oceanographic Institution, this branch of AI is well suited to analyzing animal calls.

Unlike conventional AI, which must “learn” from a certain amount of data that has been labeled and classified for each domain, self-supervised learning systems can analyze raw, unlabeled data on their own while continuously and proactively ingesting more content from other sources. One example of the power of this type of algorithm is OpenAI’s language generation and processing system GPT-3, trained on more than 45 TB of text pulled from across the Internet over the years. With this huge amount of data, GPT-3 can generate near-human text, answer quiz questions, create recipes and more.
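To make the idea concrete, here is a minimal sketch of self-supervised learning on unlabeled audio: random spectrogram frames are masked and the model is trained to reconstruct them from context, so no human labels are required. The model, shapes and training data below are illustrative assumptions, not the architecture of GPT-3 or of any of the animal-call systems mentioned in this article.

```python
# Minimal sketch of self-supervised learning on unlabeled audio.
# Hypothetical model and data: it learns by masking random spectrogram
# frames and predicting them from context -- no human labels needed.
import torch
import torch.nn as nn

FRAMES, MELS = 128, 64        # spectrogram: 128 time frames x 64 mel bands

class MaskedFramePredictor(nn.Module):
    def __init__(self, d_model=128):
        super().__init__()
        self.proj_in = nn.Linear(MELS, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.proj_out = nn.Linear(d_model, MELS)

    def forward(self, spec):                     # spec: (batch, FRAMES, MELS)
        return self.proj_out(self.encoder(self.proj_in(spec)))

model = MaskedFramePredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    spec = torch.rand(8, FRAMES, MELS)           # stand-in for real recordings
    mask = torch.rand(8, FRAMES, 1) < 0.15       # hide ~15% of time frames
    corrupted = spec.masked_fill(mask, 0.0)
    pred = model(corrupted)
    loss = ((pred - spec) ** 2 * mask).mean()    # reconstruct only masked frames
    optimizer.zero_grad(); loss.backward(); optimizer.step()
```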

According to experts, AI like GPT-3 is considered very useful for analyzing animal “speech” for two reasons. First, self-supervised learning systems do not require human-labeled data, which is expensive and time-consuming to produce. Second, researchers do not know what the animals are “saying,” so it is impossible to create the labeled training data about animal sounds that conventional AI would need.

Kevin Coffey, a neuroscientist at the University of Washington, studies mouse vocalizations. He found that the animals produce calls of surprising complexity and believes those calls carry information, but exactly what that information means is still unclear to him. That is why Coffey created DeepSqueak, software that automatically records, labels and classifies animal calls. Thanks to this software, he can begin to gauge, to some extent, whether a mouse is feeling happy or bored.
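As a rough illustration of what automatic call detection involves (a greatly simplified stand-in, not DeepSqueak’s actual method), the sketch below flags time intervals where in-band spectrogram energy rises above a threshold; the band limits, threshold and sampling rate are made-up example values.

```python
# Simplified illustration of automatic call detection in a recording.
# Thresholds and band limits are made-up examples, not DeepSqueak's.
import numpy as np
from scipy.signal import spectrogram

def detect_calls(audio, sample_rate, band=(20_000, 100_000), threshold_db=-60):
    """Return (start_s, end_s) intervals where in-band energy exceeds a threshold."""
    freqs, times, power = spectrogram(audio, fs=sample_rate, nperseg=512)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    energy_db = 10 * np.log10(power[in_band].mean(axis=0) + 1e-12)
    active = energy_db > threshold_db              # frames that look like calls
    intervals, start = [], None
    for i, on in enumerate(active):
        if on and start is None:
            start = times[i]
        elif not on and start is not None:
            intervals.append((start, times[i])); start = None
    if start is not None:
        intervals.append((start, times[-1]))
    return intervals

# Example with synthetic data: 1 s of noise at a 250 kHz sampling rate,
# the kind of rate used for ultrasonic mouse vocalizations.
audio = np.random.randn(250_000)
print(detect_calls(audio, 250_000))
```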

Whale Safe, a non-profit project of the Benioff Ocean Initiative, deploys a car-sized buoy system offshore. On board, an AI system listens for whale calls so that passing ships can be warned, and it collects those calls to classify them and better understand the animals’ language.

“In this case, instead of listening for a human voice like Alexa or Siri, the system listens for brrrrrrr, gmmmm, awwrrrghgh,” said a Whale Safe representative. “Like a virtual assistant, the Whale Safe buoy system has to process the many different sounds of the whales, identify the calls of each species and eliminate background noise.”
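One small piece of such a pipeline might look like the hedged sketch below: band-pass filtering a hydrophone signal so that out-of-band noise is attenuated before a classifier examines the calls. The sampling rate, frequency band and filter design are illustrative assumptions, not Whale Safe’s actual processing chain.

```python
# Hedged sketch: reduce out-of-band background noise in a hydrophone signal
# before classification. All numbers are illustrative assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass(signal, sample_rate, low_hz=30, high_hz=900, order=4):
    """Keep a band where many whale calls sit; attenuate everything else."""
    sos = butter(order, [low_hz, high_hz], btype="bandpass",
                 fs=sample_rate, output="sos")
    return sosfiltfilt(sos, signal)

hydrophone = np.random.randn(10 * 4000)           # 10 s of stand-in audio at 4 kHz
clean = bandpass(hydrophone, sample_rate=4000)    # hand this to the call classifier
```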

Another AI-based system, BirdNET, developed by Cornell University and Chemnitz University of Technology in Germany, can now recognize the calls of more than 3,000 bird species. The app is available on mobile devices, letting more than two million regular users identify birds or track their migration routes.

However, the scientific community acknowledges that the current understanding of animal language extends only to analyzing sounds and recognizing behavioral patterns; it cannot yet decode the language in detail. “To understand what an animal is saying could take decades,” Etzioni said.

Bao Lam (based on WSJ)
