New computer vision model marks the first time research and conservation have come together at AI2.
Whether it is happening just off the coast or in some of the most remote areas of our planet, the illegal, unreported, and unregulated (IUU) fishing crisis — a major driver of our ocean’s decline — is often out of sight. Operating undetected, outside the reach of public monitoring systems, these “dark vessels” hide the illicit activities that are pillaging our seas, or worse. This is the crux of the IUU fishing crisis: to address and reduce it, the activity must first be exposed. Now, researchers, scientists, and maritime experts at the Allen Institute for AI (AI2) are leveraging state-of-the-art technology to shine a light on our seas for all to see.
As part of the xView3 challenge, members from across AI2 came together to combine satellite-based synthetic aperture radar (SAR) data with AI to identify vessels suspected of engaging in IUU fishing — the aim of the competition. The result of the team’s work was a cutting-edge computer vision algorithm that automatically detects and characterizes these “dark vessels.” Out of nearly 2,000 applicants across 67 countries who participated in the competition, sponsored by the Defense Innovation Unit (DIU) and Global Fishing Watch (GFW), the team’s algorithm finished fourth overall and first in the U.S. But this wasn’t the only noteworthy thing about the challenge.
“What made the xView3 competition so exciting was that it marked the first time computer vision research and conservation were brought together at AI2,” said Ani Kembhavi, who leads the Perceptual Reasoning and Interaction Research (PRIOR) group at AI2. “Over the past few years, we’ve seen a growing number of products, including many consumer applications, benefiting from AI technologies such as computer vision. Our aim is to also harness this power for common-good efforts in a wide variety of areas, including conservation and climate change.”
The team will now integrate the computer vision model built for the challenge into the Skylight product. The beta version, due out in the spring, is set to give governments, agencies, nonprofits, and academia new insights into “dark” fishing activity. Unlike automatic identification system (AIS) data — a system similar to how air traffic controllers track planes — SAR imagery doesn’t rely on ships to transmit their positions, and unlike optical imagery, it can locate vessels through cloud cover, day or night. With SAR coupled with the computer vision algorithm developed for the challenge, important gaps in a vessel’s activity can be filled.
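To make the idea concrete, the sketch below shows one simplified way SAR detections could be cross-referenced against AIS reports to surface candidate “dark vessels.” This is not the Skylight pipeline; the record types, thresholds, and function names are hypothetical and chosen purely for illustration.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt


@dataclass
class SarDetection:
    """A vessel detected in SAR imagery (hypothetical record)."""
    lat: float
    lon: float
    timestamp: float  # seconds since epoch


@dataclass
class AisReport:
    """A self-reported AIS position broadcast by a ship (hypothetical record)."""
    mmsi: str
    lat: float
    lon: float
    timestamp: float


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))


def flag_dark_vessels(detections, ais_reports, max_km=2.0, max_secs=3600):
    """Return SAR detections with no nearby, time-consistent AIS report --
    a crude proxy for vessels operating 'dark'."""
    dark = []
    for det in detections:
        matched = any(
            abs(det.timestamp - rep.timestamp) <= max_secs
            and haversine_km(det.lat, det.lon, rep.lat, rep.lon) <= max_km
            for rep in ais_reports
        )
        if not matched:
            dark.append(det)
    return dark
```

A real system would need to do far more, for example interpolating AIS tracks between reports, accounting for vessel speed, and tuning the distance and time thresholds to the satellite’s revisit rate, but the core idea of comparing what the radar sees against what ships report is the same.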
“The model developed by the computer vision researchers at AI2 in partnership with AI2’s Skylight team will be a game changer in the value Skylight offers to countries using the tool to understand IUU fishing activity in their waters,” said Skylight’s director, Ted Schmitt. “This is just the first example of how Skylight and the fight against IUU fishing will benefit from the world-class AI expertise at AI2.”
The potential for AI to advance the fight against the IUU fishing crisis has never been higher. Without AI, sifting through revolutionary data sources like SAR satellite imagery for something worthwhile to a maritime analyst would be worse than looking for a needle in a haystack. Today, satellites like the one used to collect SAR data for the xView3 competition can image 7 million square miles per day. In human terms, that means about 800 person-hours, or roughly four weeks for a team of analysts, just to dissect one day’s worth of SAR data. The hours and technical skills required to manually ingest SAR data and assure the quality of detections are another blocker that keeps this valuable information out of reach for maritime analysts, particularly those in developing countries. With computer vision algorithms like the one AI2’s team developed, the time spent analyzing a day of data drops to just 8 hours on a GPU — and Skylight aims to automate the time- and skill-intensive data ingestion process as well. However, the use of SAR imagery isn’t infallible.
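For a rough sense of scale, the figures above amount to about a hundredfold reduction in analysis time. The back-of-the-envelope sketch below simply restates the article’s numbers; it is not a measured benchmark.

```python
# Back-of-the-envelope comparison using the figures quoted above.
SQ_MILES_IMAGED_PER_DAY = 7_000_000   # SAR coverage collected per day
MANUAL_PERSON_HOURS = 800             # human effort to review one day of data
MODEL_GPU_HOURS = 8                   # model runtime for the same day of data

speedup = MANUAL_PERSON_HOURS / MODEL_GPU_HOURS
print(f"Manual review: {MANUAL_PERSON_HOURS} person-hours per day of imagery")
print(f"Model on a single GPU: {MODEL_GPU_HOURS} hours per day of imagery")
print(f"Roughly a {speedup:.0f}x reduction in analysis time")
```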
“SAR imagery lacks the richness of optical imagery that people are generally more familiar with from the ‘Satellite View’ of map platforms,” said Favyen Bastani, an applied research scientist on the PRIOR team. “In SAR data, the radar signatures of rocks, islands, and large off-shore buoys are often virtually identical to those of ships, making robust vessel detection especially challenging.”
For AI2, the xView3 challenge is a launch pad. Applying AI to develop the tech-for-good tools needed to solve some of the most pressing conservation issues is new for the institute. While it may be an unfamiliar realm, building AI for the common good is something AI2 has always done and will always do.