
Soft Machine: The Robot That Fools Fish Into Believing It’s One of Them

MIT scientists have developed a soft-skinned robotic fish that can study marine life up close and personal in coral reefs and other ecosystems without altering the animals’ behavior.

Written by Matthew O. Berger. Read time: approx. 4 minutes
Using its undulating tail and a unique ability to control its own buoyancy, the SoFi can swim in a straight line, turn or dive up or down. (Photo courtesy of MIT CSAIL)

As researchers develop increasingly sophisticated robotic devices to study regions of the ocean that are farther, deeper and darker, one advance has remained out of reach: invisibility.

A new tool, however, may come the closest yet to being able to blend in with the marine environment it is meant to study, inconspicuously gathering data on ocean animals without disrupting their behavior.

Called the SoFi, the soft, white, fish-like robot is 1.5ft (0.5m) long and sports a tail, fins and a camera embedded in its snout. Over several days of testing at Rainbow Reef off Fiji, the SoFi – developed by MIT’s Computer Science and Artificial Intelligence Laboratory – was able to swim within a meter of fish without causing them to flee. Several fish even swam within a few centimeters of the robot without changing their trajectories or reacting to the fake fish.

The findings, published Wednesday in the journal Science Robotics, could mark a breakthrough in underwater robotics. Autonomous gliders, remote-operated drones, wave-powered floats and other devices have proliferated in recent years as scientists have taken advantage of new technologies to lower research costs and reach understudied areas. But those devices are still often too big or too loud to closely observe coral reefs and other ecosystems unobtrusively.


“There’s been a lot of technological progress in the world of nature filmmaking, but it can still be very hard to document sea animals up close without disturbing them,” said Robert Katzschmann, a PhD candidate at CSAIL and lead author of the study, describing the new robot. “We wondered if it might be possible to develop a robot that was small, nimble and lightweight enough to swim in the ocean autonomously alongside marine animals.”

Their creation, the SoFi, propels itself by the undulation of its flexible fishlike tail. This lifelike movement gives the robot “the potential to be a new platform for studying and interacting with underwater species,” the authors wrote.

The fins and an internal buoyancy controller allow the SoFi – which is powered by a smartphone battery – to swim in three dimensions. To turn the robot, a motor pumps water into two chambers built into either side of its tail. When water flows into one chamber, the tail flexes and the SoFi turns in one direction; when actuators send water to the other chamber, the robot moves in the opposite direction.
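For readers who think in code, here is a minimal, purely illustrative sketch of that two-chamber pumping logic; the class, function names and flex angles below are hypothetical and are not taken from the MIT team’s software.

from dataclasses import dataclass
from enum import Enum


class Chamber(Enum):
    LEFT = "left"
    RIGHT = "right"


@dataclass
class TailState:
    filled_chamber: Chamber   # which side of the soft tail currently holds water
    flex_angle_deg: float     # resulting bend of the tail (sign encodes direction)


def pump_into(chamber: Chamber, flex_angle_deg: float = 20.0) -> TailState:
    """Route the pump's water into one tail chamber, flexing the tail toward that side."""
    sign = -1.0 if chamber is Chamber.LEFT else 1.0
    return TailState(chamber, sign * flex_angle_deg)


def swim_straight(cycles: int) -> list[TailState]:
    """Alternate chambers each cycle so the tail undulates symmetrically."""
    return [pump_into(Chamber.LEFT if i % 2 == 0 else Chamber.RIGHT)
            for i in range(cycles)]


def turn(direction: Chamber, cycles: int) -> list[TailState]:
    """Keep filling one chamber so the undulation is biased and the swim path curves."""
    return [pump_into(direction) for _ in range(cycles)]


if __name__ == "__main__":
    for state in swim_straight(4) + turn(Chamber.LEFT, 2):
        print(state)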

The SoFi can swim for about 40 minutes and is operated remotely by a human diver, who uses a waterproof Super Nintendo controller to communicate with the robot via acoustic signals. For now, the robot can operate in depths up to 60ft (18m) and the operator must be within 70ft of it.
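As a rough illustration of that control scheme, the sketch below maps gamepad-style buttons to one-byte command codes of the kind that could be sent over an acoustic link; the command names, codes and button layout are hypothetical and are not documented in the paper.

from enum import Enum


class Command(Enum):
    """Hypothetical command set; the real acoustic protocol is not described here."""
    SPEED_UP = 0x01
    SLOW_DOWN = 0x02
    TURN_LEFT = 0x03
    TURN_RIGHT = 0x04
    DIVE = 0x05
    ASCEND = 0x06


# Hypothetical mapping from controller buttons to commands.
BUTTON_MAP = {
    "up": Command.ASCEND,
    "down": Command.DIVE,
    "left": Command.TURN_LEFT,
    "right": Command.TURN_RIGHT,
    "a": Command.SPEED_UP,
    "b": Command.SLOW_DOWN,
}


def encode(command: Command) -> bytes:
    """Pack a command into a single byte for transmission (illustrative only)."""
    return bytes([command.value])


if __name__ == "__main__":
    print(encode(BUTTON_MAP["left"]))  # b'\x03'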

The team used a waterproof Super Nintendo controller to change SoFi’s speed and have it make specific moves and turns. (Photo Courtesy of MIT CSAIL)

But Katzschmann said there are already plans to build the next-generation SoFi. He would like the camera to lock onto and automatically follow particular fish and wants to improve the design and pump system to enable the robo-fish to swim faster. He also intends to test whether having multiple SoFis in the water affects fish behavior and whether this would allow the study of schools of fish over bigger areas.

So far, SoFi’s creators haven’t heard from researchers interested in deploying the new device. “We’re excited to hopefully work with biologists in the future to see how this might be able to help them conduct their research by gathering water samples or other data that might be hard to get otherwise,” said Katzschmann.

The SoFi is not the first robotic fish to be developed. MIT and other institutions have previously built robotic sharks, rays, snapper and tuna. A zebrafish robot created in 2015 was able to autonomously interact with schools of real fish. At the University of Washington, engineering professor Kristi Morgansen has developed three robo-fish.

Morgansen, who wasn’t involved in the new study and had seen only a synopsis, said it looked like the SoFi robot was bringing together elements of robotics that “have been used in various settings for a while,” but that the big advance is in using soft robotics to achieve fishlike locomotion.

SoFi’s lightweight setup includes a single camera, a motor and the same lithium polymer battery that can be found in consumer smartphones. (Image courtesy of MIT CSAIL)

“There’s definitely a lot of interest in using soft robotics right now,” she added. “If the goal is to study animals underwater … then you want something that moves as close to how they move as possible.”

A colleague of Morgansen’s studies penguins, she said, and tries to observe them by putting a diver with a camera in the water. “But penguins are very social, so they start playing with the diver,” she noted. For useful observations, a diver (or robot) would need to swim along undetected, watching how animals behave when they don’t know they’re being watched.
