These fish-inspired robots can synchronize their movements
without any outside control. Based on the simple production and
detection of LED light, the robotic collective exhibits complex
self-organized behaviors, including aggregation, dispersion and
circle formation. (Image courtesy of Self-organizing Systems
Research Group)
Schools of fish exhibit complex, synchronized behaviors that
help them find food, migrate and evade predators. No one fish or
team of fish coordinates these movements, nor do fish communicate
with each other about what to do next. Rather, these collective
behaviors emerge from so-called implicit coordination:
individual fish making decisions based on what they see their
neighbors doing.
This type of decentralized, autonomous self-organization and
coordination has long fascinated scientists, especially in the
field of robotics.
Now, a team of researchers at the Harvard John A. Paulson School
of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically
Inspired Engineering has developed fish-inspired robots that
can synchronize their movements like a real school of fish, without
any external control. It is the first time researchers have
demonstrated complex 3D collective behaviors with implicit
coordination in underwater robots.
"Robots are often deployed in areas that are inaccessible or
dangerous to humans, areas where human intervention might not even
be possible," said Florian Berlinger, a PhD candidate at SEAS and
Wyss and first author of the paper. "In these situations, it
really benefits you to have a highly autonomous robot swarm that is
self-sufficient. By using implicit rules and 3D visual perception,
we were able to create a system that has a high degree of autonomy
and flexibility underwater, where things like GPS and WiFi are not
accessible."
The research is published in Science
Robotics.
The fish-inspired robotic swarm, dubbed Blueswarm, was created
in the lab of Radhika
Nagpal, the Fred Kavli Professor of Computer Science at SEAS
and Associate Faculty Member at the Wyss Institute. Nagpal's lab
is a pioneer in self-organizing systems, from its 1,000-robot
Kilobot swarm to its termite-inspired robotic construction
crew.
However, most previous robotic swarms operated in two-dimensional
space. Three-dimensional spaces, like air and water, pose
significant challenges to sensing and locomotion.
To overcome these challenges, the researchers developed a
vision-based coordination system in their fish robots based on blue
LED lights. Each underwater robot, called a Bluebot, is equipped
with two cameras and three LED lights. The on-board fish-eye lens
cameras detect the LEDs of neighboring Bluebots and use a custom
algorithm to determine their distance, direction and heading. Based
algorithm to determine their distance, direction and heading. Based
on the simple production and detection of LED light, the
researchers demonstrated that the Blueswarm could exhibit complex
self-organized behaviors, including aggregation, dispersion and
circle formation.
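To give a rough sense of how detected LEDs could be turned into a distance and bearing estimate, here is a minimal Python sketch using a pinhole-camera model. The LED spacing, focal length, and function names are assumptions for illustration only, not the authors' published algorithm.

```python
import math

# Hypothetical constants for illustration (not the paper's values).
LED_SPACING_M = 0.05      # assumed vertical separation between two LEDs on a robot
FOCAL_LENGTH_PX = 300.0   # assumed camera focal length, in pixels

def estimate_neighbor(top_px, bottom_px):
    """Estimate distance and bearing to a neighbor from the pixel
    coordinates (x, y) of its two vertically separated LEDs, with pixel
    coordinates measured relative to the image center.

    Pinhole-camera approximation: the apparent pixel separation of the
    two LEDs shrinks in proportion to distance, and the horizontal
    offset of their midpoint gives the bearing.
    """
    (x_top, y_top), (x_bot, y_bot) = top_px, bottom_px
    pixel_separation = abs(y_bot - y_top)
    if pixel_separation == 0:
        return None  # degenerate detection, ignore this neighbor
    distance = FOCAL_LENGTH_PX * LED_SPACING_M / pixel_separation
    x_center = (x_top + x_bot) / 2.0
    bearing = math.atan2(x_center, FOCAL_LENGTH_PX)  # radians, 0 = straight ahead
    return distance, bearing

# Example: LEDs detected 25 px apart, centered 40 px right of the optical axis.
print(estimate_neighbor((40, 100), (40, 125)))
```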
"Each Bluebot implicitly reacts to its neighbors' positions,"
said Berlinger. "So, if we want the robots to aggregate, then
each Bluebot will calculate the position of each of its neighbors
and move towards the center. If we want the robots to disperse, the
Bluebots do the opposite. If we want them to swim as a school in a
circle, they are programmed to follow lights directly in front of
them in a clockwise direction."
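The quoted rules can be summarized in a few lines of Python. This is a minimal sketch of the described neighbor-centroid logic, not the published controller; the body-frame vector representation and the function names are assumptions.

```python
# Sketch of the described behaviors, assuming each robot holds a list of
# estimated neighbor positions in its own body frame (x forward, y to
# the right), in meters. Function names are illustrative only.

def aggregate_step(neighbors):
    """Move toward the centroid of all visible neighbors."""
    cx = sum(x for x, y in neighbors) / len(neighbors)
    cy = sum(y for x, y in neighbors) / len(neighbors)
    return cx, cy  # desired heading vector toward the group center

def disperse_step(neighbors):
    """Do the opposite: move away from the neighbor centroid."""
    cx, cy = aggregate_step(neighbors)
    return -cx, -cy

def circle_step(neighbors):
    """Follow the nearest light seen roughly ahead, biased clockwise."""
    ahead = [(x, y) for x, y in neighbors if x > 0 and abs(y) < x]
    if not ahead:
        return 1.0, 0.5  # no leader in view: arc gently to the right (clockwise)
    return min(ahead, key=lambda p: p[0] ** 2 + p[1] ** 2)

# Example: three neighbors seen ahead-left, ahead-right, and behind.
neighbors = [(1.0, -0.4), (0.8, 0.3), (-0.5, 0.1)]
print(aggregate_step(neighbors), disperse_step(neighbors), circle_step(neighbors))
```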
The researchers also simulated a simple search mission with a
red light in the tank. Using the dispersion algorithm, the Bluebots
spread out across the tank until one comes close enough to the
light source to detect it. Once the robot detects the light, its
LEDs begin to flash, which triggers the aggregation algorithm in
the rest of the school. From there, all the Bluebots aggregate
around the signaling robot.
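The search mission can be read as a small per-robot state machine layered on top of the behaviors sketched above. The version below reuses aggregate_step and disperse_step from the previous sketch, and the sensor checks detect_red_light and sees_flashing_neighbor are hypothetical stand-ins for the robots' actual on-board detection.

```python
# Per-robot state machine for the simulated search mission (a sketch).
# The boolean inputs stand in for whatever detection the real robots
# use; they are assumptions here, not the authors' implementation.

DISPERSE, SIGNAL, AGGREGATE = "disperse", "signal", "aggregate"

def next_state(state, detect_red_light, sees_flashing_neighbor):
    if state == DISPERSE and detect_red_light:
        return SIGNAL       # found the target: start flashing own LEDs
    if state == DISPERSE and sees_flashing_neighbor:
        return AGGREGATE    # another robot found it: converge on it
    return state            # otherwise keep the current behavior

def act(state, neighbors):
    if state == DISPERSE:
        return disperse_step(neighbors)
    if state == AGGREGATE:
        return aggregate_step(neighbors)
    return 0.0, 0.0          # SIGNAL: hold position and keep flashing
```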
"Our results with Blueswarm represent a significant milestone
in the investigation of underwater self-organized collective
behaviors," said Nagpal. "Insights from this research will help
us develop future miniature underwater swarms that can perform
environmental monitoring and search in visually rich but fragile
environments like coral reefs. This research also paves a way to
better understand fish schools by synthetically recreating their
behavior."
The research was co-authored by Dr. Melvin Gauci, a former Wyss
Technology Development Fellow. It was supported in part by the
Office of Naval Research, the Wyss Institute for Biologically
Inspired Engineering, and an Amazon AWS Research Award.
Originally published by Leah Burrows, Harvard John A. Paulson School of Engineering and Applied Sciences, January 13, 2021.