“Xenobots” are tiny robots made from living cells – self-healing swarms that can form bodies… and memories.

Eurekalert introduces the next generation of living robots. Tufts University biologists and computer scientists have created a second generation of tiny, biological machines they’re calling “xenobots”:

Compared to Xenobots 1.0, in which the millimeter-sized automatons were constructed in a “top down” approach by manual placement of tissue and surgical shaping of frog skin and cardiac cells to produce motion, the next version of Xenobots takes a “bottom up” approach. The biologists at Tufts took stem cells from embryos of the African frog Xenopus laevis (hence the name “Xenobots”) and allowed them to self-assemble and grow into spheroids, where some of the cells after a few days differentiated to produce cilia – tiny hair-like projections that move back and forth or rotate in a specific way. Instead of using manually sculpted cardiac cells whose natural rhythmic contractions allowed the original Xenobots to scuttle around, cilia give the new spheroidal bots “legs” to move them rapidly across a surface. In a frog, or human for that matter, cilia would normally be found on mucous surfaces, like in the lungs, to help push out pathogens and other foreign material. On the Xenobots, they are repurposed to provide rapid locomotion.

While the Tufts scientists created the physical organisms, scientists at UVM were busy running computer simulations that modeled different shapes of the Xenobots to see if they might exhibit different behaviors, both individually and in groups. Using the Deep Green supercomputer cluster at UVM’s Vermont Advanced Computing Core, the team, led by computer scientist and robotics expert Josh Bongard, simulated the Xenobots under hundreds of thousands of random environmental conditions using an evolutionary algorithm. These simulations were used to identify the Xenobots most able to work together in swarms to gather large piles of debris in a field of particles.
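The shape of such an evolutionary search can be sketched in a few lines. This is a toy illustration only, not the UVM team's code: the design vector, fitness function, and all parameters below are made-up stand-ins for the real physics simulation, but the loop shows the basic pattern of evaluating candidate designs under many random conditions, keeping the best, and mutating them.

```python
import random

def simulate_swarm(design, conditions):
    """Stand-in for the physics simulation: score one design under one
    random environmental condition. This made-up function simply rewards
    designs close to the (random) condition vector."""
    return -sum((d - c) ** 2 for d, c in zip(design, conditions))

def fitness(design, n_conditions=100):
    # Average performance across many random environments, echoing the
    # "hundreds of thousands of random environmental conditions" idea.
    return sum(
        simulate_swarm(design, [random.uniform(0, 1) for _ in design])
        for _ in range(n_conditions)
    ) / n_conditions

def evolve(pop_size=20, n_params=4, generations=50, mut_sigma=0.05):
    # Each candidate "design" is a small vector of shape parameters.
    population = [[random.uniform(0, 1) for _ in range(n_params)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]   # keep the best half
        children = [                        # mutate the survivors
            [max(0.0, min(1.0, p + random.gauss(0, mut_sigma)))
             for p in parent]
            for parent in parents
        ]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
```

The real system evaluates full soft-body physics simulations of swarms instead of this toy fitness function, but the select-and-mutate loop is the same idea.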

“We know the task, but it’s not at all obvious — for people — what a successful design should look like. That’s where the supercomputer comes in and searches over the space of all possible Xenobot swarms to find the swarm that does the job best,” says Bongard. “We want Xenobots to do useful work. Right now we’re giving them simple tasks, but ultimately we’re aiming for a new kind of living tool that could, for example, clean up microplastics in the ocean or contaminants in soil.”

It turns out, the new Xenobots are much faster and better at tasks such as garbage collection than last year’s model, working together in a swarm to sweep through a petri dish and gather larger piles of iron oxide particles.

A central feature of robotics is the ability to record memory and use that information to modify the robot’s actions and behavior. With that in mind, the Tufts scientists engineered the Xenobots with a read/write capability to record one bit of information, using a fluorescent reporter protein called EosFP, which normally glows green. However, when exposed to light at 390nm wavelength, the protein emits red light instead.

The cells of the frog embryos were injected with messenger RNA coding for the EosFP protein before stem cells were excised to create the Xenobots. The mature Xenobots now have a built-in fluorescent switch which can record exposure to light around 390nm.

The researchers tested the memory function by allowing 10 Xenobots to swim around a surface on which one spot was illuminated with a beam of 390nm light. After two hours, they found that three bots emitted red light. The rest remained their original green, effectively recording the “travel experience” of the bots.
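The logic of this experiment can be modeled as a one-bit write that flips permanently on first exposure. The following is a minimal sketch under loose assumptions (a random walk in an arbitrary square arena with a circular illuminated spot), not the actual experimental geometry or bot kinematics:

```python
import random

ARENA = 10.0         # square arena side length (arbitrary units)
SPOT = (5.0, 5.0)    # center of the illuminated 390nm spot (assumed)
SPOT_RADIUS = 1.0
STEPS = 2000         # stand-in for the two-hour swim

def run_bot():
    """One bot: a random walk. The green-to-red bit flips permanently
    the first time the bot wanders into the illuminated spot."""
    x, y = random.uniform(0, ARENA), random.uniform(0, ARENA)
    converted = False  # EosFP starts green; red once photoconverted
    for _ in range(STEPS):
        x = min(ARENA, max(0.0, x + random.gauss(0, 0.1)))
        y = min(ARENA, max(0.0, y + random.gauss(0, 0.1)))
        if (x - SPOT[0]) ** 2 + (y - SPOT[1]) ** 2 <= SPOT_RADIUS ** 2:
            converted = True  # write the bit: exposure is recorded
    return converted

red = sum(run_bot() for _ in range(10))
print(f"{red} of 10 bots recorded light exposure (glow red)")
```

Reading the bit back is just observing each bot's fluorescence color, which is why the count of red bots at the end encodes which bots visited the spot.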

This proof of principle of molecular memory could be extended in the future to detect and record not only light but also the presence of radioactive contamination, chemical pollutants, drugs, or a disease condition. Further engineering of the memory function could enable the recording of multiple stimuli (more bits of information) or allow the bots to release compounds or change behavior upon sensation of stimuli.

There’s video at the link and the Science Robotics paper here.