It’s fast. It’s clever. It’s a robot – and it’s about to cross the Atlantic.
Laden with 700 kg of scientific equipment, Mayflower is arguably the world’s most cutting-edge ship. Fully autonomous, solar-powered and without any crew, it’s getting ready to sail from Plymouth in the UK to Plymouth, Massachusetts, in the US. On its two-week, 3,000-mile journey, the sleek 15m-long trimaran will study the ocean, its inhabitants and the composition of its waters.
Its captain is an artificial intelligence, and it will navigate the seas with no human intervention whatsoever. The ship’s computer system runs on solar panels and lithium iron-phosphate batteries, which also power Mayflower’s electric propulsion motors. On-board cameras constantly scan the surroundings for threats, with the AI ‘captain’ having been trained to recognize a variety of objects, from swimmers to boats to animals.
While it has no human passengers, Mayflower’s scientific tools and sensors should shed more light on the mysteries of the ocean. Rosie Lickorish, a UK-based software engineer at IBM, is one of the researchers who has helped shape Mayflower’s science mission.
It’s not Rosie’s first ocean-related work – before the Mayflower Autonomous Ship project, she studied coral reefs in Mexico and later worked on climate modelling. With Mayflower, she can combine her passion for software development with helping to save the planet and understanding the ocean just a tiny bit more. After all, we’ve so far studied only about five percent of the vastness of the deep blue sea, despite it covering more than 70 percent of the Earth’s surface. We know Mars better than we know our own ocean – and that was true even before we started flying helicopters remotely on the Red Planet.
Mayflower can help. One of the ‘passengers’ – or shall I say robotic crew members? – is HyperTaste. Its designers, IBM researchers led by Patrick Ruch in Zurich, Switzerland, call it a kind of ‘electronic tongue.’ The size and shape of a citrus slice, it can ‘taste’ liquids and, unlike the human tongue, can be trained using AI to rapidly and autonomously determine their contents. It can even distinguish vintage red wines or other drinks from fake substitutes.
It won’t be tasting any booze on Mayflower though, as its goal there is perhaps somewhat more noble: to assess the health of the ocean and its role in climate change. After all, the ocean absorbs about a third of all carbon dioxide released into the atmosphere. Continuously rising levels of CO₂ lead to ocean acidification, with profound implications for marine life and the ocean carbon cycle. HyperTaste will help monitor these changes in ocean chemistry along the way.
From tiny to huge – while HyperTaste focuses on ocean chemistry, other tools on Mayflower will be listening for whale song underwater. Not only is this ‘robotic lab’ able to differentiate the sounds of whales from those of dolphins and ships, it can also estimate how many of these marine mammals it encounters along the way. The estimates should help us better understand the overall population of marine mammals in the ocean. Future generations of Mayflower could even carry underwater cameras eyeing the depths for as-yet-unknown species.
Further into the future, autonomous and conventional ships alike could get a helping hand in choosing the best route from point A to point B – from a quantum computer, once that technology comes of age.
But that’s the future. Today, let’s get ready to follow Mayflower’s progress on its journey from the UK to the US. And let’s hope it gets there without too many unforeseen adventures.