Science

How might JPL look for life on watery worlds? With the help of this slithering robot

Engineers at NASA’s Jet Propulsion Laboratory are taking artificial intelligence to the next level — by sending it into space disguised as a robotic snake.

With the sun beating down on JPL’s Mars Yard, the robot lifts its “head” off a glossy surface of faux ice to scan the world around it. It maps its surroundings, analyzes potential obstacles and chooses the safest path through a valley of fake boulders to the destination it has been instructed to reach.

EELS raises its head unit to scan its surroundings.

(Brian van der Brug / Los Angeles Times)

Once it has a plan in place, the 14-foot-long robot lowers its head, engages its 48 motors and slowly slithers forward. Its cautious movements are propelled by the clockwise or counterclockwise turns of the spiral connectors that link its 10 body segments, sending the robot in a specific direction. The entire time, sensors all along its body continue to reevaluate the environs, allowing the robot to make adjustments if needed.

JPL engineers have created spacecraft to orbit distant planets and built rovers that rumble around Mars as though they’re commuting to the office. But EELS — short for Exobiology Extant Life Surveyor — is designed to go places that have never been accessible to humans or robots before.

The lava tubes on the moon? EELS could scope out the underground tunnels, which may provide shelter to future astronauts.

The polar ice caps on Mars? EELS would be able to explore them and deploy instruments to collect chemical and structural information about the frozen carbon dioxide.

The liquid ocean beneath the frozen surface of Enceladus? EELS could tunnel its way there and look for evidence that the Saturnian moon might be hospitable to life.

“You’re talking about a snake robot that can do surface traversal on ice, go through holes and swim underwater — one robot that can conquer all three worlds,” said Rohan Thakker, a robotics technologist at JPL. “No one has done that before.”

And if things go according to plan, the slithering space explorer developed with grant money from Caltech will do all of these things autonomously, without having to wait for detailed commands from handlers at the NASA lab in La Cañada Flintridge. Though still years away from its first official deployment, EELS is already learning how to hone its decision-making skills so it can navigate even dangerous terrain independently.

Hiro Ono, leader of JPL’s Robotic Surface Mobility Group, started out seven years ago with a different vision for exploring Enceladus and another watery moon in orbit around Jupiter called Europa. He imagined a three-part system consisting of a surface module that generated power and communicated with Earth; a descent module that picked its way through a moon’s icy crust; and an autonomous underwater vehicle that explored the subsurface ocean.

EELS replaces all of that.

EELS has spiral treads for traction and multiple body segments for flexibility. Its design allows it to wiggle out of all sorts of tricky terrain.

(Brian van der Brug / Los Angeles Times)

Thanks to its serpentine anatomy, this new space explorer can go forward and backward in a straight line, slither like a snake, move its entire body like a windshield wiper, curl itself into a circle, and lift its head and tail. The result is a robot that can’t be stymied by deep craters, icy terrain or small spaces.

“The most interesting science is sometimes in places that are difficult to reach,” said Matt Robinson, the project manager for EELS. Rovers struggle with steep slopes and irregular surfaces. But a snake-like robot would be able to reach places such as an underground lunar cave or the near-vertical wall of a crater, he said.

The farther away a spacecraft is, the longer it takes for human commands to reach it. The rovers on Mars are remote-controlled by humans at JPL, and depending on the relative positions of Earth and Mars, it can take five to 20 minutes for messages to travel between them.

Enceladus, on the other hand, can be anywhere from 746 million to more than 1 billion miles from Earth. A radio transmission from way out there would take at least an hour to arrive, and perhaps as long as an hour and a half. If EELS found itself in jeopardy and needed human help to get out of it, its fate might be sealed by the time its SOS received a reply.
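The delays quoted here follow directly from dividing distance by the speed of light. A quick back-of-the-envelope check using the article's own distance figures (illustrative arithmetic only, not a JPL tool):

```python
# One-way radio delay is just distance divided by the speed of light.
# Distances below are the approximate figures quoted in the article.
C_MILES_PER_SEC = 186_282  # speed of light in miles per second

def one_way_delay_minutes(distance_miles: float) -> float:
    """Return the one-way radio travel time in minutes."""
    return distance_miles / C_MILES_PER_SEC / 60

print(round(one_way_delay_minutes(746e6)))   # Enceladus at its closest: ~67 minutes
print(round(one_way_delay_minutes(1.0e9)))   # at its farthest: ~89 minutes
```

A round trip, of course, doubles those numbers, which is why a distress call and its answer could bracket more than two hours.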

“Most people get frustrated when their video game has a few-second lag,” Robinson said. “Imagine controlling a spacecraft that’s in a dangerous area and has a 50-minute lag.”

That’s why EELS is learning how to make its own choices about getting from Point A to Point B.

EELS autonomy lead Rohan Thakker, second from left, confers with engineers as they put the robot through its paces.

(Brian van der Brug / Los Angeles Times)

A computer screen shows EELS’ actual position as compared with its programmed position.

(Brian van der Brug / Los Angeles Times)

Teaching the robot how to assess its environment and make decisions quickly is a multi-step process.

First, EELS is taught to be safe. With the help of software that calculates the probability of failures — such as crashing into something or getting stuck — EELS is learning to identify potentially dangerous situations. For instance, it is figuring out that when something like fog interferes with its ability to map the world around it, it should respond by proceeding more cautiously, said Thakker, the autonomy lead for the project.
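One common way to frame this kind of safety-first planning is to weigh each candidate path's cost against its estimated probability of failure. The sketch below is purely illustrative, with made-up names and numbers, and is not JPL's software:

```python
# Hypothetical risk-aware path selection: choose the candidate path with the
# lowest expected cost, where failure probability is penalized heavily.
# All names and values here are illustrative, not EELS' actual planner.
def pick_path(paths, failure_penalty=100.0):
    """paths: list of (name, travel_cost, failure_probability) tuples."""
    def expected_cost(path):
        _, cost, p_fail = path
        return cost + p_fail * failure_penalty
    return min(paths, key=expected_cost)[0]

candidates = [
    ("direct over ice ridge", 5.0, 0.30),    # short but risky
    ("detour around boulders", 12.0, 0.02),  # longer but safe
]
print(pick_path(candidates))  # → detour around boulders
```

Under this framing, degraded sensing (fog, for instance) can be modeled as raising the failure penalty or the estimated failure probabilities, which naturally pushes the planner toward more cautious routes.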

It also relies on its array of built-in sensors. Some can detect a change in its orientation with respect to gravity — the robot equivalent of feeling as though you’re falling. Others measure the stability of the ground and can tell whether hard ice suddenly turns into loose snow, so that EELS can maneuver itself to a more navigable surface, Thakker said.

In addition, EELS is able to incorporate past experiences into its decision-making process — in other words, it learns. But it does so a little differently than a typical robot powered by artificial intelligence.

For example, if an AI robot were to spot a puddle of water, it might investigate a bit before jumping in. The next time it encountered a puddle, it would recognize it, remember that it was safe and jump in.

But that could be deadly in a dynamic environment. Thanks to EELS’ extra programming, it would know to evaluate the puddle every single time — just because it was safe once doesn’t guarantee it will be safe again.
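The distinction the article draws, between trusting a remembered observation and re-checking it every time, can be sketched as two toy agents. The code is entirely illustrative; class names like `CachingAgent` are mine, not JPL's:

```python
# Toy contrast: an agent that caches "this was safe" versus one that
# re-evaluates on every encounter. Illustrative only.
class CachingAgent:
    def __init__(self):
        self.known_safe = set()

    def should_enter(self, feature, sense_now):
        if feature in self.known_safe:
            return True               # trusts the old observation
        if sense_now():               # sense_now() returns True if safe right now
            self.known_safe.add(feature)
            return True
        return False

class ReevaluatingAgent:
    def should_enter(self, feature, sense_now):
        return sense_now()            # checks every single time

# A puddle that was safe yesterday but is dangerous today:
cacher = CachingAgent()
cacher.should_enter("puddle", lambda: True)          # safe yesterday, cached
print(cacher.should_enter("puddle", lambda: False))  # True: wrongly trusts the cache
print(ReevaluatingAgent().should_enter("puddle", lambda: False))  # False: stays out
```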

In the Mars Yard, a half-acre rocky sandbox used to test rovers, Thakker and the team assign EELS a specific destination. Then it’s up to the robot to use its sensors to scan the world around it and plot the best path forward, whether it’s directly in the dirt or on white mats made to mimic ice.

JPL engineers tested EELS on glossy mats that served as a stand-in for ice.

(Brian van der Brug / Los Angeles Times)

It’s similar to the navigation of a self-driving car, except there are no stop signs or speed limits to help EELS develop its strategy, Thakker said.

EELS has also been tested on an ice rink, on a glacier and in snow. With its spiral treads for traction and multiple body segments for flexibility, it can wiggle itself out of all sorts of tricky terrain.

Mechanical engineer Sarah Yaericks holds the emergency stop as EELS moves around obstacles in the Mars Yard at JPL.

(Brian van der Brug / Los Angeles Times)

The robot isn’t the only one learning. As its human handlers monitor EELS’ progress, they adjust its software to help it make better assessments of its surroundings, Robinson said.

“It’s not like an equation you can just solve,” Ono said. “It can often be more of an art than science. … A lot comes from experience.”

The goal is for EELS to gain enough experience to be sent out on its own in any kind of setting.

“We aren’t there yet,” Ono said. But EELS’ recent advances amount to “one small step for the robot and one large step for mankind.”


LAX passenger arrested after running onto tarmac, police say

A Los Angeles International Airport passenger was arrested early Saturday morning after he became irate and ran out of Terminal 4 onto the tarmac, according to airport police.

The passenger appeared to be experiencing a mental health crisis, said Capt. Karla Rodriguez. “Police responded and during their attempt in taking the suspect into custody, a use of force occurred,” she said.

The man, who was not identified, was arrested on suspicion of battery against a police officer and trespassing on airport property, she said. He was taken to a nearby hospital for a mental health evaluation.

A video obtained by CBS shows a shirtless man in black shorts running on the tarmac past an American Airlines jetliner with a police officer in pursuit. The officer soon tackles the man and pushes him down on the pavement.

Video: How SpaceX Is Harming Delicate Ecosystems


On at least 19 occasions since 2019, SpaceX’s operations have caused fires, leaks and explosions near its launch site in Boca Chica, Texas. These incidents reflect a broader debate over how to balance technological and economic progress against protections of delicate ecosystems and local communities. The New York Times investigative reporter Eric Lipton explains.

Live poultry markets may be source of bird flu virus in San Francisco wastewater


Federal officials suspect that live bird markets in San Francisco may be the source of bird flu virus in area wastewater samples.

Days after health monitors reported the discovery of suspected avian flu viral particles in wastewater treatment plants, federal officials announced that they were looking at poultry markets near the treatment facilities.

Last month, San Francisco Public Health Department officials reported that state investigators had detected H5N1 — the avian flu subtype making its way through U.S. cattle, domestic poultry and wild birds — in two chickens at a live market in May. They also noted they had discovered the virus in city wastewater samples collected during that period.

Two new “hits” of the virus were recorded from wastewater samples collected June 18 and June 26 by WastewaterSCAN, an infectious-disease monitoring network run by researchers at Stanford, Emory University and Verily, Alphabet Inc.’s life sciences organization.

Nirav Shah, principal deputy director of the U.S. Centers for Disease Control and Prevention, said that although the source of the virus in those samples has not been determined, live poultry markets were a potential culprit.

Hits of the virus were also discovered in wastewater samples from the Bay Area cities of Palo Alto and Richmond. It is unclear whether those cities host live bird markets, which are stores where customers can take a live bird home or have it processed on-site for food.

Steve Lyle, a spokesman for the state’s Department of Food and Agriculture, said live bird markets undergo regular testing for avian influenza.

He said that aside from the May 9 detection in San Francisco, there have been no “other positives in Live Bird Markets throughout the state during this present outbreak” of highly pathogenic avian flu.

San Francisco’s health department referred all questions to the state.

Even if the state or city had missed a few infected birds, John Korslund, a retired U.S. Department of Agriculture veterinarian epidemiologist, was skeptical that so few birds could produce a positive hit in the city’s wastewater.

“Unless you’ve got huge amounts of infected birds — in which case you ought to have some dead birds, too — it’d take a lot of bird poop” to become detectable in a city’s wastewater system, he said.

“But the question still remains: Has anyone done sequencing?” he said. “It makes me want to tear my hair out.”

He said genetic sequencing would help health officials determine the origin of viral particles — whether they came from dairy milk, or from wild birds. Some epidemiologists have voiced concerns about the spread of H5N1 among dairy cows, because the animals could act as a vessel in which bird and human viruses could interact.

However, Alexandria Boehm, professor of civil and environmental engineering at Stanford University and principal investigator and program director for WastewaterSCAN, said her organization is not yet “able to reliably sequence H5 influenza in wastewater. We are working on it, but the methods are not good enough for prime time yet.”

A review of businesses around San Francisco’s southeast wastewater treatment facility turns up a dairy processing plant as well as a warehouse store for a “member-supported community of people that feed raw or cooked fresh food diets to their pets.”
