For the past 15 years, NASA’s Mars Reconnaissance Orbiter has been studying the climate and geology of the red planet. Each day, the orbiter sends back a treasure trove of images and other sensor data that NASA scientists have used to scout safe landing sites for rovers and to understand the distribution of water ice on the planet. Of particular interest to scientists are the orbiter’s photographs of craters, which may provide a window into the planet’s deep history. NASA engineers are still working on a mission to return samples from Mars; without rocks that would let them check remote sensing data against conditions on the surface, they have to make a lot of educated guesses when determining the age and composition of each crater.
For now, they need other methods to extract that information. One tried-and-true approach is to extrapolate the ages of a planet’s older craters from the characteristics of its newest ones. Since scientists can date some recent impact sites to within a few years — sometimes within a few weeks — they can use them as a baseline for determining the age and composition of older craters. The problem is that combing through a planet’s worth of image data in search of the telltale signs of fresh impacts is a tedious task. But it’s exactly the kind of problem AI was made to solve.
At the end of last year, NASA researchers used machine-learning algorithms to search for fresh Martian craters for the first time. The AI discovered dozens of them hidden in image data from the Mars Reconnaissance Orbiter, and it suggests a new way of studying the planets in our solar system. “From a science perspective, it’s exciting because it’s increasing our knowledge of those characteristics,” says Kiri Wagstaff, a computer scientist at NASA’s Jet Propulsion Laboratory and a leader of the research team. “The data was there all the time, it’s just that we didn’t see it ourselves.”
The Mars Reconnaissance Orbiter carries three cameras, but Wagstaff and her colleagues trained their AI using only images from the Context and HiRISE imagers. Context is a relatively low-resolution grayscale camera, while HiRISE uses the largest reflecting telescope ever sent into deep space to produce images with roughly three times higher resolution than those used on Google Maps.
First, the AI was fed nearly 7,000 orbiter photographs of Mars — some with previously discovered craters and some without — to teach the algorithm how to detect a fresh strike. Once the classifier could accurately locate craters in the training set, Wagstaff and her team loaded the algorithm onto a supercomputer at the Jet Propulsion Laboratory and used it to comb through a database of more than 112,000 images from the orbiter.
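To make the train-then-classify idea concrete, here is a toy sketch of the workflow the article describes: a binary classifier learns from labeled image patches (crater vs. no crater) and is then applied to new patches. Everything here is invented for illustration — synthetic 16×16 grayscale patches and a plain logistic-regression classifier, not NASA’s actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_patch(has_crater: bool, size: int = 16) -> np.ndarray:
    """Synthetic grayscale patch: bright terrain, with a dark central
    blotch standing in for a fresh impact when has_crater is True."""
    patch = rng.normal(0.8, 0.05, (size, size))
    if has_crater:
        c = size // 2
        patch[c - 2:c + 2, c - 2:c + 2] = rng.normal(0.2, 0.05, (4, 4))
    return np.clip(patch, 0.0, 1.0)

# Labeled training set, analogous to the "pre-discovered craters" images.
X = np.stack([make_patch(i % 2 == 1).ravel() for i in range(200)])
y = np.array([i % 2 for i in range(200)], dtype=float)  # 1 = crater

# Logistic regression trained by gradient descent.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

def predict(patch: np.ndarray) -> int:
    """1 if the trained model thinks the patch contains a crater."""
    return int(1.0 / (1.0 + np.exp(-(patch.ravel() @ w + b))) > 0.5)
```

In the real pipeline the classifier was a neural network run over orbiter imagery on a supercomputer; the point here is only the shape of the process — labeled examples in, a reusable detector out.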
“There’s nothing new in the underlying machine-learning technology,” says Wagstaff. “We used a classic convolutional network to analyze the image data, but being able to apply it at scale is still a challenge. That was one of the things we had to wrestle with here.”
The most recent craters on Mars are small and may be only a few feet across, which means they appear as dark pixelated blotches in Context images. If the algorithm compares the image of a candidate crater with an earlier picture of the same region and finds that the dark patch is missing from it, there is a good chance a new crater has been found. The date of the earlier image also helps establish when the impact occurred.
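That before-and-after comparison can be sketched as a simple change-detection test: flag a region when enough pixels are markedly darker in the newer image than in the older one. This is an illustrative sketch with made-up thresholds, not the mission’s actual detector.

```python
import numpy as np

def new_dark_patch(before: np.ndarray, after: np.ndarray,
                   darkening: float = 0.3, min_pixels: int = 4) -> bool:
    """Candidate fresh impact: enough pixels are markedly darker in the
    newer image than in the older image of the same region."""
    diff = before.astype(float) - after.astype(float)  # positive where darkened
    return int((diff > darkening).sum()) >= min_pixels

# Toy "same region" pair: uniform bright terrain, then a dark blotch appears.
before = np.full((16, 16), 0.8)
after = before.copy()
after[6:10, 6:10] = 0.2  # simulated fresh impact blotch
```

Because the test needs a prior image of the same spot, the date of that prior image bounds how old the impact can be — the same reasoning the article describes.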
Once the AI had identified some promising candidates, NASA researchers were able to use the orbiter’s high-resolution camera to confirm that the craters were really there. Last August, the team got its first confirmation when the orbiter photographed a cluster of craters identified by the algorithm. It was the first time an AI had discovered a crater on another planet. “There was no guarantee there would be anything new,” Wagstaff says. “But there were so many of them, and one of our big questions is, what made these craters hard to find?”