by Rod Pyle
And that was that. There was a Q&A session, but Grotzinger had summed it up: we have found something that is interesting and might be important. If it does not turn out to be something that hitched a ride from Earth, if the measurement can be repeated, and if it does not turn out to be something that fell from the sky rather than something of true Martian origin, it will be very interesting. If all those conditions hold, we would then have to figure out whether or not the measured organic material is of biological origin. A lot of conditions need to be satisfied, by a lot of people working on a lot of instruments, with repeatability, to get a definitive answer. If that does happen, we will let you know. But it takes time, it takes precision, and it takes patience. So please try to embrace the latter.
The “Great Martian Mystery” was over, and the press walked—if they had been responsible in their reportage—or slunk, if they had not—back to file their stories. It was interesting to watch the responses. Some outlets simply reported the day's proceedings as if they themselves had never been a part of the frenzy, others related it to the previous week's ride, and still others made it all sound like NASA's fault. The media can be an ungrateful and mean-spirited bunch. Fortunately, the last group was small and made up mostly of lesser outlets.
But there would be more media quicksand ahead for Curiosity.
PRESS RELEASE
12.03.2012
Source: Jet Propulsion Laboratory
NASA Mars Rover Fully Analyzes
First Martian Soil Samples
PASADENA, Calif. —NASA's Mars Curiosity rover has used its full array of instruments to analyze Martian soil for the first time, and found a complex chemistry within the Martian soil.
Water and sulfur and chlorine-containing substances, among other ingredients, showed up in samples Curiosity's arm delivered to an analytical laboratory inside the rover.
Detection of the substances during this early phase of the mission demonstrates the laboratory's capability to analyze diverse soil and rock samples over the next two years. Scientists also have been verifying the capabilities of the rover's instruments.
Curiosity is the first Mars rover able to scoop soil into analytical instruments. The specific soil sample came from a drift of windblown dust and sand called “Rocknest.” The site lies in a relatively flat part of Gale Crater still miles away from the rover's main destination on the slope of a mountain called Mount Sharp. The rover's laboratory includes the Sample Analysis at Mars (SAM) suite and the Chemistry and Mineralogy (CheMin) instrument. SAM used three methods to analyze gases given off from the dusty sand when it was heated in a tiny oven. One class of substances SAM checks for is organic compounds—carbon-containing chemicals that can be ingredients for life.
“We have no definitive detection of Martian organics at this point, but we will keep looking in the diverse environments of Gale Crater,” said SAM Principal Investigator Paul Mahaffy of NASA's Goddard Space Flight Center in Greenbelt, Md.
Curiosity's APXS instrument and the Mars Hand Lens Imager (MAHLI) camera on the rover's arm confirmed Rocknest has chemical-element composition and textural appearance similar to sites visited by earlier NASA Mars rovers Pathfinder, Spirit and Opportunity.
Curiosity's team selected Rocknest as the first scooping site because it has fine sand particles suited for scrubbing interior surfaces of the arm's sample-handling chambers. Sand was vibrated inside the chambers to remove residue from Earth. MAHLI close-up images of Rocknest show a dust-coated crust one or two sand grains thick, covering dark, finer sand.
“Active drifts on Mars look darker on the surface,” said MAHLI Principal Investigator Ken Edgett, of Malin Space Science Systems in San Diego. “This is an older drift that has had time to be inactive, letting the crust form and dust accumulate on it.”
CheMin's examination of Rocknest samples found the composition is about half common volcanic minerals and half noncrystalline materials such as glass. SAM added information about ingredients present in much lower concentrations and about ratios of isotopes. Isotopes are different forms of the same element and can provide clues about environmental changes. The water seen by SAM does not mean the drift was wet. Water molecules bound to grains of sand or dust are not unusual, but the quantity seen was higher than anticipated.
SAM tentatively identified the oxygen and chlorine compound perchlorate. This is a reactive chemical previously found in arctic Martian soil by NASA's Phoenix Lander. Reactions with other chemicals heated in SAM formed chlorinated methane compounds—one-carbon organics that were detected by the instrument. The chlorine is of Martian origin, but it is possible the carbon may be of Earth origin, carried by Curiosity and detected by SAM's high sensitivity design.
“We used almost every part of our science payload examining this drift,” said Curiosity Project Scientist John Grotzinger of the California Institute of Technology in Pasadena. “The synergies of the instruments and richness of the data sets give us great promise for using them at the mission's main science destination on Mount Sharp.”
NASA's Mars Science Laboratory Project is using Curiosity to assess whether areas inside Gale Crater ever offered a habitable environment for microbes. NASA's Jet Propulsion Laboratory in Pasadena, a division of Caltech, manages the project for NASA's Science Mission Directorate in Washington, and built Curiosity.
For more information about Curiosity and other Mars missions, visit: http://www.nasa.gov/mars.
Fig. 25.1. FROM THE PRESS CONFERENCE: The Mars Hand Lens Imager (MAHLI) on NASA's Mars rover Curiosity acquired close-up views of sands in the “Rocknest” wind drift to document the nature of the material that the rover scooped, sieved, and delivered to the Chemistry and Mineralogy Experiment (CheMin) and the Sample Analysis at Mars (SAM) in October and November 2012. Image from NASA/JPL-Caltech/MSSS.
When people think about driving a Mars rover, some see an image of a scruffy young fellow joysticking to a video-screen spread of Mars as speed-metal plays at blistering volume through his Dre Beats. Others see Big Science (cue the reverb) at a mission control center with endless consoles, blinky lights from a rerun of Lost in Space, and white lab coats (for the Millennials among you, I don't have an equivalent, because thankfully these kinds of goofy props went away in the 1990s…except for maybe Doctor Who sets). But, as we have seen, neither is quite how it works. The signal delay from Mars is profound even at the best of times, averaging about twelve minutes each way. Real-time driving is an exercise in patience, some frustration, and slow progress. That's why Curiosity has a high degree of autonomy and some very sophisticated software and computing capabilities that allow it to plot its own course, avoid obstacles, and even begin science operations once it is near a target.
However, there is still an enormous human component. As we learned during the tactical-planning meeting we sat in on, lots of thinking and discussion must take place before the smallest move, and an almost-unbelievable attention to detail precedes every action. A seemingly tiny item, if overlooked, can cause a cascade of consequences, and nobody wants to be responsible for that. So there will be no real-time bumper-car driving, thank you.
So who drives the rover? What does it entail? What about the robotic arm? How do you tell it where to go and what to do? All fine questions, and I had the good fortune to talk to a couple of the stars in JPL's driving and operations team to get the answers.
Brian Cooper is one of the originals—it's said he has the first Martian driver's license, since he drove the Sojourner rover during the Pathfinder mission. He's in his forties, is married, and has one daughter whose second middle name is Sojourner. Now that's company spirit. He has been at the lab a very long time, and in his spare time he studies computer gaming code and plays the games as well (you gotta be well-rounded, after all). We talked in front of a computer that he uses to write program code to enable Curiosity's drives across Mars.
“I was the first rover driver on Sojourner, so I guess I'm one of the originals,” he cracks. “I also worked on MER, Spirit and Opportunity. That was great. At some point I stopped for a couple of years to develop this tool we are looking at, that is, to modify the tools we are using to drive the rover. It's called RSVP, which stands for Rover Sequencing and Visualization Program.”
Fig. 26.1. RSVP: It's not a party invite—it's software that Curiosity's drivers use to program the rover's travels across Mars. In this image, the rover is superimposed over a photo of the nearby terrain, with lines leading back from its wheels to indicate the track along which it came. Image from NASA/JPL-Caltech.
I ask him what a typical day is like for a rover driver: “Well, we have this notion of a yellow brick road, if you will—where we are and where we want to go. What I look at every day is these two screens. One is a screen that has commands that we send the rover—we can type in commands and values that get turned into binary bits and are sent up to the rover. This is where they happen…” He points out computer software that looks like something from the Windows 3.1 era but in fact runs on a modern Unix system. It is not fancy looking because there is no reason for it to be. “And over here is [a] dictionary that has everything that we can tell the rover to do. There are many thousands of commands.”
I look over the scrolling list that would probably stretch out to the parking lot if printed. How the heck do they keep track of what command to use? “We use this search box here”—that should have been obvious, I suppose—“and it defines certain classes of commands, for instance, mobility. You can also constrain things you are seeking.” It still looks complicated, but then again, I have no background to judge by. Is it a tough learn for the newbies? “Well, before anyone can sit in the seat, they have had a lot of training. So it's not often that you come upon a command that you haven't used before.”
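For readers who like to see such things spelled out, the flavor of that search can be captured in a few lines of Python. Everything below is invented for illustration: the command names, the categories, and the little search function. The real dictionary holds many thousands of flight-approved commands with far richer definitions.

    # A toy command dictionary with a small search helper, loosely in the spirit
    # of the searchable command list described above. All names here are made up.
    COMMANDS = {
        "MOB_DRIVE_ARC":     {"category": "mobility", "doc": "drive along an arc"},
        "MOB_TURN_IN_PLACE": {"category": "mobility", "doc": "rotate without driving"},
        "ARM_MOVE_JOINTS":   {"category": "arm",      "doc": "move the arm to given joint angles"},
        "CAM_ACQUIRE_IMAGE": {"category": "imaging",  "doc": "take an image with a selected camera"},
    }

    def search_commands(keyword="", category=None):
        """Return command names whose name or description matches the keyword,
        optionally constrained to one class of commands (e.g., 'mobility')."""
        keyword = keyword.lower()
        return [
            name for name, info in COMMANDS.items()
            if (category is None or info["category"] == category)
            and (keyword in name.lower() or keyword in info["doc"].lower())
        ]

    print(search_commands("drive", category="mobility"))  # ['MOB_DRIVE_ARC']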
As the programmers come up with new commands for new needs, they get added to the huge list. It's a pretty elegant and almost-simple setup once you understand what it is designed to do. Of course, just because it appears to be straightforward from the operator's point of view does not mean that the process is not immensely complex.
I ask him to show me how they prepare for the daily drive, usually scheduled for the following day: “Well, what we have here is a 3D representation of the rover as it is right now.” He points to a virtual rendering of the rover sitting on the part of Gale Crater it is currently driving across. You can zoom in or out, see an overhead view, look from the POV of the rover, and much more. It is very cool. “If you look here you see the tracks that the rover made on the ground during yesterday's drive.” I ask how they understand the true shape of the surface. “These surface maps are created by taking images from various cameras on the rover, which are provided to us in stereo pairs, and then you can run machine algorithms on them to determine the depth of the terrain.” This is done by matching corresponding features in the left and right images and measuring their apparent offset, with complex software designed specifically for this task. “Then, once you have that, you build up a mesh and then colorize it. There's also orbital data in there. This gives us the view from up close and then we can switch to other views that we might've loaded.” OK, got it.
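The underlying geometry is simple enough to sketch in Python. The numbers below are stand-ins rather than the rover's actual camera calibration, but they show how a pixel offset between a stereo pair becomes a distance:

    import numpy as np

    # Illustrative values only; the real cameras have their own calibration.
    BASELINE_M = 0.42   # assumed separation between the two camera lenses, in meters
    FOCAL_PX = 1200.0   # assumed focal length, expressed in pixels

    def depth_from_disparity(disparity_px):
        """Convert a stereo disparity (how far a feature shifts between the left
        and right images, in pixels) into distance from the cameras, using the
        pinhole relation depth = focal_length * baseline / disparity."""
        disparity_px = np.asarray(disparity_px, dtype=float)
        depth = np.full_like(disparity_px, np.inf)   # zero disparity: too far to tell
        nonzero = disparity_px > 0
        depth[nonzero] = FOCAL_PX * BASELINE_M / disparity_px[nonzero]
        return depth

    # A rock that shifts 30 pixels between the two images is about 16.8 meters away.
    print(depth_from_disparity([30.0, 60.0, 5.0]))   # [16.8, 8.4, 100.8]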
It is not surprising when I later find out that he has experience in, and a great passion for, computer gaming. The software is not dissimilar. “We're certainly not state-of-the-art compared to the latest game engines,” he says. “But I've always had an interest and expertise in computer graphics, and I like games, so I've followed that research over the years. I tend to think in terms of imagery and spatial relations, so for me this is intrinsically fun.” He points back to the command-line-driven software he showed me earlier. “On the other hand, the author of the RoSE [for Rover Sequence Editor: it controls the graphical user interface, or GUI] program—right here—is a dual computer science and English major. He's not fascinated by the 3D imagery, he's fascinated by words. He likes prose and syntax, and what he has created is very elegant.” It truly takes all kinds, and not everyone here started as a hard-core space guy. I'd guess that roughly half of the people I've met at JPL felt destined to do this since childhood; the rest came to it somewhere late in college or thereafter.
He begins to plot out tomorrow's drive as I watch. “This software is an improved version of what we used on MER. For instance, having this terrain model from the HiRISE orbiter camera is a new feature.” HiRISE is the high-resolution camera on the Mars Reconnaissance Orbiter. “We didn't have that on MER, so their maps had a flat plane instead of a 3D model. So this improvement is very useful, as it tells us where we've been and where we are going, and things we want to avoid, like craters. I can exaggerate the terrain up to five times, which is kind of wacky, but if we are not quite sure of what the terrain ahead is like you can exaggerate it and instantly see certain features.” He moves the mouse, and the terrain suddenly bulges from normal-looking to some kind of psychedelic cartoon—I wonder if this is what my classmates in high school who spent so much time in semicatatonic drugged states experienced. It would explain a few things. “Then you can see where the hazards are. So the shading is based on whichever camera they're using and the time of day and lighting and so forth.” It is truly amazing software, given that it is utilizing data from an orbiter a couple hundred miles up and from small, ground-based cameras on the rover.
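The exaggeration trick itself is nothing exotic. It amounts to stretching the heights of the terrain model about their average, roughly as in this sketch, with a made-up height map:

    import numpy as np

    def exaggerate_terrain(heights, factor=5.0):
        """Stretch terrain heights about their mean so subtle relief pops out.
        Purely a display trick; the underlying map is unchanged."""
        heights = np.asarray(heights, dtype=float)
        mean = heights.mean()
        return mean + factor * (heights - mean)

    # A nearly flat patch (heights in meters) becomes visibly lumpy at 5x.
    print(exaggerate_terrain([[0.00, 0.02], [0.05, 0.01]]))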
But just knowing—and being able to see—the terrain is not really enough. Rovers are very sensitive to the slopes and angles the landscape imparts to the machine, so the drivers also need to have an idea of how the rover will interact with it. I ask how this is accomplished. It took him a bit of time to explain it to me, but I'll do my best to condense it. It's cool and worth it, so bear with me.
Brian changes views on the screen, and what is up now is a wider view of the 3D model of the terrain ahead of Curiosity. The virtual rover is no longer in view and has been replaced by a little red cone, which he points to. “This little 3D cone is called the point normal cursor. So you turn it on, and when you hit this feature”—he performs a lightning-fast keystroke—“you can drag the cursor around the landscape, and it actually hugs the terrain.” It looks like an inverted traffic cone as he drags it from point to point, and wherever he “drops” it, the cone magnetically snaps to the nearest surface feature. “If there is a rock or an incline, it will tilt. It's defining the normal to the surface.” Wherever the cone gets placed, it deviates from vertical to indicate the severity of the slope it is placed on. “The important thing is that we can get a sense of where the rover is and what the terrain around it is like, and whether it is safe to drive. So I can move it around…notice how it's changing its tilt as I move? We're going up a little slope here. It's kind of like being a kid with a Tonka truck. This is very useful in particular for planning arm motion so we can place the arm sensors right on these targets.” He can define a spot where the cone has been and drop what looks like a smartphone app pushpin. “It's another way that we communicate between the science team and the rover planners.”
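What “defining the normal to the surface” means can be shown with a little vector math. This is only an illustration of the idea, not JPL's code, and the sample points are invented:

    import numpy as np

    def surface_normal(p1, p2, p3):
        """Unit normal of the triangle formed by three nearby terrain points,
        e.g., vertices of the terrain mesh around the cursor position."""
        n = np.cross(np.subtract(p2, p1), np.subtract(p3, p1))
        return n / np.linalg.norm(n)

    def tilt_degrees(normal):
        """Angle between the local surface normal and vertical: zero on flat
        ground, growing as the cone sits on steeper slopes or rocks."""
        up = np.array([0.0, 0.0, 1.0])
        cos_tilt = abs(np.dot(normal, up))
        return float(np.degrees(np.arccos(np.clip(cos_tilt, -1.0, 1.0))))

    # Three points on a gentle rise of 0.2 m over 1 m of travel.
    n = surface_normal((0, 0, 0.0), (1, 0, 0.2), (0, 1, 0.0))
    print(round(tilt_degrees(n), 1))   # about 11.3 degrees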
As he demonstrates the software, I notice that there are grayed-out areas, regions with no detail. “Those are areas of which we have no knowledge. It's because when the image was taken from this point of view it was obstructed.” So previously, a rock or rise in the ground might have blocked the Mastcam's view. “We couldn't see what was behind the obstruction. That's something you have to take into consideration—do you want to drive in that area, with no knowledge of it, or do you go around? The answer is usually to go around unless you can use data from other sensors.”
Once again, safety first.
So how is all this data and software actually used? In general, they want the rover to drive itself autonomously when it's safe for it to do so. Between the long delays of commands going up and results coming back down and the limited communication windows with the relay orbiters, real-time joystick driving would be horribly slow. Better to tell it to just drive on its own, based on parameters and goals entered into the RSVP software.
“We can tell the rover to do autonomous driving. That's how we drive long distances. We typically do what's called a blind drive up to forty or sixty meters, depending on how flat the area is. Then we will tack on an extra forty or so meters. Say we are trying to do a one-hundred-meter drive for the day. The first forty might be in autonomous mode, where I'm telling it to go here”—he points to a rock on the map—“but I don't care how you get there. We've already programmed in things that are considered hazards, for instance, what's considered a rock to avoid and things of that nature. The downside is it takes longer for the rover to do that because it will take a short step, say, a meter, take more images, process them on board, and build an internal map. And all this occurs while it is stopped.” The rover, once it has received instructions and a basic map from JPL, needs to refresh that map as it moves. So the sequence is: drive, stop, shoot pictures, insert images into the software, interpret them, then verify the course ahead or generate a new one before driving farther. Rinse and repeat. And it's amazingly smart software for the small and comparatively slow computer Curiosity carries on board. “If you put Curiosity in a maze, it can actually find its way out by using this algorithm called D-STAR. It knows how not to get stuck in a cul-de-sac. That's how we can let it drive beyond the areas where we have high confidence. It often knows better than we do.” I could have used this thing's brain the last time I parked at Disneyland.
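The cycle he describes, plan a step, drive it, stop, image, rebuild the map, and plan again, can be caricatured in a few lines of Python. The grid, the hazard layout, and the breadth-first search below are toy stand-ins of my own; the flight software uses real stereo-derived maps and a far more capable planner in the D* family, which can cheaply repair its plan when new images change the map.

    from collections import deque

    # Toy hazard map: 0 = safe, 1 = a rock or slope flagged as a hazard.
    GRID = [
        [0, 0, 0, 0, 0],
        [0, 1, 1, 0, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 0, 0, 0],
    ]

    def plan_path(grid, start, goal):
        """Breadth-first search over the hazard grid: a toy stand-in for the
        rover's D*-family planner."""
        frontier = deque([start])
        came_from = {start: None}
        while frontier:
            cell = frontier.popleft()
            if cell == goal:
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = came_from[cell]
                return list(reversed(path))
            r, c = cell
            for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if (0 <= step[0] < len(grid) and 0 <= step[1] < len(grid[0])
                        and grid[step[0]][step[1]] == 0 and step not in came_from):
                    came_from[step] = cell
                    frontier.append(step)
        return None   # boxed in: stop and wait for the humans

    # One autonav cycle per loop pass: re-plan, then take a single short step.
    # (On Mars, each pass would also include imaging and rebuilding the map.)
    position, goal = (0, 0), (3, 4)
    while position != goal:
        path = plan_path(GRID, position, goal)
        if path is None or len(path) < 2:
            break
        position = path[1]
        print("moved to", position)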