
Mars up close: NASA takes the biggest interplanetary selfie ever

How did they do it? You can't just whip out an iPhone or a Galaxy S20 on Mars and upload it to Instagram.
Written by Jason Perlow, Senior Contributing Writer

This week, NASA's Jet Propulsion Laboratory (JPL) released a spectacular 1.8-billion-pixel panorama of the surface of Mars, captured by the Curiosity rover. The final image -- a composite of more than 1,200 separate exposures taken over four days -- is so dense that features 20 miles away can be seen in detail when you zoom in.

To date, no image of this detail has ever been captured by a probe on the surface of another planet. How did they do it? Obviously, on the clay-bearing, cratered surface of Mars, more than 156 million miles from Earth, the Curiosity rover can't just pull out an iPhone 11 or a Galaxy S20, sweep its robotic arm through a panorama shot, and upload it to Instagram over 4G LTE or even Wi-Fi. It's a bit more complicated. And more restrictive.

Bandwidth is restricted 150 million miles away

First, let's talk bandwidth -- because without that, you aren't getting the pictures back to Earth. 

The Curiosity rover transmits data about four times per Martian day, or "Sol," which is about 40 minutes longer than a day on Earth. It is assisted by four spacecraft orbiting the Red Planet -- Mars Odyssey, the Mars Reconnaissance Orbiter (MRO), MAVEN, and the ExoMars Trace Gas Orbiter (TGO), which was launched as a joint mission of the European Space Agency and Roscosmos. Each of these orbiters has its own distinct mission, but they can all act as data relays for Curiosity.

The total amount of data that can be relayed through all four satellites is only about 150 megabytes per Sol -- yes, megabytes. That's the equivalent of roughly a dozen medium-quality 12-megapixel photos stored on your iPhone or Android phone, and an effective data rate far lower than any smartphone plan you have.

The actual transmissions between the rover and the orbiters are quite short, just 10 to 30 minutes for each of the four overflights -- they need to be quick because the orbiters move across the sky extremely fast, at over two miles per second, or 7,200 mph. Data rates from the rover up to the orbiters vary from a few tens of kilobits per second to over 3 megabits per second. Data rates from the orbiters back to Earth range from a few hundred kilobits per second to 6 megabits per second, depending on the distance between Earth and Mars and the specifics of the radios on each orbiter.
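If you want to put those link rates in perspective, a little napkin math helps. The pass length I'm assuming below is just a mid-range guess from the figures above, not mission telemetry:

```python
# Napkin math on the relay link, using the figures quoted above.
# Assumptions: the full ~150 MB/sol rides on the four passes, and each
# pass lasts about 20 minutes (mid-range of the 10-30 minutes quoted).

SOL_BUDGET_MB = 150
PASSES_PER_SOL = 4
PASS_MINUTES = 20

per_pass_mb = SOL_BUDGET_MB / PASSES_PER_SOL               # ~37.5 MB per pass
avg_kbps = per_pass_mb * 1e6 * 8 / (PASS_MINUTES * 60) / 1e3

print(f"{per_pass_mb:.1f} MB per pass, ~{avg_kbps:.0f} kbit/s on average")
# -> 37.5 MB per pass, ~250 kbit/s -- near the low end of the quoted range.
```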

Because Curiosity and the four orbiters all have their own primary, regular missions, the photographic data had to be transferred using whatever bandwidth was left over after each day's ongoing science observations had been transmitted. As a result, it took a month to downlink the entire payload of photographs from Mars.
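To see why a month is plausible, here's a rough, purely illustrative calculation. The payload size and the share of bandwidth left over for the panorama are my assumptions, not NASA figures:

```python
# Illustrative only: how long a multi-gigabyte image payload takes when it
# can only ride on leftover relay capacity. The payload size and leftover
# share below are assumptions for the sake of the arithmetic.

SOL_BUDGET_MB = 150        # quoted total relay capacity per sol
LEFTOVER_SHARE = 0.5       # assumed fraction left after the daily science data
PAYLOAD_GB = 2.0           # assumed size of the compressed panorama frames

leftover_mb_per_sol = SOL_BUDGET_MB * LEFTOVER_SHARE
sols_needed = PAYLOAD_GB * 1000 / leftover_mb_per_sol

print(f"~{sols_needed:.0f} sols")   # ~27 sols, i.e. roughly a month
```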

Putting the image together

The panorama itself, which was assembled and processed by Malin Space Science Systems (MSSS), is composed of two different mosaics, shot by the two Mastcam cameras mounted on the rover's main mast, which function as the probe's right and left "eyes." A total of 1,139 pictures were taken with the M-100 (100mm) camera and 1,189 with the M-34 (34mm) camera. Roughly a quarter of each mosaic was acquired per day, in stacks of horizontal tiers, and the starting time was offset each day so that adjacent sections could be matched. MSSS's Tex Kubacki elaborates:

I can add that the four pieces were acquired from top to bottom, each covering roughly 90 degrees of azimuth. They were 239, 262, 349, and 289 frames. The variability in number was due to rover tilt, surrounding topography, and avoidance of hardware. We tried to take them all fairly close to solar noon so that changes in sun position would not make a very big difference to scene illumination, and the four sections could be stitched together seamlessly.

After processing all of those photos and stitching them together (discarding overlapping information), the M-100 mosaic is 88,797 pixels wide by 22,958 pixels high, and the M-34 mosaic is 29,164 pixels wide by 8,592 pixels high. The M-100 mosaic alone is equivalent to about 23 4K screens wide by 10 4K screens high: you would need a massive video wall to see the entire thing at full resolution.
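Those numbers are easy to sanity-check with a few lines of arithmetic; the 4K comparison assumes a standard 3,840 x 2,160 panel:

```python
# Sanity-checking the mosaic figures quoted above.

m100_w, m100_h = 88_797, 22_958
m34_w, m34_h = 29_164, 8_592

print(m100_w * m100_h)                # 2,038,601,526 pixels (~2 gigapixels)
print(m34_w * m34_h)                  # 250,577,088 pixels

# A 4K UHD panel is 3,840 x 2,160 pixels.
print(m100_w / 3840, m100_h / 2160)   # ~23.1 screens wide, ~10.6 screens high

# The four M-100 sections Kubacki lists also add up to the frame total:
print(239 + 262 + 349 + 289)          # 1,139 frames
```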

But the raw data isn't sent back as-is; because of the limited bandwidth to the orbiting spacecraft, it must be compressed and optimized before transmission. Each 1200 x 1600 image is captured in greyscale as raw, uncompressed data and then compressed from 11 bits per pixel down to 8 bits using a companding table to reduce the file sizes.

Color comes from a Bayer color filter array over the detector, and additional data compression and manipulation happen before and after transmission. The 8-bit raw images are compressed losslessly by about a factor of 2 (to roughly 4 bits per pixel). Once decompressed back to 8 bits at MSSS, each pixel is then expanded back up to 11 bits (stored in a 16-bit word).
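"Companding" here just means a nonlinear lookup table that squeezes the camera's 11-bit values into 8 bits and stretches them back out on the ground. The square-root-style curve in the sketch below is a common choice for photon-noise-limited sensor data, but it is only an illustration -- the actual Mastcam tables are mission-specific:

```python
import numpy as np

# Illustrative companding round trip: 11-bit sensor values squeezed into
# 8 bits onboard, then expanded back into 11-bit values held in 16-bit
# words on the ground. The square-root curve is a stand-in, not the
# actual flight table.

def encode_lut():
    """11-bit -> 8-bit companding table (2048 entries)."""
    dn = np.arange(2048, dtype=np.float64)
    return np.round(np.sqrt(dn) * 255 / np.sqrt(2047)).astype(np.uint8)

def decode_lut():
    """Approximate inverse: 8-bit code -> 11-bit value in a 16-bit word."""
    code = np.arange(256, dtype=np.float64)
    return np.round((code * np.sqrt(2047) / 255) ** 2).astype(np.uint16)

raw_11bit = np.random.randint(0, 2048, size=(1200, 1600), dtype=np.uint16)
sent_8bit = encode_lut()[raw_11bit]        # what the rover transmits
restored = decode_lut()[sent_8bit]         # what MSSS reconstructs

print(sent_8bit.dtype, restored.dtype)     # uint8, uint16
print(np.abs(restored.astype(int) - raw_11bit.astype(int)).max())
```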

A lot of additional work goes into processing every single image before those mosaics are made. I will leave it to Michael Malin, founder and CEO of MSSS, to explain in detail:

Each image is radiometrically processed including correction for dark current (electrons from thermal effects), shutter smear (the electronic shutter is not completely opaque, so some small amount of light exposure is accumulated while the image is being shifted out of the detector), non-uniformity of the response of the detector and lens (called flat fielded, so that an image of a flat gray surface would be flat and gray), and bad pixels (from radiation effects from cosmic rays, and from particles from the rover RTG, blemishes on the detector, etc.).  The images are then "demosaiced" or "de-Bayered" by interpolation into 3 16-bit color channels (R, G, and B) so that each pixel is 48-bits deep. Color correction is then applied to adjust for variations in the sensitivity of the detector beneath each Bayer filter and is optionally further color balanced to create colors as they would look on Earth (for geological comparison). The images are also contrast-enhanced, as Mars is a very low contrast target. All of this is done on the 48 bit (6-byte) data. The images are then geometrically processed to remove lens distortions, and the rover position and remote sensing mast pointing used as seed values to register the images.
 
So the M-100 mosaic was 2.0386E9 pixels x 6 bytes = 12.2316E9 bytes = 97.8529E9 bits, and the M-34 mosaic was 250.577E6 pixels x 6 bytes = 1.50356E9 bytes = 12.0277E9 bits.
 
Eventually, these mosaics were converted to 24-bit color representations (8 bits per color channel): M-100 = 6.1158E9 bytes; M-34 = 751.731E6 bytes -- the M-34 is about 8 times smaller.
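If you want a rough feel for what Malin's description looks like in practice, here is a compressed sketch of that kind of radiometric pipeline. The arrays, the crude demosaic, and the simple percentile contrast stretch are stand-ins of my own -- the real MSSS pipeline uses mission-calibrated data at every step, plus the shutter-smear, color-balance, and geometric corrections omitted here:

```python
import numpy as np

def demosaic_rggb(bayer):
    """Very crude demosaic: each 2x2 RGGB cell yields one RGB value that is
    replicated over the cell. Real pipelines interpolate per pixel."""
    r = bayer[0::2, 0::2]
    g = (bayer[0::2, 1::2] + bayer[1::2, 0::2]) / 2.0
    b = bayer[1::2, 1::2]
    rgb_small = np.stack([r, g, b], axis=-1)
    return np.repeat(np.repeat(rgb_small, 2, axis=0), 2, axis=1)

def calibrate_frame(raw, dark, flat, bad_pixel_mask):
    """One Mastcam-style frame: dark subtraction, flat fielding, bad-pixel
    patching, demosaicing, and a contrast stretch. Shutter-smear, color
    balance, and lens-distortion corrections are omitted from this sketch."""
    img = raw.astype(np.float64) - dark          # remove thermal dark current
    img /= np.where(flat > 0, flat, 1.0)         # flat-field correction
    # Patch bad pixels (cosmic-ray hits, RTG particles, detector blemishes)
    # with the median of their immediate neighborhood.
    for r, c in zip(*np.nonzero(bad_pixel_mask)):
        img[r, c] = np.median(img[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2])
    rgb = demosaic_rggb(img)                     # three color channels per pixel
    # Mars is a very low-contrast target: stretch the middle of the histogram.
    lo, hi = np.percentile(rgb, [1, 99])
    return np.clip((rgb - lo) / (hi - lo), 0.0, 1.0)

# Toy demo with synthetic stand-in data; real calibration products are
# measured for the actual flight cameras.
rng = np.random.default_rng(0)
raw = rng.uniform(50, 1800, size=(1200, 1600))
dark = np.full((1200, 1600), 40.0)
flat = np.ones((1200, 1600))
bad = np.zeros((1200, 1600), dtype=bool)
bad[100, 200] = True
print(calibrate_frame(raw, dark, flat, bad).shape)   # (1200, 1600, 3)
```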

Suffice it to say, this is a bit more complicated than taking a smartphone panorama. JPL must have high-end imaging workstations to make short work of all this, right? Nope.
 
Much of the image processing relied on systems and custom software developed at MSSS. To make the actual mosaics, Malin reprocessed the data on his mid-2010 MacBook Pro running Mac OS X 10.6.8, with only 8GB of RAM, storing the images and intermediate products on an external 30TB FireWire G-Speed Q drive array.
 
His vintage Mac choked on the task, particularly when it came to outputting the huge project files -- both the TIFF and PNG graphics formats have file-size limitations he had to work around, and Photoshop doesn't like working with very large files either. GDAL (the Geospatial Data Abstraction Library), an open-source imaging library with Python bindings that Malin rebuilt for the task, is what got the mosaics into a format for the final product, one frame at a time.
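GDAL's Python bindings make the general shape of that workflow easy to illustrate, though the snippet below is only my guess at the approach, not Malin's actual scripts: build a lightweight "virtual" mosaic that references the individual frames, then write it out as a tiled BigTIFF so the 4GB ceiling of classic TIFF never comes into play:

```python
from osgeo import gdal
import glob

# A guess at the general shape of the workflow, not Malin's actual scripts.
# "calibrated_frames/" is a hypothetical directory of already-processed tiles
# whose embedded coordinates place them correctly relative to one another.
frames = sorted(glob.glob("calibrated_frames/*.tif"))

# A VRT is a lightweight XML "virtual mosaic" that only references the frames
# instead of copying their pixels, so it costs almost nothing to build.
vrt = gdal.BuildVRT("panorama.vrt", frames)
vrt.FlushCache()

# Materialize the mosaic as a tiled BigTIFF: BIGTIFF=YES sidesteps the 4GB
# ceiling of classic TIFF, and tiling lets viewers read it piece by piece.
gdal.Translate(
    "panorama.tif",
    "panorama.vrt",
    creationOptions=["TILED=YES", "COMPRESS=LZW", "BIGTIFF=YES"],
)
```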
 
And that is how you make one heck of an interplanetary panoramic probe selfie.
