We met Kevin Gill, a software engineer at NASA’s Jet Propulsion Laboratory in Pasadena, California. In addition to working on data visualization and analysis projects, Gill has a great passion for computer-generated astronomical art and planetary imaging. His splendid creations can be seen on his Flickr page.
How was your passion for astronomical art born? And how do you manage to create such realistic and beautiful photos of Mars?
I have loved the space sciences and art since I was a small child. As a kid, one of my wishes was to become a professional astronomer. Life didn’t turn out that way: I went into the military and then into software engineering. As I was completing my graduate studies in computer science, I wrote a software renderer for digital terrain modeling as my thesis project. A colleague of mine suggested I use that software to generate Mars visualizations. From then on, I have been moving more and more toward a planetary science and astrophysics focus in how I apply my engineering skill set. As I learned more about planetary imaging, I was able to leverage that engineering knowledge to do more than the ‘straight through Photoshop’ route allows, by building my own pipelines and calibration routines. And the more I work with the data, the more I learn or discover new ways of working with it by bringing in tools that aren’t generally used together, such as GIS and 3D production packages.
Can your very realistic and detailed artistic creations help engineers and scientists plan the next missions to Mars?
The images I create, specifically those built from HiRISE digital altimetry data, can be, and are, used to provide a human-readable, high-level visualization of the local terrain, which can broaden a person’s scientific understanding of the viability of a landed mission in that location.
Regarding the heights of the mountains, the relief, and the colors of the Martian landscape, how do you know the true dimensions and colors?
The dimensions are encoded in the data, though I tend to exaggerate the relief somewhat for the sake of the visualization. Color is more subjective since I use grayscale imagery. I will reference the RGB portion of the orbital imagery where it exists, but I will also make comparisons with ground imagery, such as that from the Curiosity rover’s Mastcam or Perseverance’s Mastcam-Z. I also need the color to be realistic to the human eye, so I will often just look out the window and use the local scenery for color inspiration, minus the visible flora, of course.
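The relief exaggeration Gill mentions can be sketched as scaling elevations about a reference height. This is a minimal illustration of the general technique, not his actual pipeline; the function name and the exaggeration factor here are hypothetical.

```python
import numpy as np

def exaggerate_relief(dem, factor=3.0):
    """Vertically exaggerate a digital elevation model.

    dem:    2D array of elevations (e.g., a patch of a HiRISE DTM).
    factor: vertical exaggeration multiplier (hypothetical choice).

    Elevations are scaled about the terrain's mean height, so the
    average level stays put while peaks and valleys are stretched.
    """
    mean = dem.mean()
    return mean + (dem - mean) * factor

# Tiny synthetic terrain patch (meters), mean elevation 13.0
dem = np.array([[10.0, 12.0],
                [14.0, 16.0]])
print(exaggerate_relief(dem, factor=2.0))  # → [[ 7. 11.] [15. 19.]]
```

Scaling about the mean (rather than simply multiplying the raw heights) keeps the rendered scene at the same overall altitude while making the terrain read more dramatically, which is the usual reason visualizations exaggerate relief.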
You are a software engineer at NASA’s Jet Propulsion Laboratory. What does your work consist of?
I am a software and spaceflight engineer at JPL. Initially, I was brought on to work on various Earth science and climate projects, such as the NASA Sea Level Change Portal’s Data Analysis Tool. More recently, I have been tasked with working on the engineering cameras on the Curiosity rover and with serving as a downlink data analyst for Curiosity and InSight. On occasion during my time at JPL, I have also been tapped to provide image processing for various outreach projects, such as Cassini’s Grand Finale, Mariner 10 Venus imagery, Voyager 1’s Pale Blue Dot Revisited, etc.
What place in the solar system has excited you the most, and which would you like to photograph in person?
That’s very hard to say, and it strongly depends on when I’m asked, based on whatever data I’m currently working on. As much as I would love to feel the crunch of Martian rocks below my feet or explore the dunes and seas of Titan, the one thing I have the hardest time visualizing in my head is being in an atmosphere with no bottom. That would be to explore Jupiter, or any gas giant, and get that full 360° look around, go between the cloud decks, and see the interactions of the belts and zones.
- Cover image credits: NASA/JPL-Caltech (picture on the right); NASA/JPL-Caltech/MSSS (picture on the left); Kevin Gill (picture in the center)