Dr. John Christian is an engineer, researcher, and educator with expertise in spacecraft navigation, computer vision, sensor testing, and space systems. He is presently an Assistant Professor in the Department of Mechanical, Aerospace, and Nuclear Engineering (MANE) at Rensselaer Polytechnic Institute (RPI). He is also the director of RPI’s Sensing, Estimation, and Automation Laboratory (SEAL). Prior to his position at RPI, Dr. Christian was on the faculty at West Virginia University (WVU). Before joining academia, he worked as an engineer in the Guidance, Navigation, and Control (GNC) Autonomous Flight Systems Branch at the NASA Johnson Space Center (JSC).
Dr. Christian is an internationally known expert in image-based spacecraft navigation. His algorithms for camera calibration and optical navigation will be used on the upcoming NASA Orion Exploration Mission-1. He was also one of the primary NASA analysts for the Sensor Test for Orion RelNav Risk Mitigation (STORRM) flight test of the Orion docking camera and LIDAR that flew aboard STS-134. Beyond these specific programs, he also has substantial experience with navigation system design, computer vision algorithms, large-scale ground test programs, Inertial Measurement Units (IMUs), and space systems analysis. He has been part of multiple NASA Engineering and Safety Center (NESC) independent assessment teams and is currently a member of the NESC GNC Technical Discipline Team.
Dr. Christian is the recipient of many awards, including an AFOSR Young Investigator Program award and the 2015 WVU Statler College New Researcher of the Year. He is an AIAA Associate Fellow and is active in numerous professional societies, including current roles as an associate editor of the AIAA Journal of Spacecraft and Rockets and as a member of the AAS Space Flight Mechanics Committee.
The University of Texas at Austin: Ph.D.
Georgia Institute of Technology: M.S.
Georgia Institute of Technology: B.S.
Media Appearances (1)
Exploring Our Solar System To Improve Life on Earth
Every Day Matters Blog online
Despite centuries of study, there’s still a lot we don’t know about the Earth. Part of the problem is that, until the beginning of the Space Age in the 1950s, all of our observations of Earth were from sensors on, or very near, the planet’s surface. The utility of space for gaining a better understanding of the natural world was apparent from the beginning, and many of the earliest spacecraft had mission objectives related to Earth or space science.
StarNAV: Autonomous Optical Navigation of a Spacecraft by the Relativistic Perturbation of Starlight
Sensors
Future space exploration missions require increased autonomy. This is especially true for navigation, where continued reliance on Earth-based resources is often a limiting factor in mission design and selection. In response to the need for autonomous navigation, this work introduces the StarNAV framework that may allow a spacecraft to autonomously navigate anywhere in the Solar System (or beyond) using only passive observations of naturally occurring starlight. Relativistic perturbations in the wavelength and direction of observed stars may be used to infer spacecraft velocity which, in turn, may be used for navigation. This work develops the mathematics governing such an approach and explores its efficacy for autonomous navigation. Measurement of stellar spectral shift due to the relativistic Doppler effect is found to be ineffective in practice. Instead, measurement of the change in inter-star angle due to stellar aberration appears to be the most promising technique for navigation by the relativistic perturbation of starlight.
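The stellar-aberration measurement described in this abstract can be illustrated with a short numerical sketch. The star directions and the 30 km/s observer velocity below are hypothetical, and the snippet uses the standard exact vector aberration formula rather than the paper's full estimation framework:

```python
import numpy as np

C = 299_792.458  # speed of light [km/s]

def aberrate(s, v):
    """Apparent unit direction to a star with true direction s, as seen by an
    observer moving at velocity v [km/s], via the exact relativistic
    aberration formula."""
    beta = np.asarray(v, dtype=float) / C
    gamma = 1.0 / np.sqrt(1.0 - beta @ beta)
    s = np.asarray(s, dtype=float)
    s = s / np.linalg.norm(s)
    num = s / gamma + beta + (gamma / (gamma + 1.0)) * (beta @ s) * beta
    return num / (1.0 + beta @ s)

def angle_between(a, b):
    """Angle [rad] between two unit vectors."""
    return np.arccos(np.clip(a @ b, -1.0, 1.0))

# Two hypothetical star directions at different angles from the velocity
s1 = np.array([1.0, 0.0, 0.0])                 # 90 deg from v
s2 = np.array([1.0, 0.0, 1.0]) / np.sqrt(2.0)  # 45 deg from v
v = np.array([0.0, 0.0, 30.0])                 # km/s, roughly Earth's orbital speed

# Change in inter-star angle induced by aberration -- the observable StarNAV
# exploits to infer velocity; a few arcseconds at 30 km/s
shift = angle_between(aberrate(s1, v), aberrate(s2, v)) - angle_between(s1, s2)
shift_arcsec = np.degrees(shift) * 3600.0
```

Because both apparent star positions are displaced toward the direction of motion by an amount that depends on their angle from the velocity vector, the inter-star angle changes by an amount proportional to v/c, which is what makes velocity observable from passive starlight measurements.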
Accurate Planetary Limb Localization for Image-Based Spacecraft Navigation
Journal of Spacecraft and Rockets
2017
The use of images for spacecraft navigation is well established. Although these images have traditionally been processed by a human analyst on Earth, a variety of recent advancements have led to an increased interest in autonomous image-based spacecraft navigation. This work presents a comprehensive treatment of the techniques required to navigate using the lit limb of an ellipsoidal body (usually a planet or moon) in an image...
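A core step in limb-based navigation of this kind is fitting a conic to the lit-limb pixels, since an ellipsoidal body projects to a conic in the image. The following is a toy sketch of that one step with synthetic limb points, using a simple linear least-squares conic fit; it is not the paper's full localization method:

```python
import numpy as np

def fit_conic(x, y):
    """Linear least-squares fit of a conic A x^2 + B xy + C y^2 + D x + E y = 1
    to limb pixel coordinates (a simplified sketch of conic fitting)."""
    M = np.column_stack([x**2, x * y, y**2, x, y])
    coeffs, *_ = np.linalg.lstsq(M, np.ones_like(x), rcond=None)
    return coeffs  # (A, B, C, D, E)

def conic_center(A, B, C, D, E):
    """Center of the fitted conic, where the conic's gradient vanishes."""
    return np.linalg.solve([[2.0 * A, B], [B, 2.0 * C]], [-D, -E])

# Synthetic lit limb: an arc of an ellipse centered at (320, 240) pixels,
# with only part of the limb visible (the sunlit portion)
t = np.linspace(0.2, 2.0, 50)
x = 320.0 + 100.0 * np.cos(t)
y = 240.0 + 60.0 * np.sin(t)

center = conic_center(*fit_conic(x, y))  # should recover (320, 240)
```

Recovering the conic from only a partial arc is exactly the situation a navigation camera faces, since terminator geometry hides part of the limb; real limb points would also carry pixel noise that this sketch omits.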
A Concise Guide to Feature Histograms with Applications to LIDAR-Based Spacecraft Relative Navigation
The Journal of the Astronautical Sciences
Rhodes, A., Christian, J.A., and Evans, T.
2017
With the availability and popularity of 3D sensors, it is advantageous to re-examine the use of point cloud descriptors for the purpose of pose estimation and spacecraft relative navigation. One popular descriptor is the oriented unique repeatable clustered viewpoint feature histogram (OUR-CVFH), which is most often utilized in personal and industrial robotics to simultaneously recognize and navigate relative to an object. Recent research into using the OUR-CVFH descriptor for spacecraft navigation has produced favorable results...
Pattern Design for 3D Point Matching
NAVIGATION: Journal of the Institute of Navigation
Robinson, S.B., and Christian, J.A.
2015
This paper presents an approach for designing 3D point patterns which are uniquely distinguishable in the absence of an a priori pose estimate. The principles for designing 3D point patterns are presented. Simple example patterns with analytic solutions are used to illustrate the approach...
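One way to make a point pattern distinguishable without a prior pose estimate is to ensure its pairwise inter-point distances are all distinct, since distances are invariant to rotation and translation. The toy check below illustrates that criterion only; it is an assumed simplification, not the design procedure developed in the paper:

```python
import numpy as np
from itertools import combinations

def pairwise_distances(points):
    """Sorted list of all inter-point distances in a 3D point pattern."""
    return sorted(np.linalg.norm(p - q) for p, q in combinations(points, 2))

def has_distinct_distances(points, tol=1e-6):
    """True if every pairwise distance is unique (a toy distinguishability
    criterion: correspondences can then be found from distances alone)."""
    d = pairwise_distances(points)
    return all(b - a > tol for a, b in zip(d, d[1:]))

# A pattern with all-distinct distances vs. a regular tetrahedron corner,
# whose repeated distances make its points ambiguous without a pose prior
good = np.array([[0.0, 0, 0], [1, 0, 0], [0, 2, 0], [0, 0, 4]])
bad = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
```

With distinct distances, matching observed points to the known pattern reduces to matching distance values, which is the kind of pose-free identification the abstract describes.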