Galloway’s survey team had a unique opportunity to put its drone capabilities to work collecting data for a simulated lunar landing. Just before the stay-at-home orders, Galloway’s team, along with engineers from Advance Solutions, Inc. (ASI), flew to remote Hanksville, UT, to collect real-time video of terrain representative of the lunar surface.
This video and telemetry data will be used to help validate the vision navigation system in a lunar lander simulator being developed for NASA’s NextSTEP Human Landing System (HLS) Descent/Ascent Element Study.
Galloway deployed a DJI Phantom 4 Pro drone with the Litchi app, which allowed the team to program computer-controlled flight paths, camera gimbal settings, and video start and stop points. Airdata UAV was used for post-flight processing and to extract the flight telemetry so that ASI could sync its time stamps to the video footage.
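The time-stamp sync is conceptually simple: each video frame is matched to the telemetry sample logged closest to that frame’s capture time. A minimal sketch of the idea in Python (the sample rate, frame rate, and start offset below are illustrative assumptions, not the project’s actual values):

```python
from bisect import bisect_left

def nearest_sample(telemetry_ms, frame_ms):
    """Return the index of the telemetry timestamp closest to frame_ms."""
    i = bisect_left(telemetry_ms, frame_ms)
    if i == 0:
        return 0
    if i == len(telemetry_ms):
        return len(telemetry_ms) - 1
    before, after = telemetry_ms[i - 1], telemetry_ms[i]
    # Pick whichever neighboring sample is closer in time.
    return i if after - frame_ms < frame_ms - before else i - 1

def sync_frames(telemetry_ms, video_start_ms, fps, n_frames):
    """Map each video frame to the index of its nearest telemetry sample."""
    return [nearest_sample(telemetry_ms, video_start_ms + round(f * 1000 / fps))
            for f in range(n_frames)]

# Hypothetical example: telemetry logged at 10 Hz from t = 0 ms,
# video starting 250 ms later at 30 fps.
telemetry = list(range(0, 5000, 100))
mapping = sync_frames(telemetry, video_start_ms=250, fps=30, n_frames=4)
```

In practice the same lookup would run over the telemetry rows exported by Airdata UAV, keyed on the shared wall-clock timestamps.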
“The vision processing algorithms in the simulator rely on computer-generated video of the lunar surface based on actual satellite imagery. Independent validation of these algorithms using actual video of representative lunar terrain is extremely important in the acceptance testing of the simulator,” according to Jeff Szmyd, the lead software designer for the simulator. “Being able to utilize the drone expertise of Galloway was crucial in generating this video in a timely manner so that ASI could process it and deliver the results to our customer.”
The flights were choreographed in the early afternoon and then recorded at dusk to make the most of a short window of low sun angles producing the long shadows representative of the landing area on the lunar surface.
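That low-sun window can be estimated from the standard solar-elevation relation, sin(h) = sin(lat)·sin(dec) + cos(lat)·cos(dec)·cos(HA). A short illustrative sketch (Hanksville sits near 38.4° N; the declination and hour-angle values are assumptions for illustration only, not the team’s actual planning numbers):

```python
import math

def solar_elevation_deg(lat_deg, decl_deg, hour_angle_deg):
    """Approximate solar elevation (degrees) from latitude, solar
    declination, and hour angle, all in degrees."""
    lat, dec, ha = map(math.radians, (lat_deg, decl_deg, hour_angle_deg))
    s = (math.sin(lat) * math.sin(dec)
         + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(s))

# At solar noon (hour angle 0) the sun is highest; by a hour angle of
# ~80 degrees (late in the day) it has dropped into the low-angle,
# long-shadow regime the team was after.
noon = solar_elevation_deg(38.4, 0.0, 0.0)
dusk = solar_elevation_deg(38.4, 0.0, 80.0)
```

Sweeping the hour angle through the afternoon gives a quick feel for how long the usable low-angle window actually is at a given site and date.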
Significant planning went into devising an approach that addressed all of the project’s parameters, and the project was a great opportunity for Galloway’s drone team to expand its capabilities. “It was a really fun and satisfying project,” said Perry Bassett, Galloway survey technology manager.