A deep learning approach for optical autonomous planetary relative terrain navigation

Tanner Campbell, Roberto Furfaro, Richard Linares, David Gaylor

Research output: Conference contribution

Abstract

Autonomous relative terrain navigation is a problem at the forefront of many space missions involving close-proximity operations, and it has no definitive solution. Many techniques address the problem with both passive and active sensors, but almost all require sophisticated dynamical models. Convolutional Neural Networks (CNNs) trained on images rendered from a digital terrain map (DTM) offer a way to sidestep unknown or complex dynamics while still providing reliable autonomous navigation, by mapping an image directly to a position. The portability of trained CNNs allows offline training to yield a mature network that can be loaded onto a spacecraft for real-time position acquisition.
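The abstract's core idea — a CNN that regresses spacecraft position directly from a terrain image — can be sketched in miniature. The toy network below is not the authors' architecture; the layer shapes, filter counts, and weights are illustrative assumptions, and the "image" stands in for a patch rendered from a DTM.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def forward(img, kernels, W_head, b_head):
    """Conv -> ReLU -> global average pool per filter -> linear head.

    Returns a 3-vector interpreted as relative position (x, y, z).
    """
    feats = np.array([relu(conv2d(img, k)).mean() for k in kernels])
    return W_head @ feats + b_head

# Hypothetical setup: a 16x16 rendered terrain patch, four 3x3 filters,
# and an untrained linear head mapping the pooled features to position.
img = rng.standard_normal((16, 16))
kernels = [rng.standard_normal((3, 3)) for _ in range(4)]
W_head = rng.standard_normal((3, 4))
b_head = np.zeros(3)

position = forward(img, kernels, W_head, b_head)
print(position.shape)  # (3,)
```

In the paper's scheme, the weights would be fit offline against many DTM-rendered views with known camera positions; only the trained forward pass would run onboard, which is what makes the approach attractive for flight hardware.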

Language: English (US)
Title of host publication: Spaceflight Mechanics 2017
Publisher: Univelt Inc.
Pages: 3293-3302
Number of pages: 10
Volume: 160
ISBN (Print): 9780877036371
State: Published - 2017
Event: 27th AAS/AIAA Space Flight Mechanics Meeting, 2017 - San Antonio, United States
Duration: Feb 5 2017 - Feb 9 2017



ASJC Scopus subject areas

  • Aerospace Engineering
  • Space and Planetary Science

Cite this

Campbell, T., Furfaro, R., Linares, R., & Gaylor, D. (2017). A deep learning approach for optical autonomous planetary relative terrain navigation. In Spaceflight Mechanics 2017 (Vol. 160, pp. 3293-3302). Univelt Inc.

