Space Object classification using deep Convolutional Neural Networks

Richard Linares, Roberto Furfaro

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

Tracking and characterizing both active and inactive Space Objects (SOs) is required for protecting space assets. Characterizing and classifying space debris is critical to understanding the threat it may pose to active satellites and manned missions. This work examines SO classification using brightness measurements derived from electro-optical sensors. The classification approach discussed in this work is data-driven in that it learns from data examples how to extract features and classify SOs. The classification approach is based on a deep Convolutional Neural Network (CNN) approach in which a layered hierarchical architecture is used to extract features from brightness measurements. Training samples are generated from physics-based models that account for the rotational dynamics and light reflection properties of SOs. The number of parameters involved in modeling SO brightness measurements makes traditional estimation approaches computationally expensive. This work shows that the CNN approach can efficiently solve the classification problem for this complex physical dynamical system. The performance of these strategies for SO classification is demonstrated via simulated scenarios.
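The pipeline the abstract describes, layered feature extraction from a brightness (light-curve) time series, can be illustrated with a small stdlib-only sketch. Everything below is an illustrative assumption rather than the authors' model: the toy "glint" signal, the hand-set edge kernel, and the layer sizes merely show how one convolution + ReLU + pooling layer responds differently to fast- and slow-rotating objects.

```python
# Stdlib-only sketch of one 1-D CNN-style layer applied to a toy light curve.
# All signal models and kernel values are illustrative assumptions.
import math

def light_curve(period, n=64):
    """Toy brightness signal: a rotating facet producing periodic glints
    (an assumption for illustration, not the paper's reflection model)."""
    return [abs(math.sin(2 * math.pi * t / period)) for t in range(n)]

def conv1d(signal, kernel):
    """Valid-mode 1-D convolution (cross-correlation, as CNNs compute it)."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def relu(xs):
    """Rectified linear activation: keep only positive responses."""
    return [max(0.0, x) for x in xs]

def max_pool(xs, width=2):
    """Non-overlapping max pooling: downsample while keeping peak responses."""
    return [max(xs[i:i + width]) for i in range(0, len(xs) - width + 1, width)]

# One convolutional "layer": a rising-edge kernel fires on sharp glints,
# ReLU discards falling edges, pooling downsamples the feature map.
kernel = [-1.0, 0.0, 1.0]
features = max_pool(relu(conv1d(light_curve(period=16), kernel)))

# A fast-spinning object (short period) produces more glint responses than
# a slow one, so even this single hand-set layer separates the two classes;
# a trained CNN learns such kernels from the physics-based training samples.
fast = sum(max_pool(relu(conv1d(light_curve(period=8), kernel))))
slow = sum(max_pool(relu(conv1d(light_curve(period=32), kernel))))
```

In a real CNN the kernels are learned by gradient descent and several such layers are stacked before a classifier head; the point here is only that convolutional features of a brightness curve carry rotational-state information.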

Original language: English (US)
Title of host publication: FUSION 2016 - 19th International Conference on Information Fusion, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1140-1146
Number of pages: 7
ISBN (Electronic): 9780996452748
State: Published - Aug 1 2016
Event: 19th International Conference on Information Fusion, FUSION 2016 - Heidelberg, Germany
Duration: Jul 5 2016 - Jul 8 2016

Other

Other: 19th International Conference on Information Fusion, FUSION 2016
Country: Germany
City: Heidelberg
Period: 7/5/16 - 7/8/16

Fingerprint

Object classification
Neural networks
Brightness
Light reflection
Space debris
Optical sensors
Training samples
Dynamical systems
Data-driven
Classification problems
Physics
Satellites
Scenarios

ASJC Scopus subject areas

  • Statistics, Probability and Uncertainty
  • Computer Science Applications
  • Computer Vision and Pattern Recognition
  • Signal Processing

Cite this

Linares, R., & Furfaro, R. (2016). Space Object classification using deep Convolutional Neural Networks. In FUSION 2016 - 19th International Conference on Information Fusion, Proceedings (pp. 1140-1146). [7528013] Institute of Electrical and Electronics Engineers Inc.

