Asymmetric contextual modulation for infrared small target detection

Yimian Dai, Yiquan Wu, Fei Zhou, Kobus Barnard

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Scopus citations

Abstract

Single-frame infrared small target detection remains a challenge, not only due to the scarcity of intrinsic target characteristics but also because of the lack of a public dataset. In this paper, we first contribute an open dataset with high-quality annotations to advance research in this field. We also propose an asymmetric contextual modulation module specially designed for detecting infrared small targets. To better highlight small targets, in addition to a top-down global contextual feedback pathway, we supplement a bottom-up modulation pathway based on point-wise channel attention, exchanging high-level semantics and subtle low-level details. We report ablation studies and comparisons to state-of-the-art methods, where we find that our approach performs significantly better. Our dataset and code are available online.
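The abstract describes two complementary pathways: a top-down path that uses global context from high-level features to gate low-level features, and a bottom-up path that uses point-wise (per-position) channel attention computed from low-level details to gate high-level semantics. As a rough illustration only, the sketch below mimics that asymmetry with NumPy; the function names, the sigmoid gating, and the plain element-wise weights are assumptions for illustration, whereas the paper's actual module uses learned convolutional attention layers inside a deep network.

```python
import numpy as np

def sigmoid(x):
    # numerically plain logistic gate used for both pathways in this sketch
    return 1.0 / (1.0 + np.exp(-x))

def top_down_modulation(low, high):
    # global contextual feedback: global average pooling over the spatial
    # dims of the high-level map yields one weight per channel, which then
    # gates the low-level features everywhere (broadcast over H, W)
    w = sigmoid(high.mean(axis=(1, 2), keepdims=True))  # shape (C, 1, 1)
    return low * w

def bottom_up_modulation(low, high):
    # point-wise channel attention: a separate weight per channel AND per
    # spatial position, derived from low-level details, gates the
    # high-level semantic features
    w = sigmoid(low)  # shape (C, H, W)
    return high * w

def acm_fuse(low, high):
    # asymmetric contextual modulation: combine the two gated pathways
    # (here by summation; both inputs assumed to share shape (C, H, W))
    return top_down_modulation(low, high) + bottom_up_modulation(low, high)
```

The asymmetry is in the spatial granularity of the attention: the top-down weights are global (one scalar per channel), while the bottom-up weights are point-wise, which is what lets subtle small-target details survive into the fused representation.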

Original language: English (US)
Title of host publication: Proceedings - 2021 IEEE Winter Conference on Applications of Computer Vision, WACV 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 949-958
Number of pages: 10
ISBN (Electronic): 9780738142661
DOIs
State: Published - Jan 2021
Event: 2021 IEEE Winter Conference on Applications of Computer Vision, WACV 2021 - Virtual, Online, United States
Duration: Jan 5 2021 - Jan 9 2021

Publication series

Name: Proceedings - 2021 IEEE Winter Conference on Applications of Computer Vision, WACV 2021

Conference

Conference: 2021 IEEE Winter Conference on Applications of Computer Vision, WACV 2021
Country/Territory: United States
City: Virtual, Online
Period: 1/5/21 - 1/9/21

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Computer Science Applications
