Pulse-coupled neural network performance for real-time identification of vegetation during forced landing


Author(s): Warne, David James; Hayward, Ross; Kelson, Neil; Banks, Jasmine; Mejias, Luis
Date(s)

21/03/2014

Abstract

Safety concerns in the operation of autonomous aerial systems require that safe-landing protocols be followed when a mission must be aborted due to mechanical or other failure. This article presents a pulse-coupled neural network (PCNN) to assist with vegetation classification in a vision-based landing-site detection system for an unmanned aircraft. We propose a heterogeneous computing architecture and an OpenCL implementation of a PCNN feature generator. Its performance is compared across OpenCL kernels designed for CPU, GPU, and FPGA platforms. The comparison examines the compute time required for network convergence over a variety of images to determine the feasibility of real-time feature detection.
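
The abstract does not reproduce the network equations or the kernels themselves. As an illustration only, the sketch below shows one iteration of the standard simplified PCNN model written as an OpenCL C kernel; the kernel name, parameter names (alpha_F, beta, V_T, and so on), buffer layout, and the 3x3 linking neighbourhood are assumptions for illustration, not the authors' implementation.

/* Illustrative sketch: one iteration of the standard simplified PCNN model
 * as an OpenCL C kernel. Parameter names, buffers, and the 3x3 linking
 * neighbourhood are assumptions, not the paper's exact formulation. */
__kernel void pcnn_step(__global const float *S,      /* stimulus (pixel intensities)  */
                        __global float *F,            /* feeding compartment state     */
                        __global float *L,            /* linking compartment state     */
                        __global float *Theta,        /* dynamic threshold             */
                        __global const float *Y_prev, /* pulses from previous iteration*/
                        __global float *Y,            /* pulses for this iteration     */
                        const int width, const int height,
                        const float alpha_F, const float alpha_L, const float alpha_T,
                        const float V_F, const float V_L, const float V_T,
                        const float beta)
{
    const int x = get_global_id(0);
    const int y = get_global_id(1);
    if (x >= width || y >= height) return;
    const int idx = y * width + x;

    /* Sum pulses from the 3x3 neighbourhood, clamping at the image border. */
    float link = 0.0f;
    for (int dy = -1; dy <= 1; ++dy) {
        for (int dx = -1; dx <= 1; ++dx) {
            const int nx = clamp(x + dx, 0, width - 1);
            const int ny = clamp(y + dy, 0, height - 1);
            link += Y_prev[ny * width + nx];
        }
    }

    /* Leaky-integrator updates for the feeding and linking compartments. */
    const float f = exp(-alpha_F) * F[idx] + V_F * link + S[idx];
    const float l = exp(-alpha_L) * L[idx] + V_L * link;

    /* Internal activity and pulse generation against the dynamic threshold. */
    const float u = f * (1.0f + beta * l);
    const float pulse = (u > Theta[idx]) ? 1.0f : 0.0f;

    F[idx] = f;
    L[idx] = l;
    Y[idx] = pulse;
    Theta[idx] = exp(-alpha_T) * Theta[idx] + V_T * pulse;
}

A host loop would typically ping-pong the Y_prev and Y buffers between iterations and stop once the total pulse activity stabilises, which is the kind of convergence the abstract's timing comparison refers to.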

Identifier

http://eprints.qut.edu.au/69084/

Publisher

Cambridge University Press

Relation

http://journal.austms.org.au/ojs/index.php/ANZIAMJ/article/view/7851

Warne, David James, Hayward, Ross, Kelson, Neil, Banks, Jasmine, & Mejias, Luis (2014) Pulse-coupled neural network performance for real-time identification of vegetation during forced landing. ANZIAM Journal, 55, C1-C16.

Source

Australian Research Centre for Aerospace Automation; Division of Technology, Information and Learning Support; School of Electrical Engineering & Computer Science; High Performance Computing and Research Support; Science & Engineering Faculty

Keywords #080106 Image Processing #090105 Avionics #Unmanned Aerial Vehicle #Emergency Landing #Pulse Coupled Neural Network #Feature Classification #Field Programmable Gate Array #OpenCL
Type

Journal Article