Image fusion for uninhabited airborne vehicles
Date
2002
Authors
Jasiunas, M.D.
Kearney, D.A.
Hopf, J.P.
Wigley, G.B.
Type
Conference paper
Citation
2002 IEEE International Conference on Field-Programmable Technology (FPT), 2002, pp.348-351
Conference Name
2002 IEEE International Conference on Field-Programmable Technology (FPT) (16 Dec 2002 - 18 Dec 2002 : Hong Kong, China)
Abstract
In image fusion, information from a set of images is extracted and then combined intelligently to form a new composite image with extended information content. The original data may come from different viewing conditions (bracketed focus or exposure) or from different sensors (visible and infrared, or a CT scan and magnetic resonance imaging). Uninhabited Airborne Vehicles (UAVs) often carry visible, infrared and synthetic aperture radar imaging sensors, so image fusion is an appropriate onboard processing task for UAVs. Some forms of image fusion are computationally intensive but, like many other image processing applications, are naturally suited to acceleration in hardware. This potential for hardware acceleration, together with the ability to reconfigure the UAV to implement new algorithms as it moves towards objects of interest, makes reconfigurable computing a natural route to a hardware implementation. In this paper we present what we believe is the first implementation of image fusion on a reconfigurable platform alone, and the first investigation of adaptive image fusion, which uses dynamic reconfiguration to change the fusion algorithm as the UAV approaches an object of interest.
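The pixel-level combination the abstract describes can be illustrated with a minimal sketch. This is not the paper's algorithm; it is a hypothetical example of two elementary fusion rules (weighted average and pixel-wise maximum) applied to two co-registered toy "sensor" images, using NumPy:

```python
import numpy as np

def fuse_average(a, b, w=0.5):
    """Weighted-average fusion: blend two co-registered images pixel by pixel."""
    return w * a.astype(np.float64) + (1.0 - w) * b.astype(np.float64)

def fuse_max(a, b):
    """Maximum-selection fusion: keep the stronger response from either sensor."""
    return np.maximum(a, b)

# Toy 2x2 intensity images standing in for visible and infrared sensors
vis = np.array([[10, 200], [30, 40]], dtype=np.uint8)
ir  = np.array([[100, 50], [60, 20]], dtype=np.uint8)

avg = fuse_average(vis, ir)  # [[55.0, 125.0], [45.0, 30.0]]
mx  = fuse_max(vis, ir)      # [[100, 200], [60, 40]]
```

Swapping between rules like these at runtime, by reloading the FPGA configuration, is the kind of algorithm change that dynamic reconfiguration enables.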
Rights
Copyright IEEE 2002