Automatic remote-sensing images registration by matching close-regions

Date

2003

Authors

Xie, G.
Shen, H.

Type

Conference paper

Citation

Parallel and distributed processing and applications : international symposium, ISPA 2003, Aizu-Wakamatsu, Japan, July 2-4, 2003 : proceedings / Minyi Guo, Laurence Tianruo Yang (eds.), pp. 316-328

Statement of Responsibility

Gui Xie, Hong Shen

Conference Name

ISPA 2003 (2003 : Aizuwakamatsu-shi, Japan)

Abstract

Remote-sensing image registration is a fundamental task in image processing, concerned with establishing correspondence between two or more images taken, for example, at different times, from different sensors, or from different viewpoints. Because such remote-sensing images differ in their gray-level characteristics, it is difficult to match them automatically; registration is usually either constrained to particular categories of images or performed manually. In this paper we develop a new algorithm for remote-sensing image registration that takes full advantage of the shape information of the close-regions bounded by contours, obtained by detecting and linking the edges in the images. Based on the shape-specific points of the close-regions, we match the close-regions by evaluating their matching degrees. From the matched pairs of close-regions, the geometric parameters for registration are computed, so the registration task can be performed automatically and accurately. The new algorithm works well for images in which contour information is well preserved, such as optical images from the LANDSAT and SPOT satellites. Experiments verify the algorithm and show that the running time of the sequential version depends strongly on the size of the input images, growing exponentially as the image size increases. We therefore extend the sequential algorithm to a distributed scheme that performs the registration task more efficiently.
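
To illustrate the final step of the pipeline described above, the sketch below shows how the geometric parameters of a 2-D similarity transform (scale, rotation, translation) can be estimated by least squares from matched pairs of shape-specific points, such as the centroids of matched close-regions. This is a minimal sketch under stated assumptions, not the authors' implementation: the paper's edge detection, edge linking, and matching-degree evaluation are not reproduced here, and the function estimate_similarity and the example point arrays are hypothetical names introduced for illustration only.

import numpy as np

def estimate_similarity(src, dst):
    """Least-squares fit of dst ~ s * R(theta) @ src + t.

    src, dst: (N, 2) arrays of matched shape-specific points
    (e.g. centroids of matched close-regions), N >= 2.
    Returns (scale s, rotation theta in radians, translation t).
    """
    # Linearize with a = s*cos(theta), b = s*sin(theta):
    #   x' = a*x - b*y + tx
    #   y' = b*x + a*y + ty
    n = src.shape[0]
    A = np.zeros((2 * n, 4))
    A[0::2, 0] = src[:, 0]   # coefficient of a in the x' rows
    A[0::2, 1] = -src[:, 1]  # coefficient of b in the x' rows
    A[0::2, 2] = 1.0         # tx column
    A[1::2, 0] = src[:, 1]   # coefficient of a in the y' rows
    A[1::2, 1] = src[:, 0]   # coefficient of b in the y' rows
    A[1::2, 3] = 1.0         # ty column
    rhs = dst.reshape(-1)    # interleaved [x'0, y'0, x'1, y'1, ...]
    (a, b, tx, ty), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a), np.array([tx, ty])

# Hypothetical matched centroids from two scenes of the same area.
src = np.array([[10.0, 20.0], [200.0, 40.0], [120.0, 180.0]])
theta_true = 0.1
R = np.array([[np.cos(theta_true), -np.sin(theta_true)],
              [np.sin(theta_true),  np.cos(theta_true)]])
dst = 1.5 * src @ R.T + np.array([5.0, -3.0])
s, theta, t = estimate_similarity(src, dst)  # ~1.5, ~0.1, ~[5, -3]

With more than two matched pairs the fit is overdetermined, so unreliable pairs must be filtered before estimation; in the paper, evaluating the matching degrees of the close-regions appears to serve this pair-selection role.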

Description

The original publication is available at www.springerlink.com
