Please use this identifier to cite or link to this item: http://hdl.handle.net/2440/108535
Type: Conference paper
Title: Hyperspectral Compressive Sensing Using Manifold-Structured Sparsity Prior
Author: Zhang, L.
Wei, W.
Zhang, Y.
Li, F.
Shen, C.
Shi, Q.
Citation: Proceedings of the IEEE International Conference on Computer Vision (ICCV 2015), pp. 3550-3558
Publisher: IEEE
Issue Date: 2015
Series/Report no.: IEEE International Conference on Computer Vision
ISBN: 9781467383912
ISSN: 1550-5499
Conference Name: IEEE International Conference on Computer Vision (11 Dec 2015 - 18 Dec 2015 : Santiago, Chile)
Statement of Responsibility: Lei Zhang, Wei Wei, Yanning Zhang, Fei Li, Chunhua Shen, Qinfeng Shi
Abstract: To reconstruct a hyperspectral image (HSI) accurately from a few noisy compressive measurements, we present a novel hyperspectral compressive sensing (HCS) method based on a manifold-structured sparsity prior. A matrix-based hierarchical prior is first proposed to represent the spectral structured sparsity and the unknown spatial manifold structure of the HSI simultaneously. Then, a latent-variable Bayesian model is introduced to learn the sparsity prior and estimate the noise jointly from the measurements. The learned prior can fully represent the inherent 3D structure of the HSI and adapt its shape to the estimated noise level. With this learned prior, the proposed method improves reconstruction accuracy significantly and shows strong robustness to unknown noise in HCS. Experiments on four real hyperspectral datasets show that the proposed method outperforms several state-of-the-art methods in HSI reconstruction accuracy.
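For readers unfamiliar with the compressive-sensing setting the abstract assumes, the following is a minimal generic sketch: a sparse signal is recovered from a few noisy linear measurements y = Φx + n by minimizing a least-squares data term plus an l1 sparsity penalty (ISTA). This is only an illustration of the baseline problem; it is NOT the paper's manifold-structured hierarchical prior or its latent-variable Bayesian model, and all names and parameter values here are assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm (element-wise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_recover(y, Phi, lam=0.01, n_iter=500):
    """Recover a sparse x from noisy measurements y = Phi @ x + n via ISTA.

    Minimizes 0.5 * ||Phi x - y||^2 + lam * ||x||_1 by alternating a
    gradient step on the data term with soft-thresholding (the l1 prox).
    """
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2   # 1/L with L the Lipschitz constant
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ x - y)            # gradient of the quadratic data term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Synthetic example: a 5-sparse signal of length 100 measured 40 times.
rng = np.random.default_rng(0)
n, m, k = 100, 40, 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random measurement matrix
y = Phi @ x_true + 0.01 * rng.standard_normal(m)
x_hat = ista_recover(y, Phi)
```

The paper's contribution, per the abstract, is to replace the fixed l1 penalty above with a learned matrix-based hierarchical prior that also captures the spatial manifold structure of the HSI, with the noise level estimated jointly rather than fixed in advance.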
Rights: © 2015 IEEE
RMID: 0030054788
DOI: 10.1109/ICCV.2015.405
Appears in Collections:Computer Science publications

Files in This Item:
File: RA_hdl_108535.pdf
Description: Restricted Access
Size: 555.66 kB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.