Accurate tensor completion via adaptive low-rank representation
dc.contributor.author | Zhang, L.
dc.contributor.author | Wei, W.
dc.contributor.author | Shi, Q.
dc.contributor.author | Shen, C.
dc.contributor.author | van den Hengel, A.
dc.contributor.author | Zhang, Y.
dc.date.issued | 2020
dc.description | Date of publication December 30, 2019; date of current version October 6, 2020
dc.description.abstract | Low-rank representation-based approaches that assume low-rank tensors and exploit their low-rank structure with appropriate prior models have underpinned much of the recent progress in tensor completion. However, in most cases real tensor data only approximately satisfy the low-rank assumption, viz., the tensor consists of both low-rank (e.g., the principal part) and non-low-rank (e.g., details) structures, which limits the completion accuracy of these approaches. To address this problem, we propose an adaptive low-rank representation model for tensor completion that represents the low-rank and non-low-rank structures of a latent tensor separately in a Bayesian framework. Specifically, we reformulate the CANDECOMP/PARAFAC (CP) tensor rank and develop a sparsity-inducing prior for the low-rank structure that can be used to determine the tensor rank automatically. The non-low-rank structure is then modeled using a mixture of Gaussians prior that is shown to be sufficiently flexible and powerful to inform the completion process for a variety of real tensor data. With these two priors, we develop a Bayesian minimum mean-squared error estimation framework for inference. The developed framework captures the important distinction between low-rank and non-low-rank structures, thereby enabling more accurate modeling and, ultimately, more accurate completion. Across a variety of applications, the proposed model yields more accurate completion results than state-of-the-art methods.
dc.description.statementofresponsibility | Lei Zhang, Wei Wei, Qinfeng Shi, Chunhua Shen, Anton van den Hengel, and Yanning Zhang
dc.identifier.citation | IEEE Transactions on Neural Networks and Learning Systems, 2020; 31(10):4170-4184
dc.identifier.doi | 10.1109/tnnls.2019.2952427
dc.identifier.issn | 2162-237X
dc.identifier.issn | 2162-2388
dc.identifier.orcid | Shi, Q. [0000-0002-9126-2107]
dc.identifier.orcid | van den Hengel, A. [0000-0003-3027-8364]
dc.identifier.uri | http://hdl.handle.net/2440/130272
dc.language.iso | en
dc.publisher | Institute of Electrical and Electronics Engineers
dc.rights | © 2019 IEEE
dc.source.uri | https://doi.org/10.1109/tnnls.2019.2952427
dc.subject | Adaptive low-rank representation; automatic tensor rank determination; tensor completion
dc.title | Accurate tensor completion via adaptive low-rank representation
dc.type | Journal article
pubs.publication-status | Published |
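The abstract describes two priors combined in a Bayesian framework: a sparsity-inducing prior on the CP component weights that determines the tensor rank automatically, and a mixture-of-Gaussians prior on the non-low-rank residual, with inference by minimum mean-squared error estimation. As a rough illustration of the rank-determination idea only, the following Python sketch performs masked CP completion in which the component weights are updated by a lasso step so that unneeded components are driven exactly to zero; it replaces the paper's Bayesian sparsity prior with an L1 penalty and omits the mixture-of-Gaussians residual model entirely. All names and hyper-parameters (cp_complete, ridge_rows, R_max, gamma, mu, n_sweeps) are hypothetical and not taken from the paper.

import numpy as np

def ridge_rows(Tm, Mm, F1, F2, lam, mu):
    # Update one factor matrix: for every slice along axis 0 of the (transposed)
    # tensor Tm, regress its observed entries on the matching rows of the other
    # two factors, scaled by the component weights lam (ridge-regularised).
    n, R = Tm.shape[0], lam.size
    out = np.zeros((n, R))
    for i in range(n):
        jj, kk = np.nonzero(Mm[i])              # observed positions in slice i
        Z = lam * F1[jj] * F2[kk]               # (n_obs, R) design matrix
        t = Tm[i, jj, kk]
        out[i] = np.linalg.solve(Z.T @ Z + mu * np.eye(R), Z.T @ t)
    return out

def cp_complete(T, M, R_max=5, gamma=2.0, mu=1e-3, n_sweeps=30, seed=0):
    # Masked CP completion with an L1 (lasso) step on the component weights,
    # so superfluous components are set exactly to zero (rank selection).
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, R_max))
    B = rng.standard_normal((J, R_max))
    C = rng.standard_normal((K, R_max))
    lam = np.ones(R_max)
    obs = np.nonzero(M)                          # indices of all observed entries
    t_obs = T[obs]
    for _ in range(n_sweeps):
        # alternating ridge least-squares updates of the three factor matrices
        A = ridge_rows(T, M, B, C, lam, mu)
        B = ridge_rows(T.transpose(1, 0, 2), M.transpose(1, 0, 2), A, C, lam, mu)
        C = ridge_rows(T.transpose(2, 0, 1), M.transpose(2, 0, 1), A, B, lam, mu)
        # normalise factor columns and fold their scale into the weights lam
        for F in (A, B, C):
            nrm = np.linalg.norm(F, axis=0) + 1e-12
            F /= nrm
            lam = lam * nrm
        # lasso (coordinate descent) on lam: the soft threshold gamma zeroes
        # components that the observed data do not support
        V = A[obs[0]] * B[obs[1]] * C[obs[2]]    # (n_obs, R_max) design for lam
        for _ in range(5):
            for r in range(R_max):
                resid = t_obs - V @ lam + V[:, r] * lam[r]
                rho = V[:, r] @ resid
                denom = V[:, r] @ V[:, r] + 1e-12
                lam[r] = np.sign(rho) * max(abs(rho) - gamma, 0.0) / denom
    X_hat = np.einsum('r,ir,jr,kr->ijk', lam, A, B, C)
    return X_hat, lam

# Toy usage: recover a rank-2 tensor from 40% of its entries.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((15, 2)) for _ in range(3))
T_true = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
M = (rng.random(T_true.shape) < 0.4).astype(float)
X_hat, lam = cp_complete(T_true * M, M)
miss = 1.0 - M
rel_err = np.linalg.norm(miss * (X_hat - T_true)) / np.linalg.norm(miss * T_true)
print("relative error on missing entries:", round(float(rel_err), 3))
print("components kept (non-zero weights):", int(np.count_nonzero(lam)))

The number of non-zero entries in lam after the sweeps acts as the automatically selected rank; in the paper this role is played by the posterior over the reformulated CP rank rather than an L1 threshold, and the residual is further modeled by the mixture-of-Gaussians prior.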
Files
Original bundle
- Name: hdl_130272.pdf
- Size: 7.57 MB
- Format: Adobe Portable Document Format
- Description: Accepted version