Attention-based network for low-light image enhancement

Date

2020

Authors

Zhang, C.
Yan, Q.
Zhu, Y.
Li, X.
Sun, J.
Zhang, Y.

Type

Conference paper

Citation

Proceedings / IEEE International Conference on Multimedia and Expo. IEEE International Conference on Multimedia and Expo, 2020, vol.2020-July, pp.1-6

Statement of Responsibility

Cheng Zhang, Qingsen Yan, Yu Zhu, Xianjun Li, Jinqiu Sun, Yanning Zhang

Conference Name

IEEE International Conference on Multimedia and Expo (ICME) (6 Jul 2020 - 10 Jul 2020 : virtual online)

Abstract

Images captured under low-light conditions often suffer from insufficient brightness and severe noise, making low-light image enhancement a key challenge in computer vision. A variety of methods have been proposed for this task, but they often fail in extreme low-light environments and amplify the noise underlying the input image. To address this difficult problem, this paper presents a novel attention-based neural network that generates high-quality enhanced images from raw sensor data. Specifically, we first employ an attention strategy (i.e., spatial attention and channel attention modules) to suppress undesired chromatic aberration and noise. The spatial attention module focuses on denoising by exploiting the non-local correlation in the image, while the channel attention module guides the network to refine redundant colour features. Furthermore, we propose a new pooling layer, called the inverted shuffle layer, which adaptively selects useful information from the preceding features. Extensive experiments demonstrate the superiority of the proposed network in suppressing chromatic aberration and noise artifacts during enhancement, especially when the low-light image is severely noisy.
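The channel attention described in the abstract can be illustrated with a minimal sketch. This is an assumption of a squeeze-and-excitation style gating (global average pooling followed by a small bottleneck and a sigmoid) rather than the paper's exact module; the function and weight shapes below are hypothetical and for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat, w1, w2):
    """Squeeze-and-excitation style channel attention (illustrative sketch).

    feat: (C, H, W) feature map.
    w1:   (C // r, C) bottleneck weights (r = reduction ratio).
    w2:   (C, C // r) expansion weights.
    Returns the feature map rescaled by learned per-channel weights in (0, 1),
    which lets the network down-weight redundant colour channels.
    """
    squeeze = feat.mean(axis=(1, 2))           # global average pool -> (C,)
    hidden = np.maximum(w1 @ squeeze, 0.0)     # ReLU bottleneck
    scale = sigmoid(w2 @ hidden)               # per-channel gate in (0, 1)
    return feat * scale[:, None, None]         # broadcast over H and W

# Tiny example with random weights (in a real network these are learned).
rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 4, 4))          # 8 channels, 4x4 spatial
w1 = rng.standard_normal((2, 8)) * 0.1         # reduction ratio r = 4
w2 = rng.standard_normal((8, 2)) * 0.1
out = channel_attention(feat, w1, w2)
```

Because the gate is a sigmoid, each channel is multiplied by a factor strictly between 0 and 1, so the module can only attenuate (never amplify) channel responses.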

Rights

© 2020 IEEE.
