ParticleNeRF: A Particle-Based Encoding for Online Neural Radiance Fields

Date

2024

Authors

Abou-Chakra, J.
Dayoub, F.
Sünderhauf, N.

Type

Conference paper

Citation

Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV 2024), 2024, pp.5963-5972

Statement of Responsibility

Jad Abou-Chakra, Feras Dayoub, Niko Sünderhauf

Conference Name

IEEE/CVF Winter Conference on Applications of Computer Vision (WACV) (3 Jan 2024 - 8 Jan 2024 : Waikoloa, HI, USA)

Abstract

While existing Neural Radiance Fields (NeRFs) for dynamic scenes are offline methods with an emphasis on visual fidelity, our paper addresses the online use case that prioritises real-time adaptability. We present ParticleNeRF, a new approach that dynamically adapts to changes in the scene geometry by learning an up-to-date representation online, every 200 ms. ParticleNeRF achieves this using a novel particle-based parametric encoding. We couple features to particles in space and backpropagate the photometric reconstruction loss into the particles’ position gradients, which are then interpreted as velocity vectors. Governed by a lightweight physics system to handle collisions, this lets the features move freely with the changing scene geometry. We demonstrate ParticleNeRF on various dynamic scenes containing translating, rotating, articulated, and deformable objects. ParticleNeRF is the first online dynamic NeRF and achieves fast adaptability with better visual fidelity than brute-force online InstantNGP and other baseline approaches on dynamic scenes with online constraints. Videos of our system can be found at the project website https://sites.google.com/view/particlenerf.
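The core idea in the abstract, reading the loss gradient with respect to each particle's position as a velocity and resolving collisions with a lightweight physics pass, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name `particle_update`, the learning rate, time step, and the naive pairwise repulsion are all assumptions made for the example.

```python
import numpy as np

def particle_update(positions, grads, lr=0.1, dt=1.0, min_dist=0.05):
    """One hypothetical update step in the spirit of ParticleNeRF:
    the photometric loss gradient w.r.t. each particle position is
    interpreted as a (negative) velocity, and a simple pairwise
    repulsion stands in for the lightweight collision physics."""
    # Gradient descent direction interpreted as a velocity vector.
    velocities = -lr * grads
    positions = positions + dt * velocities

    # Toy collision handling: push apart pairs closer than min_dist.
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            delta = positions[j] - positions[i]
            dist = np.linalg.norm(delta)
            if 0.0 < dist < min_dist:
                push = 0.5 * (min_dist - dist) * delta / dist
                positions[i] -= push
                positions[j] += push
    return positions

# Toy example: two feature-carrying particles pulled toward the origin,
# using the gradient of the stand-in loss 0.5 * ||x||^2, which is x.
pos = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
grad = pos.copy()
new_pos = particle_update(pos, grad)
```

In the real system the gradients come from backpropagating the rendering loss through the radiance field into the particle positions, so each particle drifts toward configurations that better explain the current frames.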

Rights

©2024 IEEE
