Neural Polynomial Gabor Fields for Macro Motion Analysis
ICLR 2024

(* equal contribution)

Inferring a low-dimensional, interpretable representation of macro motion in a dynamic scene.

Abstract

We study macro motion analysis, where macro motion refers to the collection of all visually observable motions in a dynamic scene. Traditional filtering-based methods for motion analysis typically focus only on local and tiny motions, and fail to represent large motions or 3D scenes. Recent dynamic neural representations can faithfully represent motions using correspondences, but they cannot be directly used for motion analysis. In this work, we propose phase-based neural polynomial Gabor fields (Phase-PGF), which learn to represent scene dynamics with low-dimensional time-varying phases. We theoretically show that Phase-PGF has several properties suitable for macro motion analysis. In our experiments, we collect diverse 2D and 3D dynamic scenes and show that Phase-PGF enables dynamic scene analysis and editing tasks, including motion loop detection, motion factorization, motion smoothing, and motion magnification.

Method

A phase generator yields time-varying phases that modulate the Gabor basis in Phase-PGF, which produces a latent feature field for volume rendering. Please refer to Sec. 3.3 of the paper for more details.
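Below is a minimal sketch of this pipeline, assuming a small MLP phase generator and a sinusoidal Gabor basis with a linear (rather than polynomial) argument. The layer sizes, Gabor parameterization, and feature head are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of a phase-modulated Gabor field (not the authors' code).
    # Layer sizes, the Gabor parameterization, and the feature head are
    # illustrative assumptions; see Sec. 3.3 of the paper for the real design.
    import torch
    import torch.nn as nn

    class PhaseGaborField(nn.Module):
        def __init__(self, num_basis=64, phase_dim=8, feat_dim=32, in_dim=3):
            super().__init__()
            # Phase generator: time t -> low-dimensional time-varying phases.
            self.phase_gen = nn.Sequential(
                nn.Linear(1, 64), nn.ReLU(),
                nn.Linear(64, phase_dim),
            )
            # Per-basis Gabor parameters (frequencies and Gaussian widths).
            self.freqs = nn.Parameter(torch.randn(num_basis, in_dim))
            self.log_sigma = nn.Parameter(torch.zeros(num_basis))
            # Maps the low-dim phase vector to one phase shift per basis function.
            self.phase_proj = nn.Linear(phase_dim, num_basis)
            # Latent feature head consumed by a downstream volume renderer.
            self.feat_head = nn.Linear(num_basis, feat_dim)

        def forward(self, x, t):
            # x: (N, in_dim) spatial coordinates, t: (N, 1) time stamps.
            phase = self.phase_gen(t)                    # (N, phase_dim)
            shift = self.phase_proj(phase)               # (N, num_basis)
            arg = x @ self.freqs.T + shift               # phase-modulated argument
            envelope = torch.exp(-(x ** 2).sum(-1, keepdim=True)
                                 * torch.exp(self.log_sigma))  # Gaussian envelope
            gabor = envelope * torch.sin(arg)            # Gabor-like basis responses
            return self.feat_head(gabor)                 # latent features for rendering

    # Usage: features = PhaseGaborField()(torch.rand(1024, 3), torch.rand(1024, 1))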


Motion Analysis


Jelly

Input Video
Subband Discovered
Phase Reconstructed

Bouncing

Input Video
Subband Discovered
Phase Reconstructed

Collision

Input Video
Ground Truth Motion
Phase Reconstructed
(Note that the reconstructed phase aligns with the ground-truth motion, although both are shown with unnormalized magnitudes.)

3 Balls

Input Video
Ground Truth Motion
Phase Reconstructed
(Note that the reconstructed phase aligns with the ground-truth motion, although both are shown with unnormalized magnitudes.)

Projectile

Input Video
Ground Truth Motion
Phase Reconstructed
(Note that the reconstructed phase aligns with the ground-truth motion, although both are shown with unnormalized magnitudes.)

Damping

Input Video
Ground Truth Motion
Phase Reconstructed
(Note that the reconstructed phase aligns with the ground-truth motion, although both are shown with unnormalized magnitudes.)
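The captions above note that the reconstructed phase matches the ground-truth motion only up to scale. A scale-invariant comparison such as Pearson correlation can make this alignment concrete; the sketch below is an illustrative check with a synthetic damped oscillation as a stand-in, not the paper's evaluation protocol.

    # Minimal sketch of a scale-invariant comparison between a reconstructed
    # phase trajectory and ground-truth motion, since both are unnormalized.
    # Illustrative only; not the paper's evaluation protocol.
    import numpy as np

    def normalized_correlation(phase, motion):
        """Pearson correlation between two 1D trajectories; invariant to
        per-signal offset and scale, so unnormalized magnitudes do not matter."""
        phase = (phase - phase.mean()) / (phase.std() + 1e-8)
        motion = (motion - motion.mean()) / (motion.std() + 1e-8)
        return float((phase * motion).mean())

    # Synthetic damped oscillation as a stand-in for the Damping scene.
    t = np.linspace(0, 4, 200)
    motion = np.exp(-0.5 * t) * np.sin(2 * np.pi * t)   # ground-truth motion
    phase = 3.7 * motion + 0.2                           # phase with a different scale/offset
    print(normalized_correlation(phase, motion))         # ~1.0 -> well aligned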

Qualitative Comparison


Side-by-side video comparisons on five scenes: Ours vs. Phase-NIVR.

Comparison to Dense Tracking Baseline

Dense Point Tracking from Baseline [Zheng et al. 2023]

Phase Generated by the Proposed Method


Motion Magnification

Ours
Phase-NIVR
Wadhwa et al. 2013

Comparison of motion intensity adjustment between the proposed method and the baselines. Top: input video; bottom: modified video.
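One way to read motion intensity adjustment is as a rescaling of the learned time-varying phases around a reference, in the spirit of phase-based magnification (Wadhwa et al. 2013). The sketch below assumes the phases are available as a (T, D) array; the function name and the render call mentioned in the comment are hypothetical placeholders, not APIs from the paper.

    # Hedged sketch: motion intensity adjustment by rescaling the time-varying
    # phases around a reference. Illustrative only; the renderer is a
    # hypothetical placeholder, not part of the paper's released code.
    import numpy as np

    def adjust_motion_intensity(phases, alpha, ref=None):
        """Scale phase deviations by alpha (>1 magnifies, <1 attenuates).

        phases: (T, D) time-varying phase vectors learned by the model.
        ref:    (D,) reference phase; defaults to the temporal mean.
        """
        ref = phases.mean(axis=0) if ref is None else ref
        return ref + alpha * (phases - ref)

    phases = np.random.randn(120, 8) * 0.1          # stand-in for learned phases
    magnified = adjust_motion_intensity(phases, alpha=4.0)
    # The modified phases would then drive the field to re-render the video,
    # e.g. frames = render(field, magnified)   # hypothetical renderer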

Motion Manipulation


Input Video
Motion Smoothing

Input Video
Motion Intensity Adjustment

Input Video
Motion Separation and Adjustment
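A plausible reading of motion smoothing and separation is as temporal filtering and masking of the phase trajectories. The sketch below assumes the phases form a (T, D) array; the Gaussian filter, the masking scheme, and the function names are illustrative assumptions rather than the paper's implementation.

    # Hedged sketch: motion smoothing and separation by editing the learned
    # phase trajectories over time. Filter choice, cutoff, and masking scheme
    # are illustrative assumptions.
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    def smooth_phases(phases, sigma=3.0):
        """Gaussian-smooth each phase dimension over time.
        phases: (T, D) time-varying phase vectors; sigma is in frames."""
        return gaussian_filter1d(phases, sigma=sigma, axis=0)

    def separate_phases(phases, dims):
        """Keep only the phase dimensions in `dims`, freezing the rest at their
        temporal mean, so one discovered motion component can be edited alone."""
        mask = np.zeros(phases.shape[1], dtype=bool)
        mask[list(dims)] = True
        return np.where(mask, phases, phases.mean(axis=0))

    phases = np.cumsum(np.random.randn(240, 8) * 0.05, axis=0)  # stand-in phases
    smoothed = smooth_phases(phases, sigma=5.0)
    only_first_component = separate_phases(phases, dims=[0, 1])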

Failure Cases

Input Video
Subband
Phase
Input Video
Subband
Phase

Citation

The website template was borrowed from Michaël Gharbi and Jon Barron.