
Why does your video of moving objects always look broken?

Based on research by Jae Won Jang, Yeonjin Chang, Wonsik Shin, Juhwan Cho, Nojun Kwak

Existing methods struggle to reconstruct smooth geometry for dynamic scenes: older systems get stuck fitting the data to flat surfaces, which causes severe distortions when parts of an object are occluded or out of view. A new approach called 4DGS360 addresses this with a specialized 3D tracker that anchors unstable points in space using reliable data from visible regions, suppressing the drift that typically degrades reconstruction quality. The authors evaluated the method on a new benchmark in which test cameras are placed up to 135 degrees away from the training views, simulating extreme viewpoint changes under which prior tools break down. They report that the method outperforms current state-of-the-art models on major datasets while handling occlusions faster and more accurately than previous systems.
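To make the anchoring idea concrete, here is a minimal sketch of one way "stabilize occluded points using reliable visible ones" could work: points that are not visible in the current frame inherit the average motion of their nearest visible neighbors instead of their own (unreliable) tracked position. This is a hypothetical illustration of the concept, not the paper's actual tracker; the function name, the nearest-neighbor scheme, and the parameter `k` are all assumptions for the sake of the example.

```python
import numpy as np

def anchor_unstable_points(prev_pts, curr_pts, visible, k=3):
    """Hypothetical sketch of anchoring, NOT the 4DGS360 algorithm.

    prev_pts, curr_pts: (N, 3) arrays of tracked 3D point positions
                        in consecutive frames.
    visible:            (N,) boolean mask; True where the track is
                        reliable (point visible in the current frame).
    k:                  number of visible neighbors to borrow motion from.
    """
    motion = curr_pts - prev_pts                 # per-point displacement
    out = curr_pts.copy()
    vis_idx = np.flatnonzero(visible)
    for i in np.flatnonzero(~visible):
        # Find the k nearest *visible* neighbors in the previous frame
        # and replace the unreliable displacement with their mean motion.
        dists = np.linalg.norm(prev_pts[vis_idx] - prev_pts[i], axis=1)
        nearest = vis_idx[np.argsort(dists)[:k]]
        out[i] = prev_pts[i] + motion[nearest].mean(axis=0)
    return out
```

For example, if every visible point moves by one unit along x but an occluded point's own track jumps to a garbage position, the occluded point is snapped back to its previous position plus the neighbors' shared one-unit displacement. A real system would do this over many frames and weight neighbors by distance, but the sketch captures the core intuition of the post: unreliable points are anchored to motion estimated from reliable, visible regions.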

4DGS360: 360° Gaussian Reconstruction of Dynamic Objects from a Single Video, Jae Won Jang et al., https://arxiv.org/abs/2603.21618

Source: arXiv:2603.21618

This post was generated by staik AI based on the academic publication above.