ProTracker: Probabilistic Integration for Robust and Accurate Point Tracking
January 6, 2025
Authors: Tingyang Zhang, Chen Wang, Zhiyang Dou, Qingzhe Gao, Jiahui Lei, Baoquan Chen, Lingjie Liu
cs.AI
Abstract
In this paper, we propose ProTracker, a novel framework for robust and
accurate long-term dense tracking of arbitrary points in videos. The key idea
of our method is incorporating probabilistic integration to refine multiple
predictions from both optical flow and semantic features for robust short-term
and long-term tracking. Specifically, we integrate optical flow estimations in
a probabilistic manner, producing smooth and accurate trajectories by
maximizing the likelihood of each prediction. To effectively re-localize
challenging points that disappear and reappear due to occlusion, we further
incorporate long-term feature correspondence into our flow predictions for
continuous trajectory generation. Extensive experiments show that ProTracker
achieves state-of-the-art performance among unsupervised and
self-supervised approaches, and even outperforms supervised methods on several
benchmarks. Our code and model will be publicly available upon publication.
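The abstract does not spell out how multiple predictions are combined. As a rough illustration of the kind of probabilistic integration it describes, the minimal sketch below fuses several flow-based estimates of the same point under an assumed independent Gaussian noise model, where the maximum-likelihood estimate reduces to an inverse-variance weighted mean. The function name, interface, and noise model are assumptions for illustration, not the authors' actual implementation.

```python
import numpy as np

def fuse_predictions(positions, variances):
    """Fuse multiple 2D predictions of one tracked point (illustrative only).

    positions: (N, 2) array of predicted (x, y) locations for the same point
    variances: (N,) array of per-prediction variances (uncertainty)

    Assuming each prediction is the true location plus independent isotropic
    Gaussian noise, the maximum-likelihood estimate is the inverse-variance
    weighted mean of the predictions.
    """
    positions = np.asarray(positions, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances                                  # precision of each prediction
    fused = (weights[:, None] * positions).sum(axis=0) / weights.sum()
    fused_var = 1.0 / weights.sum()                            # variance of the fused estimate
    return fused, fused_var

# Example: three hypothetical flow-based predictions of the same point
preds = [(102.3, 57.9), (101.8, 58.4), (103.0, 57.5)]
sigmas2 = [1.0, 0.5, 2.0]
point, var = fuse_predictions(preds, sigmas2)
print(point, var)
```

Under this assumed model, more confident predictions (smaller variance) pull the fused trajectory toward themselves, which is one simple way likelihood maximization can yield smoother, more accurate tracks than any single flow estimate.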