Training Noise Token Pruning
November 27, 2024
Authors: Mingxing Rao, Bohan Jiang, Daniel Moyer
cs.AI
Abstract
In this work we present Training Noise Token (TNT) Pruning for vision transformers. Our method relaxes the discrete token-dropping condition to continuous additive noise, providing smooth optimization during training while retaining the computational gains of discrete token dropping in deployment settings. We provide theoretical connections to the rate-distortion literature, and empirical evaluations on the ImageNet dataset using ViT and DeiT architectures demonstrating TNT's advantages over previous pruning methods.
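The abstract describes the core train/deploy asymmetry only at a high level. Below is a minimal, hypothetical PyTorch sketch of that idea, not the paper's exact formulation: during training, low-importance tokens are perturbed with additive Gaussian noise (a continuous relaxation of dropping), while at deployment the same scores drive a hard drop. The names `noise_scale`, `keep_ratio`, and the token-scoring input are illustrative assumptions.

```python
import torch

def tnt_forward(tokens, scores, keep_ratio=0.5, noise_scale=1.0, training=True):
    """Sketch of noise-relaxed token pruning (assumed interface).

    tokens: (B, N, D) token embeddings.
    scores: (B, N) per-token importance scores; the scoring rule
            (e.g., attention-derived) is an assumption here.
    """
    B, N, D = tokens.shape
    n_keep = max(1, int(keep_ratio * N))
    keep_idx = scores.topk(n_keep, dim=1).indices  # indices of top-scoring tokens

    if training:
        # Training: keep all tokens, but add noise to the low-score ones.
        # This relaxes the hard drop into a smooth, differentiable perturbation.
        mask = torch.ones(B, N, 1, device=tokens.device)
        mask.scatter_(1, keep_idx.unsqueeze(-1), 0.0)  # zero noise on kept tokens
        return tokens + noise_scale * mask * torch.randn_like(tokens)

    # Deployment: discrete drop -- gather only the kept tokens,
    # recovering the computational savings of hard pruning.
    return tokens.gather(1, keep_idx.unsqueeze(-1).expand(-1, -1, D))
```

Under this reading, the noise magnitude plays the role of a distortion knob, which is where the connection to rate-distortion analysis would enter; the paper should be consulted for the actual noise schedule and scoring function.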