Group Downsampling with Equivariant Anti-aliasing
April 24, 2025
Authors: Md Ashiqur Rahman, Raymond A. Yeh
cs.AI
Abstract
Downsampling layers are crucial building blocks in CNN architectures, which
help to increase the receptive field for learning high-level features and
reduce the amount of memory/computation in the model. In this work, we study
the generalization of the uniform downsampling layer for group equivariant
architectures, e.g., G-CNNs. That is, we aim to downsample signals (feature
maps) on general finite groups with anti-aliasing. This involves the following:
(a) Given a finite group and a downsampling rate, we present an algorithm to
form a suitable choice of subgroup. (b) Given a group and a subgroup, we study
the notion of bandlimited-ness and propose how to perform anti-aliasing.
Notably, our method generalizes the notion of downsampling based on classical
sampling theory. When the signal is on a cyclic group, i.e., periodic, our
method recovers the standard downsampling of an ideal low-pass filter followed
by a subsampling operation. Finally, we conducted experiments on image
classification tasks demonstrating that the proposed downsampling operation
improves accuracy, better preserves equivariance, and reduces model size when
incorporated into G-equivariant networks.
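
For the cyclic-group case mentioned above, the "ideal low-pass filter followed by subsampling" pipeline that the proposed method recovers can be written out explicitly. Below is a minimal sketch of that classical operation, assuming a 1-D periodic signal and NumPy; the function name `downsample_cyclic` and its interface are illustrative, not the paper's API.

```python
import numpy as np

def downsample_cyclic(x, rate):
    """Anti-aliased downsampling of a signal on the cyclic group Z_N
    (a periodic 1-D signal): apply an ideal low-pass filter in the
    Fourier domain, then subsample onto the subgroup {0, rate, 2*rate, ...}.

    This sketches only the classical special case the abstract refers to;
    the name and interface are illustrative, not taken from the paper.
    Assumes `rate` divides len(x).
    """
    N = len(x)
    assert N % rate == 0, "rate must divide the signal length"
    M = N // rate  # order of the coarser group Z_M we subsample onto

    # Ideal low-pass: zero out every DFT coefficient with |frequency| >= M/2,
    # i.e., keep only frequencies representable on the coarser group Z_M.
    X = np.fft.fft(x)
    freqs = np.fft.fftfreq(N) * N          # integer frequencies -N/2 .. N/2-1
    X_lowpass = np.where(np.abs(freqs) < M / 2, X, 0.0)
    x_lowpass = np.fft.ifft(X_lowpass).real

    # Subsample: restrict the filtered signal to the index-`rate` subgroup.
    return x_lowpass[::rate]

# Usage: a periodic signal of length 12 downsampled by 3 gives 4 samples.
x = np.cos(2 * np.pi * np.arange(12) / 12)
print(downsample_cyclic(x, 3))
```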