Polynomial Composition Activations: Unleashing the Dynamics of Large Language Models
November 6, 2024
Authors: Zhijian Zhuo, Ya Wang, Yutao Zeng, Xiaoqing Li, Xun Zhou, Jinwen Ma
cs.AI
Abstract
Transformers have found extensive applications across various domains due to
their powerful fitting capabilities. This success can be partially attributed to
their inherent nonlinearity. Thus, in addition to the ReLU function employed in
the original transformer architecture, researchers have explored alternative
modules such as GeLU and SwishGLU to enhance nonlinearity and thereby augment
representational capacity. In this paper, we propose a novel category of
polynomial composition activations (PolyCom), designed to optimize the dynamics
of transformers. Theoretically, we provide a comprehensive mathematical
analysis of PolyCom, highlighting its enhanced expressivity and efficacy
relative to other activation functions. Notably, we demonstrate that networks
incorporating PolyCom achieve the optimal approximation rate,
indicating that PolyCom networks require minimal parameters to approximate
general smooth functions in Sobolev spaces. We conduct empirical experiments on
the pre-training configurations of large language models (LLMs), including both
dense and sparse architectures. By substituting conventional activation
functions with PolyCom, we enable LLMs to capture higher-order interactions
within the data, thus improving performance metrics in terms of accuracy and
convergence rates. Extensive experimental results demonstrate the effectiveness
of our method, showing substantial improvements over other activation
functions. Code is available at https://github.com/BryceZhuo/PolyCom.
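The abstract describes PolyCom as composing a polynomial with an activation function to capture higher-order interactions. As a minimal sketch only (the function name, base activation, and fixed coefficients below are illustrative assumptions; in the paper the form is defined precisely and coefficients are trained alongside the network), one such composition might look like:

```python
import numpy as np

def relu(x):
    """Standard ReLU base nonlinearity."""
    return np.maximum(x, 0.0)

def poly_composition_activation(x, coeffs=(0.0, 1.0, 0.5, 0.25)):
    """Hypothetical PolyCom-style activation: a polynomial applied to a
    base activation, sum_i a_i * ReLU(x)**i.

    `coeffs` gives (a_0, a_1, a_2, a_3); the values here are illustrative
    placeholders, not the paper's learned parameters.
    """
    r = relu(x)
    # Accumulate polynomial terms; higher powers inject extra nonlinearity
    # beyond the piecewise-linear ReLU.
    return sum(a * r ** i for i, a in enumerate(coeffs))

# Example: for x = 1, ReLU(x) = 1, so the output is the coefficient sum.
out = poly_composition_activation(np.array([-2.0, 0.0, 1.0]))
```

With the placeholder coefficients above, negative inputs map to `a_0` (since ReLU zeroes them out) and positive inputs pass through the polynomial, which is the kind of higher-order response the abstract attributes to PolyCom.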