Linear-MoE: Linear Sequence Modeling Meets Mixture-of-Experts
March 7, 2025
Authors: Weigao Sun, Disen Lan, Tong Zhu, Xiaoye Qu, Yu Cheng
cs.AI
Abstract
Linear Sequence Modeling (LSM) methods such as linear attention, state space models, and linear RNNs, together with Mixture-of-Experts (MoE), have recently emerged as significant architectural improvements. In this paper, we introduce Linear-MoE, a production-level system for modeling and training large-scale models that integrate LSM with MoE. Linear-MoE leverages the advantages of both LSM modules for linear-complexity sequence modeling and MoE layers for sparse activation, aiming to offer high performance with efficient training. The Linear-MoE system comprises two parts: 1) a modeling subsystem, which provides a unified framework supporting all instances of LSM; and 2) a training subsystem, which facilitates efficient training by incorporating various advanced parallelism technologies, particularly Sequence Parallelism designed for Linear-MoE models. Additionally, we explore hybrid models that combine Linear-MoE layers with standard Transformer-MoE layers, together with the corresponding Sequence Parallelism, to further enhance model flexibility and performance. Evaluations on two model series, A0.3B-2B and A1B-7B, demonstrate that Linear-MoE achieves efficiency gains while maintaining competitive performance on various benchmarks, showcasing its potential as a next-generation foundational model architecture. Code: https://github.com/OpenSparseLLMs/Linear-MoE.