

Trillion 7B Technical Report

April 21, 2025
Authors: Sungjun Han, Juyoung Suk, Suyeong An, Hyungguk Kim, Kyuseok Kim, Wonsuk Yang, Seungtaek Choi, Jamin Shin
cs.AI

Abstract

We introduce Trillion-7B, the most token-efficient Korean-centric multilingual LLM available. Our novel Cross-lingual Document Attention (XLDA) mechanism enables highly efficient and effective knowledge transfer from English to target languages like Korean and Japanese. Combined with optimized data mixtures, language-specific filtering, and tailored tokenizer construction, Trillion-7B achieves competitive performance while dedicating only 10% of its 2T training tokens to multilingual data and requiring just 59.4K H100 GPU hours ($148K) for full training. Comprehensive evaluations across 27 benchmarks in four languages demonstrate Trillion-7B's robust multilingual performance and exceptional cross-lingual consistency.
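
To make the XLDA idea concrete, below is a minimal NumPy sketch of what a cross-lingual document attention mask might look like. It assumes, rather than quotes, the paper's construction: training sequences are packed from multiple documents, each token carries a document id and a "pair id" linking an English document to its target-language counterpart, and where standard packing blocks attention across document boundaries, an XLDA-style mask additionally permits attention between paired documents. All names and the pairing scheme here are illustrative, not the report's actual implementation.

# Sketch of an XLDA-style attention mask for a packed sequence.
# Assumption: unpaired documents receive a unique pair id, so the
# pair-id check never links them to anything outside themselves.
import numpy as np

def xlda_style_mask(doc_ids: np.ndarray, pair_ids: np.ndarray) -> np.ndarray:
    """Boolean [T, T] mask: True where query token i may attend to key token j.

    doc_ids  -- document id per token in the packed sequence
    pair_ids -- shared id for documents that form a cross-lingual pair
                (e.g. an English doc and its Korean counterpart)
    """
    T = doc_ids.shape[0]
    causal = np.tril(np.ones((T, T), dtype=bool))        # autoregressive order
    same_doc = doc_ids[:, None] == doc_ids[None, :]      # standard per-doc mask
    same_pair = pair_ids[:, None] == pair_ids[None, :]   # XLDA-style relaxation
    return causal & (same_doc | same_pair)

# Toy pack: doc 0 (English) and doc 1 (Korean) are a pair; doc 2 is unpaired.
doc_ids = np.array([0, 0, 0, 1, 1, 1, 2, 2])
pair_ids = np.array([7, 7, 7, 7, 7, 7, 9, 9])
mask = xlda_style_mask(doc_ids, pair_ids)
# Tokens of doc 1 can attend back into doc 0 (same pair), but doc 2 cannot.
assert mask[4, 1] and not mask[6, 3]

Under these assumptions, the only change from standard document-masked packing is the single `same_pair` disjunct, which is consistent with the abstract's framing of XLDA as an attention mechanism for transferring knowledge from English into Korean and Japanese.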
