PhiloBERTA: A Transformer-Based Cross-Lingual Analysis of Greek and Latin Lexicons
March 7, 2025
Authors: Rumi A. Allbert, Makai L. Allbert
cs.AI
Abstract
We present PhiloBERTA, a cross-lingual transformer model that measures
semantic relationships between ancient Greek and Latin lexicons. Through
analysis of selected term pairs from classical texts, we use contextual
embeddings and angular similarity metrics to identify precise semantic
alignments. Our results show that etymologically related pairs demonstrate
significantly higher similarity scores, particularly for abstract philosophical
concepts such as epistēmē (scientia) and dikaiosynē (iustitia).
Statistical analysis reveals consistent patterns in these relationships (p =
0.012), with etymologically related pairs showing remarkably stable semantic
preservation compared to control pairs. These findings establish a quantitative
framework for examining how philosophical concepts moved between Greek and
Latin traditions, offering new methods for classical philological research.
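The abstract's angular similarity metric can be illustrated with a minimal sketch. The function below is a standard formulation (1 minus the normalized angle between vectors), not the paper's actual implementation; the embedding vectors are hypothetical stand-ins for contextual embeddings of a Greek–Latin term pair.

```python
import numpy as np

def angular_similarity(u, v):
    """Map the angle between two vectors to a score in [0, 1]."""
    # Cosine similarity, clipped to [-1, 1] for numerical safety
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    cos = np.clip(cos, -1.0, 1.0)
    # 1.0 for identical directions, 0.5 for orthogonal, 0.0 for opposite
    return 1.0 - np.arccos(cos) / np.pi

# Hypothetical low-dimensional embeddings for a term pair
greek_vec = np.array([0.21, 0.70, 0.12])
latin_vec = np.array([0.25, 0.66, 0.15])
score = angular_similarity(greek_vec, latin_vec)
```

Unlike raw cosine similarity, the angular form is a proper distance-derived score, which makes averages and significance tests over many term pairs better behaved.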