XLM: Cross-lingual BERT


Motivation

BERT is widely used as a pre-trained language model, but current research focuses mainly on English. Can BERT-style pretraining also bring improvements in cross-lingual scenarios? The answer is almost certainly yes; the question is how to do it. The paper …