XLM: Cross-Lingual BERT

Motivation: BERT has been widely adopted as a pre-trained language model, but current research focuses mainly on English. Can BERT-style pretraining bring improvements in cross-lingual scenarios? The answer is almost certainly yes; the question is how to do it. The paper … Read more

What To Do When Word2Vec Lacks Words?

Editor: Yi Zhen. Source: https://www.zhihu.com/question/329708785 The author found an interesting question on Zhihu: what to do when Word2Vec lacks words … Read more
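As a hedged illustration of the out-of-vocabulary problem the question raises, one common remedy is FastText-style subword averaging: represent an unseen word by averaging the vectors of its character n-grams. The sketch below uses toy dictionaries (`word_vecs`, `subword_vecs`) and a helper `vector_for` that are all hypothetical names, not from the article; real systems would use trained embeddings (e.g. gensim's FastText).

```python
# Sketch of FastText-style fallback for words missing from a Word2Vec
# vocabulary. All vectors here are toy values for illustration only.

def char_ngrams(word, n_min=3, n_max=5):
    """Character n-grams of a word wrapped in boundary markers, as FastText does."""
    w = f"<{word}>"
    return [w[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)]

def vector_for(word, word_vecs, subword_vecs, dim=3):
    """Return the stored vector; for OOV words, average known subword vectors."""
    if word in word_vecs:
        return word_vecs[word]
    grams = [subword_vecs[g] for g in char_ngrams(word) if g in subword_vecs]
    if not grams:
        return [0.0] * dim  # truly unknown: zero vector (or skip the word)
    return [sum(vals) / len(grams) for vals in zip(*grams)]

# Toy usage: "cat" is in vocabulary, "cats" is not but shares subwords.
word_vecs = {"cat": [1.0, 2.0, 3.0]}
subword_vecs = {"<ca": [2.0, 0.0, 0.0], "ts>": [0.0, 2.0, 0.0]}
print(vector_for("cat", word_vecs, subword_vecs))   # stored vector
print(vector_for("cats", word_vecs, subword_vecs))  # averaged subword vectors
```

Zero vectors for words with no known subwords are a simplification; averaging context-word vectors or mapping all unknowns to a trained `<UNK>` token are common alternatives.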