Industry Summary | BERT’s Various Applications

The MLNLP (Machine Learning Algorithms and Natural Language Processing) community is one of the largest natural language processing communities in China and abroad, gathering over 500,000 subscribers, with an audience covering NLP master's and doctoral students, university teachers, and corporate researchers. The community's vision is to promote communication and progress between the academic and industrial … Read more

NLP Pre-training Models in the Post-BERT Era

This article introduces several papers that improve BERT's pretraining process, including Pre-Training with Whole Word Masking for Chinese BERT, ERNIE: Enhanced Representation through Knowledge Integration, and ERNIE 2.0: A Continual Pre-training Framework for Language Understanding. Note: these papers each make different improvements to the masking used in BERT's pretraining phase, but do not modify … Read more
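The common thread in the papers above is masking whole linguistic units rather than isolated WordPiece fragments. A minimal sketch of the idea, assuming the usual `##` continuation-marker convention for subword tokens (the helper names and example tokens here are illustrative, not from any of the papers' actual code):

```python
# Hedged sketch of whole-word masking (the idea behind BERT-wwm and,
# generalized to phrases/entities, ERNIE's knowledge masking): once any
# subword of a word is selected for masking, every subword of that word
# is masked, so the model must predict the full word from context.

def group_whole_words(tokens):
    """Group WordPiece tokens into whole words; '##' marks a continuation."""
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)   # continuation piece joins the current word
        else:
            words.append([i])     # start of a new word
    return words

def whole_word_mask(tokens, word_ids_to_mask, mask_token="[MASK]"):
    """Replace every subword of each selected word with the mask token."""
    words = group_whole_words(tokens)
    masked = list(tokens)
    for wid in word_ids_to_mask:
        for idx in words[wid]:
            masked[idx] = mask_token
    return masked

tokens = ["the", "phil", "##harm", "##onic", "played"]
# Original BERT might mask only "##harm", leaking the rest of the word;
# whole-word masking masks "phil ##harm ##onic" as a unit.
print(whole_word_mask(tokens, [1]))
# → ['the', '[MASK]', '[MASK]', '[MASK]', 'played']
```

For Chinese text, where one character is one token, the same grouping step would instead rely on a word segmenter, which is the adaptation the Chinese BERT-wwm paper makes.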

Comparative Evaluation of BERT and ERNIE in NLP

Yunzhong from Aofeisi, Quantum Bit Report | WeChat Official Account QbitAI. How do BERT and ERNIE, the two most closely watched models in the NLP field, actually perform? Recently, someone ran a head-to-head comparison, and in a Chinese-language environment the results were both surprising and delightful. What are the specific details? Let's take a look at this technical … Read more