Post-BERT: Pre-trained Language Models and Natural Language Generation

Author: Tea Book Club of Lao Song | Zhihu Column: NLP and Deep Learning | Research Direction: Natural Language Processing | Source: AINLP. Introduction: BERT has achieved great success in natural language understanding, but it performs poorly in natural language generation because of the language model it uses … Read more

Natural Language Processing: An Overview of Progress and Trends

This article is reprinted from TJUNLP. Xiong Deyi is a professor and PhD supervisor at the Natural Language Processing Laboratory, School of Intelligence and Computing, Tianjin University, and a director of the Chinese Information Processing Society of China. Research areas: machine translation, dialogue, natural language generation, machine reading comprehension and question answering, information extraction, and knowledge graphs. Natural Language Processing (NLP) … Read more