Nine Research Hotspots in Natural Language Processing

On the morning of May 23, 2020, at the “ACL-IJCAI-SIGIR Top Conference Paper Reporting Meeting (AIS 2020)” organized by the Youth Working Committee of the Chinese Information Processing Society of China and hosted by the Beijing Academy of Artificial Intelligence (BAAI, Zhiyuan) and Meituan-Dianping, Jia Jia, a BAAI young scientist and doctoral supervisor at the Department of Computer … Read more

Natural Language Processing: An Overview of Progress and Trends

This article is reprinted from TJUNLP. Xiong Deyi is a professor and PhD supervisor at the Natural Language Processing Laboratory, School of Intelligence and Computing, Tianjin University, and a director of the Chinese Information Processing Society of China. Research areas: machine translation, dialogue, natural language generation, machine reading comprehension and question answering, information extraction, and knowledge graphs. Natural Language Processing (NLP) … Read more

An Overview of Natural Language Processing by Sun Maosong

This article is reprinted from Language Monitoring and Intelligent Learning. Written by Sun Maosong. The importance of human language (i.e., natural language) cannot be overstated. Edward O. Wilson, the father of sociobiology, once said, “Language is the greatest evolutionary achievement after eukaryotic cells.” James Gleick, author of the popular science bestseller “The Information: … Read more

Introduction to Natural Language Processing

Introduction: Natural Language Processing is a subfield of computer science, information engineering, and artificial intelligence concerned with the interaction between computers and human languages, and with programming computers to process and analyze large amounts of natural language data. Natural Language Processing (NLP) = Computer Science + AI + Computational Linguistics. In other words, natural language processing is the ability … Read more

Understanding Natural Language Processing

This article was first published on the WeChat public account “Intelligent Cube”. Author introduction: Liu Zhiyuan, Assistant Researcher at Tsinghua University and Senior Member of the China Computer Federation. He obtained his PhD from Tsinghua University in 2011 and won the Excellent Doctoral Dissertation Award from the Chinese Association for Artificial Intelligence. His main research areas are … Read more

Comprehensive Guide to Seq2Seq Attention Model

Source: Zhihu. Link: https://zhuanlan.zhihu.com/p/40920384. Author: Yuanche.Sh. Editor: Machine Learning Algorithms and Natural Language Processing WeChat account. This article is for academic sharing only; if there is any infringement, please contact us to delete it. … Read more

Implementing EncoderDecoder + Attention with PaddlePaddle

Author: Fat Cat, Yi Zhen. Zhihu column: Machine Learning Algorithms and Natural Language Processing. Link: https://zhuanlan.zhihu.com/p/82477941. Natural Language Processing (NLP) is generally divided into two categories: Natural Language Understanding (NLU) and Natural Language Generation (NLG). The former extracts or analyzes concise logical information from a piece of text; for example, Named Entity Recognition (NER) identifies keywords in … Read more

Latest Overview of Attention Mechanism Models

Source: Zhuanzhi. Recommended reading time: 5 minutes. [Introduction] The Attention model has become an important concept in neural networks, and this article brings you the latest overview of the model, detailing its concept, definition, impact, and how to get started with practical work. … Read more

Understanding Attention Mechanism in Language Translation

Author: Tianyu Su. Zhihu column: Machines Don’t Learn. Link: https://zhuanlan.zhihu.com/p/27769286. In the previous column, we implemented a basic version of the Seq2Seq model, which sorts letters: it takes an input sequence of letters and returns the sorted sequence. Through that implementation we gained an understanding of the Seq2Seq model, which mainly … Read more

Understanding Transformer Models: A Comprehensive Guide

Author: Chen Zhiyan. This article is approximately 3500 words long and is recommended as a 7-minute read. The Transformer is the first model to rely entirely on the self-attention mechanism to compute representations of its input and output. Mainstream sequence-to-sequence models are based on encoder-decoder recurrent or convolutional neural networks; the introduction of the … Read more
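
The self-attention computation the excerpt above refers to can be sketched in a few lines of NumPy. This is a minimal illustration only (the tokens, weight matrices, and dimensions are made up for the example, not taken from any of the linked articles): each token's output is a weighted sum of value vectors, with weights from a softmax over scaled query-key dot products.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V — the core of self-attention."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # (seq_q, seq_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax
    return weights @ V                             # weighted sum of values

# Toy example: 3 tokens, model dimension 4. In self-attention, Q, K, and V
# are all linear projections of the same input sequence X.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
Wq, Wk, Wv = (rng.normal(size=(4, 4)) for _ in range(3))
out = scaled_dot_product_attention(X @ Wq, X @ Wk, X @ Wv)
print(out.shape)  # (3, 4): one context vector per input token
```

A full Transformer layer repeats this in several parallel “heads” and adds residual connections and a feed-forward sublayer, but the sketch above is the piece that replaces recurrence.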