From AlexNet to BERT: A Simple Review of Key Ideas in Deep Learning

Follow the WeChat public account “ML_NLP” and set it as “Starred” to get major content first! Source | Big Data Digest; Translation | Ao🌰viYa, Meat Bun, Andy. This article by Denny Britz summarizes the important ideas in deep learning over time. Recommended for newcomers, it lists almost all the key ideas since 2012 that have supported … Read more

ALBERT: A Lightweight BERT for Self-Supervised Learning in Language Representation

Written by Radu Soricut and Zhenzhong Lan, Researchers, Google Research. Since the advent of BERT a year ago, natural language research has adopted a new paradigm: leveraging large amounts of existing text to pre-train model parameters in a self-supervised manner … Read more

Have You Read the BERT Source Code?

Author: Lao Song's Tea Book Club. Zhihu Column: NLP and Deep Learning. Research Direction: Natural Language Processing. Introduction: A few days ago, during an interview, the interviewer asked me point-blank to analyze the BERT source code. Emm, that was … Read more

RoBERTa: Fine-Tuning BERT

Recently I have needed to start using Transformers for some tasks, so I am recording the related knowledge points to build a complete and coherent knowledge structure. The following are the planned articles; this article is the fourth in this … Read more

A Detailed Explanation from Transformer to BERT Model

Table of Contents: A Brief Review of ELMo and the Transformer; DAE and the Masked Language Model; A Detailed Explanation of the BERT Model; Different Training Methods for the BERT Model; How to Apply the BERT Model in Real Projects; How to Slim Down BERT; Problems with BERT … Read more
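The “Masked Language Model” item in the table of contents above refers to BERT's pre-training objective: randomly hide a fraction of the input tokens and train the model to recover them. A minimal, library-free sketch of the masking step (the `[MASK]` token and the 15% rate are BERT's standard defaults; the function name and simplifications are this sketch's own, and real BERT additionally replaces some chosen tokens with random words or leaves them unchanged):

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=1):
    """Corrupt a token sequence for masked-language-model training.

    Returns the masked sequence plus a {position: original_token} map,
    which serves as the set of prediction targets.
    """
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked.append(mask_token)  # hide this token from the model
            targets[i] = tok           # remember what should be predicted
        else:
            masked.append(tok)
    return masked, targets

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, targets = mask_tokens(tokens)
```

During pre-training, the model sees `masked` and is trained to output the entries of `targets` at the masked positions; the DAE (denoising autoencoder) framing in the table of contents views this as reconstructing clean text from a corrupted input.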

Master the BERT Source Code in 10 Minutes (PyTorch Version)

Applying BERT in production environments requires compression, which in turn demands a deep understanding of the BERT architecture. This repository interprets the BERT source code (PyTorch version) step by step; it can be found at https://github.com/DA-southampton/NLP_ability. Code and Data Introduction: First, for the code, I referenced this repository; I directly cloned the code … Read more

Post-BERT: Pre-trained Language Models and Natural Language Generation

Wishing you a prosperous Year of the Rat and a happy New Year 2020! Author: Lao Song's Tea Book Club. Zhihu Column: NLP and Deep Learning. Research Direction: Natural Language Processing. Source: AINLP. Introduction: BERT has achieved great success in natural language understanding, but it performs poorly in natural language generation due to the language model used … Read more

ALBERT: A Lightweight BERT That Is Both Light and Effective

Today we read Google's 2019 paper, “ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations”. We know that a model's performance improves with increased depth, but deeper models also make training more difficult. To address … Read more
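One of the parameter-reduction ideas the ALBERT paper is known for (alongside cross-layer parameter sharing) is factorized embedding parameterization: instead of tying the embedding size to the hidden size H, ALBERT embeds the vocabulary into a small space E and projects up to H. A quick back-of-the-envelope parameter count; the sizes below are illustrative values in the spirit of the paper, not figures quoted from this article:

```python
# Factorized embedding parameterization: V*H  ->  V*E + E*H, with E << H.
V = 30_000  # vocabulary size (WordPiece vocabularies are around 30k)
H = 4_096   # hidden size (illustrative large-model value)
E = 128     # small embedding size, as used by ALBERT

bert_style_params = V * H           # one big V x H embedding table
albert_style_params = V * E + E * H  # small V x E table + E x H projection

print(f"tied embeddings:       {bert_style_params:,}")    # 122,880,000
print(f"factorized embeddings: {albert_style_params:,}")  # 4,364,288
```

The embedding block shrinks by more than 25x at this scale, which is part of how ALBERT stays light while still training deep, wide Transformer stacks.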

In-Depth Analysis of BERT Source Code

By Gao Kaiyuan. Image source: Internet. Introduction: [email protected] I have been reviewing materials related to Paddle, so I decided to take a closer look at the source code of Baidu's ERNIE. When I skimmed through it before, I noticed that ERNIE 2.0 and ERNIE-tiny are quite similar to BERT; I wonder what changes have been made … Read more

Decoding BERT: Understanding Its Impact on NLP

Reprinted from the public account: AI Developer. Introduction to BERT: It is no exaggeration to say that the BERT model from Google's AI Lab has profoundly reshaped the NLP landscape. Imagine a model trained on a vast amount of … Read more