ALBERT: A Lightweight BERT That Is Both Light and Effective

Follow the WeChat public account “ML_NLP“ and set it as “Starred“ to receive top content first! Today we are reading the 2019 Google paper “ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations”. We know that a model’s performance improves with increased depth, but deeper models are also harder to train. To address … Read more

XLM: Cross-Lingual BERT

Motivation: BERT has been widely used as a pre-trained language model, but current research focuses mainly on English. In cross-lingual scenarios, can BERT-style training bring improvements? The answer is almost certainly yes; it just depends on how to do it. The paper … Read more

Build a Q&A Search Engine with Bert in 3 Minutes

Most of you have probably heard of the renowned BERT model. It is a “game-changing” pre-trained model for NLP launched by Google, which has set multiple records on NLP tasks and achieved state-of-the-art results. However, many deep learning beginners find that BERT is not easy to set up, … Read more

In-Depth Analysis of BERT Source Code

By Gao Kaiyuan | Image source: Internet. Introduction: I have been reviewing materials related to Paddle, so I decided to take a closer look at the source code of Baidu’s ERNIE. When I skimmed through it before, I noticed that ERNIE 2.0 and ERNIE-tiny are quite similar to BERT. I wonder what changes have been made … Read more

Is BERT’s LayerNorm Different From What You Think?

MLNLP (Machine Learning Algorithms and Natural Language Processing) is a well-known natural language processing community at home and abroad, covering NLP master’s and doctoral students, university professors, and industry researchers. The community’s vision is to promote communication and progress between academia and industry in natural language processing and machine learning, especially for … Read more

Beginner’s Guide to BERT: From Theory to Practice

Author: Jay Alammar; translated by Qbit AI. BERT, as a key player in the field of natural language processing, is something no NLPer can avoid. However, for those with little experience and a weak foundation, mastering BERT can be a bit challenging. … Read more

Decoding BERT: Understanding Its Impact on NLP

Reprinted from the public account AI Developer. Introduction to BERT: it is no exaggeration to say that BERT, from Google’s AI lab, has profoundly reshaped the landscape of NLP. Imagine a model trained on a vast amount of … Read more

From BERT to ALBERT: A Comprehensive Overview

ALBERT, recommended: although the BERT model itself is very effective, this effectiveness relies on a large number of parameters, so the time and resource costs of training a BERT model are very high, and such a complex model can … Read more

Training CT-BERT Model on COVID-19 Data from Twitter

Author: Chen Zhiyan. This article is about 3,000 words; recommended reading time: 7 minutes. It introduces using the BERT model to automatically classify, filter, and summarize the large volume of COVID-19 information on Twitter. Twitter has always been an important source of news, and during the COVID-19 pandemic the public can … Read more

How to Quickly Use BERT?

Source: Zhihu | Link: https://zhuanlan.zhihu.com/p/112235454 | Author: TotoroWang | Editor: Machine Learning Algorithms and Natural Language Processing public account. This article is republished with the author’s permission; further reproduction is prohibited. Introduction: Since I have been working on BERT models … Read more