How to Use BERT and GPT-2 in Your Models

Recommended by New Intelligence · Source: Zhuanzhi (ID: Quan_Zhuanzhi) · Editor: Sanshi. [New Intelligence Guide] Various advanced tools have emerged in NLP recently, but putting them into practice is what matters: how do you apply them to your own models? This article addresses that question. Recently in NLP, various pre-trained language models such as ELMo, GPT, … Read more

BERT-of-Theseus: A Model Compression Method Based on Module Replacement

©PaperWeekly Original · Author|Su Jianlin · Affiliation|Zhuiyi Technology · Research Direction|NLP, Neural Networks. I recently learned about a BERT compression method called "BERT-of-Theseus", from the paper BERT-of-Theseus: Compressing BERT by Progressive Module Replacing. It is a model compression scheme built on the idea of "replaceability". Compared to conventional methods such as pruning and distillation, it appears … Read more
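The replacement idea behind BERT-of-Theseus can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the real method applies it to Transformer layers in BERT and gradually raises the replacement rate over training, whereas here the "modules" are toy functions and all names are hypothetical. During a forward pass, each predecessor (teacher) module is independently swapped for its compact successor (student) module with some probability:

```python
import random

def theseus_forward(predecessors, successors, x, replace_prob, rng=random):
    """One forward pass with stochastic module replacement.

    Each predecessor module is independently replaced by its successor
    with probability `replace_prob`. Annealing this probability toward 1
    progressively hands the computation over to the successor network,
    which is what lets the compact model learn to stand in for the original.
    """
    for pred, succ in zip(predecessors, successors):
        module = succ if rng.random() < replace_prob else pred
        x = module(x)
    return x

# Toy stand-ins for BERT layers (illustration only).
teacher = [lambda x: x + 1, lambda x: x + 1]
student = [lambda x: x * 2, lambda x: x * 2]

print(theseus_forward(teacher, student, 0, 0.0))  # → 2 (teacher modules only)
print(theseus_forward(teacher, student, 1, 1.0))  # → 4 (student modules only)
```

With `replace_prob` between 0 and 1, each pass runs a random mix of teacher and student modules, which is the mechanism the paper uses to train the compressed model in place.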

When BERT Meets Knowledge Graphs

Author: Gao Kaiyuan · School: Shanghai Jiao Tong University · Research Direction: Natural Language Processing · Zhihu Column: BERT on the Shoulders of Giants · Original Article Link: https://zhuanlan.zhihu.com/p/91052495 Introduction: In a previous post, I discussed some knowledge representation learning models. Today, let's look at the currently most popular BERT model and how it evolves with the addition of external … Read more

Understanding BERT: Principles, Code, Models, and Fine-tuning Techniques

In October 2018, the BERT model released by Google made a stunning debut, sweeping the leaderboards and even surpassing human baseline scores: a milestone breakthrough for NLP. Today, BERT is an essential tool for NLP algorithm engineers. "What if there's too little data?" "Just fine-tune BERT!" "What if RNN … Read more

Is BERT Perfect? Do Language Models Truly Understand Language?

Machine Heart Release · Author: Tony, Researcher at Zhuiyi Technology AI Lab. As everyone knows, language models like BERT are now widely used in natural language processing. Yet a question sometimes arises: do these language models truly understand language? Experts and scholars disagree on this. The author of this article elaborates on the topic … Read more

How Well Can BERT Solve Elementary Math Problems?

©PaperWeekly Original · Author|Su Jianlin · Unit|Zhuiyi Technology · Research Direction|NLP, Neural Networks. [Figure: The Years of "Chickens and Rabbits in the Same Cage"] "Profit and loss problems", "age problems", "tree planting problems", "cows eating grass problems", "profit problems"… Were you ever tormented by these kinds of math word problems in elementary school? No worries, machine … Read more

Innovations in the Era of BERT: Comparison of BERT Application Models and More

Author: Dr. Zhang Junlin, Senior Algorithm Expert at Sina Weibo · Zhihu Column: Notes on the Frontiers of Deep Learning. This article is published with authorization; click "Read the original" at the end of the article to go directly to: https://zhuanlan.zhihu.com/p/65470719 For the past two months, I have been closely following the current application status … Read more

How BERT Understands Language: Google’s LIT Interactive Platform

New Intelligence Report · Editor: QJP. [New Intelligence Guide] As NLP models grow more powerful and are deployed in real-world scenarios, understanding the predictions these models make becomes ever more important. Google recently released the Language Interpretability Tool (LIT), a new approach to explaining and analyzing NLP models that makes their results less of … Read more

Summary of BERT Related Papers, Articles, and Code Resources

BERT has been very popular recently, so let's gather some related resources, including papers, code, and interpretive articles. 1. Official Google resources: 1) BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Everything started with this paper, released by Google in October 2018, which instantly ignited the entire AI community, social media included: https://arxiv.org/abs/1810.04805 2) GitHub: … Read more