LRC-BERT: Contrastive Learning for Knowledge Distillation

New Intelligence Report. Author: Gaode Intelligent Technology Center. [New Intelligence Guide] The R&D team at the Gaode Intelligent Technology Center designed a contrastive-learning framework for knowledge distillation and, building on it, proposed the COS-NCE loss. The paper has been accepted by AAAI 2021. NLP (natural language processing) plays an important role …

BERT’s Amazing Applications in NLP and Law

Original by Machine Heart. Author: Zeng Xiangji. Editor: Hao Wang. At the 2019 ACM Turing Conference, Professor Zhu Songchun (UCLA) and Dr. Shen Xiangyang (Microsoft Global Executive Vice President) discussed the topic "Path Choices in the Age of Artificial Intelligence". Dr. Shen believes the development of artificial intelligence will usher in a golden decade …

Step-By-Step Guide to Sentence Classification Using BERT

Produced by Big Data Digest. Source: GitHub. Compiled by: LYLM, Wang Zhuanzhuan, Li Lei, Qian Tianpei. In recent years, machine learning models for language processing have progressed rapidly, moving beyond the experimental stage and into advanced consumer products. For example, Google recently announced that the BERT model has become the main driving …

BERT Model Compression Based on Knowledge Distillation

Big Data Digest, authorized reprint from Data Pie. Compiled by: Sun Siqi, Cheng Yu, Gan Zhe, Liu Jingjing. In the past year there have been many breakthroughs in language-model research: GPT generates sentences that are convincingly realistic [1], while BERT, XLNet, RoBERTa [2,3,4], and others have swept various NLP leaderboards as …

Innovations in the Era of BERT: Progress in Applications Across NLP Fields

Machine Heart Column. Author: Zhang Junlin. BERT took people by surprise, and in the half year since its release, many new BERT-related works have emerged. In recent months, aside from my main work on recommendation algorithms, I have been quite curious about …

Challenges of Training BERT and ViT with Single GPU in One Day

Pine from Aofeisi. Quantum Bit | Official Account QbitAI. What can you achieve by training BERT on a single GPU in just one day? Researchers have now done exactly this, probing the true performance of language models under limited compute. In the past, most practitioners focused on the performance of language models under extreme …

Not Just BERT! Top 10 Exciting Ideas in NLP for 2018

Author: Sebastian Ruder. Translated by QbitAI. 2018 was a significant year for NLP. The most notable development was BERT, which swept various NLP benchmarks and was hailed as the beginning of a new era in NLP. But 2018 offered more than just BERT. Recently, Irish NLP researcher Sebastian Ruder wrote an …

Comparative Evaluation of BERT and ERNIE in NLP

Yunzhong from Aofeisi. Quantum Bit Report | WeChat Official Account QbitAI. How do BERT and ERNIE, the two most closely watched models in NLP, actually perform? Recently, someone ran a head-to-head comparison, and the results in a Chinese-language environment were surprising and delightful. What are the specific details? Let's take a look at this technical …

Innovations in the Era of BERT: Applications of BERT in NLP

Article author: Zhang Junlin, Senior Algorithm Expert at Weibo AI Lab. Content source: Deep Learning Frontier Notes, a Zhihu column. Community production: DataFun. Note: Submissions to DataFun via messages to the official account are welcome. BERT took people by surprise, but about half a year has passed since then, and …

Understanding BERT Transformer: More Than Just Attention Mechanism

Jointly produced by Big Data Digest and Baidu NLP. Author: Damien Sileo. Translators: Zhang Chi, Yi Hang, Long Xin Chen. BERT is a natural language processing model recently proposed by Google. It performs exceptionally well on many tasks, such as question answering, natural language inference, and paraphrasing, and it is open-source. It is therefore very popular …