EdgeBERT: Limit Compression, 13 Times Lighter Than ALBERT!

Machine Heart reprint. Source: Xixiaoyao's Cute Selling House. Author: Sheryc_Wang Su. There are two kinds of highly challenging engineering projects in this world: the first maximizes something very ordinary, like scaling a language model until it writes poetry, prose, and code the way GPT-3 does; the other does exactly the opposite, minimizing something very … Read more

BERT: Training Longer and with More Data to Return to SOTA

Machine Heart report. Contributors: Si Yuan, Qian Zhang. XLNet's championship throne had barely warmed before the plot took another turn. Last month, XLNet comprehensively surpassed BERT on 20 tasks, setting a new record for NLP pre-training models and enjoying a moment of glory. Now, however, just a month … Read more

Further Improvements to GPT and BERT: Language Models Using Transformers

Selected from arXiv. Authors: Chenguang Wang, Mu Li, Alexander J. Smola. Compiled by Machine Heart; participation: Panda. BERT and GPT-2 are currently the two most advanced models in NLP, both built on a Transformer-based architecture. A recent paper from Amazon Web Services proposes several new improvements to Transformers, including architectural enhancements, leveraging prior … Read more

LRC-BERT: Contrastive Learning for Knowledge Distillation

New Intelligence report. Author: Gaode Intelligent Technology Center. [New Intelligence Guide] The R&D team at Gaode Intelligent Technology Center designed a contrastive learning framework for knowledge distillation and, based on it, proposed the COS-NCE loss. The paper has been accepted at AAAI 2021. NLP (natural language processing) plays an important role … Read more

BERT’s Amazing Applications in NLP and Law

Original by Machine Heart. Author: Zeng Xiangji. Editor: Hao Wang. At the 2019 ACM Turing Conference, Professor Zhu Songchun (UCLA) and Dr. Shen Xiangyang (Microsoft Global Executive Vice President) discussed the topic of “Path Choices in the Age of Artificial Intelligence”. Dr. Shen believes the development of artificial intelligence will usher in a golden decade … Read more

Step-By-Step Guide to Sentence Classification Using BERT

Produced by Big Data Digest. Source: GitHub. Compiled by: LYLM, Wang Zhuanzhuan, Li Lei, Qian Tianpei. In recent years, machine learning language models have progressed rapidly, moving beyond the experimental stage and into advanced consumer products. For example, Google recently announced that the BERT model has become the main driving … Read more
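
The guide above covers fine-tuning BERT for sentence classification; the final step of that pipeline is just a linear classifier head on top of BERT's [CLS] sentence embedding. As a toy, stdlib-only illustration of that head (the 4-dimensional vector and weights here are invented stand-ins; a real BERT-base [CLS] embedding is 768-dimensional and the weights are learned during fine-tuning):

```python
import math

def classify(cls_embedding, weights, bias):
    """Linear classification head over a [CLS] sentence embedding:
    logit = w . h + b, then a sigmoid for the (binary) class probability."""
    logit = sum(w * h for w, h in zip(weights, cls_embedding)) + bias
    return 1.0 / (1.0 + math.exp(-logit))

# Toy 4-dim "embedding" standing in for BERT's 768-dim [CLS] vector.
h = [0.2, -0.1, 0.4, 0.05]
w = [1.0, 0.5, -0.3, 2.0]
prob = classify(h, w, bias=0.1)  # a probability in (0, 1)
```

In practice this head is a single `nn.Linear` layer trained jointly with (or on top of) the pretrained encoder, with softmax replacing the sigmoid for more than two classes.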

BERT Model Compression Based on Knowledge Distillation

Big Data Digest authorized reprint from Data Pie. Compiled by: Sun Siqi, Cheng Yu, Gan Zhe, Liu Jingjing. The past year has brought many breakthroughs in language-model research: GPT generates sentences that are convincingly realistic [1], while BERT, XLNet, RoBERTa [2,3,4], and others have swept various NLP rankings as … Read more
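
Knowledge distillation, the technique behind this compression work, trains a small student model to match a large teacher's softened output distribution. This is a minimal stdlib-only sketch of the generic Hinton-style soft-target loss (temperature-scaled KL divergence), not the specific loss of the article above:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T gives a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the softened teacher and student distributions,
    scaled by T^2 so gradients keep a comparable magnitude across temperatures."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that matches the teacher exactly incurs zero loss.
loss_same = distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0])
loss_diff = distillation_loss([2.0, 0.5, -1.0], [-1.0, 0.5, 2.0])
```

During training this soft-target term is usually combined with the ordinary cross-entropy on the gold labels, weighted by a hyperparameter.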

Innovations in the Era of BERT: Progress in Applications Across NLP Fields

Machine Heart column. Author: Zhang Junlin. BERT brought people great surprises, but in the blink of an eye about half a year has passed, and many new BERT-related works have emerged in that time. In recent months, aside from my main work on recommendation algorithms, I have been quite curious about … Read more

Challenges of Training BERT and ViT with a Single GPU in One Day

Pine from Aofeisi. Quantum Bit | Official Account QbitAI. What can you achieve by training BERT on a single GPU in just one day? Researchers have now done exactly that, exploring the true performance of language models under limited compute. In the past, most practitioners focused on the performance of language models under extreme … Read more

Not Just BERT! Top 10 Exciting Ideas in NLP for 2018

Author: Sebastian Ruder; translated by QbitAI. 2018 was a significant year for NLP. Most notable was BERT, which swept various NLP benchmarks and was hailed as the start of a new era in NLP. But 2018 offered more than just BERT. Recently, Irish NLP researcher Sebastian Ruder wrote an … Read more