Redefining NLP Rules: From Word2Vec and ELMo to BERT

Introduction: Remember how, not long ago in machine reading comprehension, Microsoft and Alibaba surpassed human performance on SQuAD with R-Net+ and SLQA respectively, and Baidu topped the MS MARCO leaderboard with V-Net while exceeding human performance on BLEU? These networks have grown increasingly complex, and it seems that the research … Read more

Can NLP Work Like the Human Brain? Insights from CMU and MIT

Machine Heart Analyst Network | Analyst: Wu Jiying | Editor: Joni Zhong. As an important research topic in computer science and artificial intelligence, Natural Language Processing (NLP) has been studied and discussed extensively across many domains. As research has deepened, some scholars have begun to explore whether there are connections between natural language … Read more

Detailed Explanation of HuggingFace BERT Source Code

Reprinted from PaperWeekly | ©PaperWeekly Original · Author: Li Luoqiu (Master’s student at Zhejiang University; research interests: natural language processing, knowledge graphs). This article records my understanding of the code in the HuggingFace open-source Transformers project. As we all … Read more
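For context, the library the article dissects is typically invoked along the following lines. This is only a minimal sketch of standard Transformers usage, not code from the article itself; the checkpoint name bert-base-uncased and the example sentence are illustrative choices.

```python
# Minimal sketch: load a pretrained BERT encoder with HuggingFace Transformers
# and run one sentence through it. The checkpoint choice is illustrative.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

inputs = tokenizer("BERT encodes this sentence.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds one 768-dimensional vector per input token
print(outputs.last_hidden_state.shape)
```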

Lightweight BERT: Bort, an Optimal Parameter Subset at 16% of the Size

Zheng Jiyang from Aofeisi | QbitAI report | WeChat official account QbitAI. Recently, the Amazon Alexa team released a research result: by performing parameter selection on the BERT model, the researchers obtained Bort, an optimal parameter subset of BERT. The results indicate that Bort is only 16% the size of BERT-large, while its speed on CPU is 7.9 … Read more

EdgeBERT: Extreme Compression, 13 Times Lighter Than ALBERT!

Machine Heart reprint | Source: Xixiaoyao’s Cute Selling House | Author: Sheryc_Wang Su. There are two kinds of highly challenging engineering projects in this world: the first is to make something ordinary as large as possible, like scaling a language model until it can write poetry, prose, and code the way GPT-3 does; the other is exactly the opposite: to minimize something very … Read more