Making BERT Lighter: Bort, an Optimal Parameter Subset at 16% of the Size

Zheng Jiyang from Aofeisi, QbitAI Report | WeChat Official Account QbitAI. Recently, the Amazon Alexa team published a research result: by performing parameter selection on the BERT model, the researchers obtained Bort, an optimal subset of BERT's parameters. Their results show that Bort is only 16% the size of BERT-large, yet its speed on CPU is 7.9 … Read more

EdgeBERT: Extreme Compression, 13 Times Lighter Than ALBERT!

Machine Heart Reprint. Source: Xixiaoyao's Cute Selling House. Author: Sheryc_Wang Su. There are two kinds of highly challenging engineering projects in this world: the first is to maximize something very ordinary, like scaling a language model until it can write poetry, prose, and code, as GPT-3 does; the other is exactly the opposite, to minimize something very … Read more