ALBERT: A Lightweight BERT That Is Both Light and Effective
Today we are reading the 2019 Google paper "ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations". We know that a model's performance tends to improve as it gets deeper, but deeper models are also harder to train. To address this, the paper proposes ALBERT, a lighter variant of BERT.