Have You Read the Bert Source Code?

Author: Old Song’s Tea Book Club | Zhihu Column: NLP and Deep Learning | Research Direction: Natural Language Processing. Introduction: A few days ago, during an interview, the interviewer asked me point-blank to analyze the BERT source code. Emm, that was … Read more

12x Speedup for Bert Inference with One Line of Code

The MLNLP community is a well-known machine learning and natural language processing community in China and abroad, covering NLP master’s and doctoral students, university faculty, and corporate researchers. The community’s vision is to promote communication and progress between academia and industry in natural language processing and machine learning, especially for beginners. Reprinted … Read more

Master Bert Source Code in 10 Minutes (PyTorch Version)

Deploying BERT in production environments requires compression, which in turn demands a deep understanding of BERT’s structure. This repository interprets the BERT source code (PyTorch version) step by step. The repository can be found at https://github.com/DA-southampton/NLP_ability. Code and Data Introduction: First, for the code, I referenced this repository; I directly cloned the code … Read more
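
The article walks through BERT’s source code with an eye toward compression. As a rough orientation only (not code from the article’s repository), here is a minimal sketch of inspecting BERT’s module structure, assuming the Hugging Face transformers library and the bert-base-chinese checkpoint:

```python
# Minimal sketch, not from the article's repository: inspect BERT's structure
# with Hugging Face transformers, assuming the bert-base-chinese checkpoint.
from transformers import BertModel

model = BertModel.from_pretrained("bert-base-chinese")

# Top-level modules: embeddings, the 12-layer Transformer encoder, and a pooler.
for name, module in model.named_children():
    print(name, type(module).__name__)

# The parameter count gives a sense of where compression effort would go.
total = sum(p.numel() for p in model.parameters())
print(f"total parameters: {total / 1e6:.1f}M")
```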

Is BERT’s LayerNorm Different From What You Think?

MLNLP (Machine Learning Algorithms and Natural Language Processing) is a well-known natural language processing community both domestically and internationally, covering NLP master’s and doctoral students, university professors, and corporate researchers. The vision of the community is to promote communication and progress between the academic and industrial sectors of natural language processing and machine learning, especially for … Read more

Common Pitfalls When Practicing BERT

Source | Zhihu. Address | https://zhuanlan.zhihu.com/p/69389583. Author | Lao Song’s Tea Book Club. Editor | Machine Learning Algorithms and Natural Language Processing public account. This article is shared for academic purposes only; in case of infringement, please contact the backend … Read more

Deep Learning Model Training and Debugging: Efficient Tools and Concepts (Part 1)

“IT has something to talk about” is a professional IT information and service platform under the Machinery Industry Press, dedicated to helping readers master professional, practical knowledge and skills across the broad IT field and quickly strengthen their workplace competitiveness. PART 1: Dataset. In PyTorch, a … Read more
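
The excerpt breaks off just as it introduces the Dataset concept. As a minimal sketch (not taken from the article), assuming a simple in-memory list of (text, label) pairs, a custom PyTorch Dataset and its DataLoader look like this:

```python
# Minimal sketch of a custom PyTorch Dataset; illustrative only, not the
# article's code. Assumes an in-memory list of (text, label) pairs.
from torch.utils.data import Dataset, DataLoader

class PairDataset(Dataset):
    def __init__(self, samples):
        self.samples = samples          # list of (text, label) tuples

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        text, label = self.samples[idx]
        return text, label

# DataLoader adds batching and shuffling on top of the Dataset.
loader = DataLoader(PairDataset([("hello", 0), ("world", 1)]), batch_size=2, shuffle=True)
for texts, labels in loader:
    print(texts, labels)
```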

Stunning Ideas and Techniques in Deep Learning

Hello everyone, I am Hua Ge. This article summarizes the stunning ideas, algorithms, and papers in deep learning. Stunning Ideas: Attention Mechanism. The core idea of the attention mechanism is to let the model assign different attention weights to the data it processes according to its importance. This mechanism enables the model to … Read more
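
As a concrete reference for the idea described in the excerpt (illustrative only, not code from the article), here is a minimal sketch of scaled dot-product attention, the formulation used in Transformers; q, k, and v are hypothetical example tensors:

```python
# Minimal sketch of scaled dot-product attention; illustrative only.
import math
import torch

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, seq_len, d_model)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # similarity of each query to each key
    weights = torch.softmax(scores, dim=-1)            # attention weights sum to 1 per query
    return weights @ v                                 # importance-weighted sum of values

q = k = v = torch.randn(1, 4, 8)   # hypothetical example tensors
print(scaled_dot_product_attention(q, k, v).shape)  # torch.Size([1, 4, 8])
```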

Practical NLP: Chinese Named Entity Recognition

Original Link: https://blog.csdn.net/MaggicalQ/article/details/88980534. Introduction: This project uses PyTorch as the main tool to implement different models (including HMM, CRF, Bi-LSTM, and Bi-LSTM+CRF) to solve Chinese named entity recognition. The article does not involve much mathematical derivation but will … Read more
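
As an orientation for the model family listed in the excerpt (illustrative only, not the project’s code), here is a minimal sketch of a Bi-LSTM sequence tagger in PyTorch; the vocabulary size, tag count, and dimensions are hypothetical, and the Bi-LSTM+CRF variant would add a CRF layer on top of these per-token emission scores:

```python
# Minimal sketch of a Bi-LSTM tagger for NER; illustrative only, not the
# project's code. Vocabulary size, tag count, and dimensions are hypothetical.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size=5000, tag_size=7, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(hidden_dim, tag_size)   # per-token emission scores over tags

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq, embed_dim)
        h, _ = self.lstm(x)         # (batch, seq, hidden_dim)
        return self.fc(h)           # (batch, seq, tag_size)

scores = BiLSTMTagger()(torch.randint(0, 5000, (2, 10)))
print(scores.shape)  # torch.Size([2, 10, 7])
```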

Summary of Four Common NLP Frameworks

Reprinted from the public account: Harbin Institute of Technology SCIR. Authors: Di Donglin, Liu Yuanxing, Zhu Qingfu, and Hu Jingwen (Harbin Institute of Technology SCIR). Introduction: With the development of artificial intelligence, more and more deep learning frameworks have … Read more

How NLP Beginners Should Start in the Era of Large Models

The entry point is simple and straightforward: build the essential foundations and then sprint into Transformers. In the era of large models, traditional algorithms such as word segmentation and part-of-speech tagging have largely been replaced, so there is no need to spend much energy on them at the beginning stage. Mathematics and Programming … Read more