Hugging Face Official Course Launched: Free NLP Training

Machine Heart reports
Editor: Du Wei

The Hugging Face NLP course is now live, and it is completely free.

Anyone working in NLP will be familiar with Hugging Face, a startup focused on NLP problems that has contributed a long list of useful tools to the community. Last year, the team's paper on the Transformers library won the Best Demo Paper Award at EMNLP 2020, and in April this year the team released Accelerate, a new PyTorch library for multi-GPU, TPU, and mixed-precision training.
Recently, Hugging Face announced on its official Twitter account the launch of the first part of its NLP course, which teaches how to systematically use the Hugging Face libraries (Transformers, Datasets, Tokenizers, and Accelerate) together with the models on the Hugging Face Hub. Better still, the entire course is free and ad-free.

Course homepage: https://huggingface.co/course/chapter0?fw=pt
The course series is divided into three levels, Introduction, Diving in, and Advanced, as follows:
  • Introduction: Transformer models, using Transformers, fine-tuning pre-trained models, and sharing models and tokenizers;

  • Diving in: Datasets library, Tokenizers library, major NLP tasks, and how to seek help;

  • Advanced: Specialized architectures, accelerated training, custom training loops, and contributing to Hugging Face.

Hugging Face is launching with the Introduction part. Across its four chapters, learners will use the pipeline function to solve NLP tasks such as text generation and classification, get to know the Transformer architecture, and learn to distinguish encoder, decoder, and encoder-decoder architectures and their use cases.
Meanwhile, all libraries used in the course are available as Python packages. Learners first need to set up a Python environment and install the dedicated libraries; either a Colab notebook or a Python virtual environment will do. See the "setup" section on the course homepage for step-by-step instructions.
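As a taste of what Chapter 1 covers, here is a minimal sketch of the pipeline function from the Transformers library. The input strings are made up for illustration, and the checkpoints are simply the library's task defaults, downloaded from the Hub on first use:

```python
# A minimal sketch of the pipeline() API taught in Chapter 1.
# Setup (see the course homepage): pip install transformers
from transformers import pipeline

# Text classification: with no model specified, pipeline() falls back
# to a default sentiment-analysis checkpoint from the Hub.
classifier = pipeline("sentiment-analysis")
print(classifier("I've been waiting for a Hugging Face course my whole life!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]

# Text generation with the default checkpoint for the task.
generator = pipeline("text-generation")
print(generator("In this course, we will teach you how to", max_length=30))
```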
Course Overview
Chapters 1 to 4 (Introduction) introduce the main concepts of the Transformers library. By the end of this part of the course, you will understand how Transformer models work and how to use models from the Hugging Face Hub, fine-tune them on a dataset, and share your results on the Hub.
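To make that concrete, the sketch below fine-tunes a Hub checkpoint with the Trainer API; the bert-base-uncased checkpoint, the GLUE MRPC dataset, and the output directory are illustrative choices, not ones prescribed by the course:

```python
# A minimal fine-tuning sketch with the Trainer API (illustrative choices).
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

raw_datasets = load_dataset("glue", "mrpc")  # sentence-pair paraphrase task
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["sentence1"], batch["sentence2"], truncation=True)

tokenized = raw_datasets.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Passing the tokenizer lets Trainer pad each batch dynamically.
trainer = Trainer(
    model=model,
    args=TrainingArguments("mrpc-finetuned"),  # output directory
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,
)
trainer.train()
```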
Chapters 5 to 8 (Diving in) cover the basics of the Datasets and Tokenizers libraries before delving into classic NLP tasks. By the end of this part, you will be able to solve the most common NLP problems on your own.
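For a glimpse of those two libraries, here is a small sketch; the SQuAD dataset and BERT tokenizer are arbitrary examples, not a required part of the course:

```python
# Datasets: download a public dataset from the Hub and inspect it.
from datasets import load_dataset
from transformers import AutoTokenizer

squad = load_dataset("squad", split="train")
print(squad[0]["question"])  # one training example
print(squad.features)        # column names and types

# Tokenizers: see how raw text maps to token IDs and back.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoding = tokenizer("Hugging Face courses are free!")
print(encoding["input_ids"])
print(tokenizer.convert_ids_to_tokens(encoding["input_ids"]))
```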
Chapters 9 to 12 (Advanced) go deeper, showcasing specialized architectures (memory efficiency, long sequences, etc.) and teaching you how to write custom objects for your own use cases. By the end of this part, you will be ready to tackle complex NLP problems.
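The custom-training-loop material builds on the Accelerate library mentioned earlier. As a sketch of the shape such a loop takes, assuming a model, optimizer, and PyTorch dataloader have already been built (for instance, as in the fine-tuning sketch above):

```python
# A sketch of a device-agnostic training loop with Accelerate.
# Assumes `model`, `optimizer`, and `train_dataloader` already exist.
from accelerate import Accelerator

accelerator = Accelerator()  # detects available GPUs/TPUs and precision
model, optimizer, train_dataloader = accelerator.prepare(
    model, optimizer, train_dataloader
)

model.train()
for batch in train_dataloader:
    outputs = model(**batch)
    loss = outputs.loss
    accelerator.backward(loss)  # replaces the usual loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```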
The following are the contents of Chapters 1 to 4 (Introduction):

[Syllabus screenshots for Chapters 1 to 4 of the course]
Note that to take this course, you need the following background:
  • Good knowledge of Python is required;

  • It is best to have completed basic deep learning courses, such as “Practical Deep Learning for Coders” or the deep learning courses from deeplearning.ai;

  • No prior knowledge of PyTorch or TensorFlow is needed, but familiarity with either will be helpful.

Instructor Introduction

Matthew Carrigan is a machine learning engineer at Hugging Face, previously a machine learning engineer at predictive analytics company Parse.ly and a postdoctoral researcher at Trinity College Dublin.
Lysandre Debut is also a machine learning engineer at Hugging Face and has been working on the Transformers library since its early days.
Sylvain Gugger is a research engineer at Hugging Face and one of the core maintainers of the Transformers library. Previously, he was a research scientist at the non-profit research organization fast.ai and co-authored the book “Deep Learning for Coders with fastai and PyTorch” with fast.ai founder Jeremy Howard. His research focuses on designing and improving techniques that enable models to train quickly on limited resources, making deep learning more accessible.

Amazon Cloud Technology China Summit

The 2021 Amazon Cloud Technology China Summit will be held in Shanghai, Beijing, and Shenzhen. Under the theme "Building a New Pattern, Reshaping the Cloud Era," the summit will bring together leading technical practitioners to share their stories and experience of building for the cloud era.

On July 21-22, 2021, the Shanghai stop of the summit will feature heavyweight industry experts and leaders sharing their insights on-site.

For developers, the summit will host a dedicated developer area and team up with open-source communities such as Apache, along with many open-source experts, to deliver cutting-edge technical content.

For industry audiences, nearly a hundred partners, customers, and Amazon Cloud Technology experts from a range of fields will share industry best practices and the latest technical achievements.

© THE END

For reprint, please contact this public account for authorization

Submissions or requests for coverage: [email protected]
