How Graphics Cards Empower Artificial Intelligence

Today I want to discuss a fairly niche topic: graphics cards and artificial intelligence. As you may have noticed, the 2023 World Artificial Intelligence Conference (WAIC) is being held in Shanghai these days. Elon Musk opened the event, and AMD’s Lisa Su, along with numerous … Read more

OpenAI Secrets Revealed: A Guide to Training Large Neural Networks

Introduction: Want to know how large-scale neural networks are trained? OpenAI summarizes it in an article: beyond needing many GPUs, the algorithms are also crucial. Many of today’s advances in AI are attributed to large neural networks, especially the pre-trained models released by large companies and research institutions, which have significantly propelled progress on downstream tasks. … Read more

Understanding PyTorch Distributed Training

Source: SenseTime Academic | Editor: Jishi Platform. Introduction: With the widespread adoption of large-scale machine learning, the emergence of ultra-large deep learning models, and the rapid development of distributed learning methods such as federated learning, distributed machine learning model training … Read more

PyTorch Multiprocessing Tutorial

Multiprocessing is a term in computer science that refers to running multiple processes simultaneously, with each process able to execute a different task at the same time. In a computer operating system, a process is the basic unit of resource allocation, and each process has its own … Read more
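As the teaser notes, each process has its own memory, so results must be exchanged through explicit inter-process channels. A minimal sketch with Python’s standard-library `multiprocessing` module (which `torch.multiprocessing` wraps with the same API; the `square` worker and queue-based collection here are illustrative, not from the article):

```python
import multiprocessing as mp

def square(x, queue):
    # Runs in a separate process with its own address space;
    # the result travels back through the shared queue.
    queue.put((x, x * x))

if __name__ == "__main__":
    queue = mp.Queue()
    procs = [mp.Process(target=square, args=(i, queue)) for i in range(4)]
    for p in procs:
        p.start()
    # Drain one result per worker, then reap the processes.
    results = dict(queue.get() for _ in procs)
    for p in procs:
        p.join()
    print(results)  # e.g. {0: 0, 1: 1, 2: 4, 3: 9} (arrival order varies)
```

Because the processes do not share memory, the queue (or a pipe, or shared tensors in the PyTorch case) is the only way the parent sees the workers’ results.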

Exploring Parallel Computation in Non-Linear RNNs

©PaperWeekly Original · Author: Su Jianlin · Affiliation: Scientific Space · Research Direction: NLP, Neural Networks. In recent years, linear RNNs have attracted researchers’ attention thanks to characteristics such as parallelizable training and constant inference cost (for example, my earlier article “Google’s New Work Attempts to ‘Revive’ RNN: Can RNN Shine … Read more
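The parallel-training property of linear RNNs mentioned above can be sketched briefly (this illustration is mine, not from the article): the linear step h_t = a_t · h_{t-1} + b_t composes associatively over pairs (a, b), so the prefix states can be computed with a parallel scan instead of a strictly sequential loop.

```python
def combine(left, right):
    # Composing "apply left, then right" is itself a linear step:
    # right(left(h)) = a2 * (a1 * h + b1) + b2 = (a1*a2) * h + (a2*b1 + b2).
    a1, b1 = left
    a2, b2 = right
    return (a1 * a2, a2 * b1 + b2)

def scan(steps):
    # Inclusive prefix scan written sequentially for clarity; because
    # `combine` is associative, a real implementation can reduce
    # tree-style in O(log n) parallel depth.
    out, acc = [], None
    for s in steps:
        acc = s if acc is None else combine(acc, s)
        out.append(acc)
    return out

# Each prefix pair (a, b) maps the initial state h_0 to a * h_0 + b.
steps = [(0.5, 1.0), (2.0, -1.0), (1.0, 0.5)]
print(scan(steps))  # [(0.5, 1.0), (1.0, 1.0), (1.0, 1.5)]
```

A non-linear step has no such closed-form composition, which is exactly why parallelizing non-linear RNNs, the topic of the article, is harder.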