Using RNN for Long-Term Time Series Forecasting

Source: Kaggle Competition Guide. Author: Fareise, excerpted from Yuan Yuan’s Algorithm Notes. Using RNN for long-term time series forecasting, can it beat the Transformer SOTA? Today’s article comes from South China University of Technology and proposes an RNN-based long-term time series forecasting model that outperforms the SOTA Transformer … Read more
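The excerpt does not describe the model itself, so here is a purely generic illustration (not the article’s architecture) of the usual recursive way an RNN produces a long forecast horizon: encode the observed history step by step, then feed each prediction back in as the next input. All names and weights below are untrained placeholders.

```python
import numpy as np

def rnn_step(x, h, Wx, Wh, b):
    """One vanilla RNN step: h_t = tanh(Wx @ x_t + Wh @ h_{t-1} + b)."""
    return np.tanh(Wx @ x + Wh @ h + b)

rng = np.random.default_rng(0)
hidden, horizon = 16, 96                       # hidden size, long forecast horizon
Wx = rng.normal(scale=0.3, size=(hidden, 1))   # placeholder weights (would be learned)
Wh = rng.normal(scale=0.3, size=(hidden, hidden))
b = np.zeros(hidden)
Wo = rng.normal(scale=0.3, size=(1, hidden))   # readout: hidden state -> next value

history = np.sin(np.linspace(0, 6, 48))        # toy observed series, 48 points

# 1) Encode the observed history into the hidden state.
h = np.zeros(hidden)
for x in history:
    h = rnn_step(np.array([x]), h, Wx, Wh, b)

# 2) Decode recursively: each prediction becomes the next input.
preds, x = [], np.array([history[-1]])
for _ in range(horizon):
    h = rnn_step(x, h, Wx, Wh, b)
    x = Wo @ h                                 # predicted next value, shape (1,)
    preds.append(float(x[0]))

print(len(preds), "forecast steps")
```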

Animated RNN, LSTM, and GRU Computation Process

Source | Zhihu · Author | JerryFly · Link | https://zhuanlan.zhihu.com/p/115823190 · Editor | Deep Learning Matters WeChat Official Account. This article is for academic exchange only; if there is any infringement, please contact us for deletion. RNNs are commonly used to handle sequential problems; this article demonstrates the computation processes of RNN, LSTM, and GRU with animated graphics. The three … Read more
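As a text companion to the animations, here is a minimal numpy sketch of one time step of each of the three cells (vanilla RNN, LSTM, GRU), following the standard formulations rather than the article’s figures; biases are omitted, and every `W(...)` call just draws a fresh placeholder matrix where a real model would have its own learned parameters per gate.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def W(rows, cols):
    """Fresh placeholder weights; a real cell has its own learned matrix per gate."""
    return 0.1 * rng.normal(size=(rows, cols))

d, n = 4, 3                                   # input dim, hidden dim
x, h_prev, c_prev = rng.normal(size=d), np.zeros(n), np.zeros(n)

# Vanilla RNN: a single tanh over the input and the previous hidden state.
h_rnn = np.tanh(W(n, d) @ x + W(n, n) @ h_prev)

# LSTM: forget / input / output gates plus a candidate cell state.
f = sigmoid(W(n, d) @ x + W(n, n) @ h_prev)   # forget gate
i = sigmoid(W(n, d) @ x + W(n, n) @ h_prev)   # input gate
o = sigmoid(W(n, d) @ x + W(n, n) @ h_prev)   # output gate
c_tilde = np.tanh(W(n, d) @ x + W(n, n) @ h_prev)
c = f * c_prev + i * c_tilde                  # new cell state
h_lstm = o * np.tanh(c)                       # new hidden state

# GRU: update / reset gates, no separate cell state.
z = sigmoid(W(n, d) @ x + W(n, n) @ h_prev)   # update gate
r = sigmoid(W(n, d) @ x + W(n, n) @ h_prev)   # reset gate
h_tilde = np.tanh(W(n, d) @ x + W(n, n) @ (r * h_prev))
h_gru = (1 - z) * h_prev + z * h_tilde

print(h_rnn.shape, h_lstm.shape, h_gru.shape)  # all (3,)
```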

A Simple Overview of Attention Mechanism

Author: Synced. Compiled by: ronghuaiyang. Introduction: The attention mechanism is neither mysterious nor complex. It is simply an interface composed of parameters and mathematics; you can insert it anywhere appropriate, and it may improve the results. What is Attention? … Read more
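To make “an interface composed of parameters and mathematics” concrete, here is a minimal numpy sketch of scaled dot-product attention over a toy sequence; it is a generic illustration with placeholder projection matrices, not code from the article.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # similarity between queries and keys
    weights = softmax(scores, axis=-1)        # one distribution per query
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model = 5, 8
X = rng.normal(size=(seq_len, d_model))       # toy encoded sequence

# The "parameters" of the interface: learned projections (random placeholders here).
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
context, weights = attention(X @ Wq, X @ Wk, X @ Wv)
print(context.shape, weights.sum(axis=-1))    # (5, 8); each row of weights sums to 1
```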

Introduction to Deep Learning Models: CNN and RNN

Author: Huang Yu, autonomous driving scientist. Editor: Hoh Xil. Source: Huang Yu@Zhihu. Produced by: DataFunTalk. Note: details of the latest autonomous driving salon are at the end of the article; you are welcome to sign up. Introduction: Deep learning has been “hot” for more than ten years since 2006, and the most common applications we see are in … Read more

Discussing the Gradient Vanishing/Explosion Problem in RNNs

Reprinted from | PaperWeekly · ©PaperWeekly Original · Author | Su Jianlin · Unit | Zhuiyi Technology · Research Direction | NLP, Neural Networks. Although Transformer models have conquered most fields in NLP, RNN models such as LSTM and GRU still hold unique value in certain scenarios, making it worthwhile for … Read more
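For reference, the standard backpropagation-through-time argument (a textbook derivation, not taken from Su Jianlin’s post) is that the gradient between distant time steps is a product of per-step Jacobians, so it shrinks or grows geometrically with the distance:

```latex
% Vanilla RNN: h_i = \tanh(a_i), with a_i = W_x x_i + W_h h_{i-1} + b.
% Gradient of the loss at step t with respect to an earlier hidden state h_k:
\frac{\partial \mathcal{L}_t}{\partial h_k}
  = \frac{\partial \mathcal{L}_t}{\partial h_t}
    \prod_{i=k+1}^{t} \frac{\partial h_i}{\partial h_{i-1}}
  = \frac{\partial \mathcal{L}_t}{\partial h_t}
    \prod_{i=k+1}^{t} \operatorname{diag}\!\left(1 - \tanh^2(a_i)\right) W_h
% If each factor has norm below 1, the product vanishes as t - k grows;
% if above 1, it explodes.
```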

Overview of Dropout Application in RNNs

[Introduction] This article provides the background and an overview of Dropout, as well as a parameter analysis of its application to language modeling with LSTM/GRU recurrent neural networks. Author | Adrian G · Compiled by | Zhuanzhi · Organized by | Yingying. Dropout: Inspired by the role of gender in evolution, Hinton et al. first proposed Dropout, which temporarily removes units from the neural … Read more
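As a concrete reference for the “temporarily removes units” idea, here is a minimal numpy sketch of inverted dropout applied to the input (non-recurrent) connection of a vanilla RNN step, one common placement for LSTM/GRU language models; the specific placements and rates analyzed in the article are not reproduced here.

```python
import numpy as np

def dropout(x, p_drop, rng, train=True):
    """Inverted dropout: zero units with prob p_drop, rescale to keep the expectation unchanged."""
    if not train or p_drop == 0.0:
        return x                                   # at test time the layer is a no-op
    mask = rng.random(x.shape) >= p_drop
    return x * mask / (1.0 - p_drop)

rng = np.random.default_rng(0)
d, n, p = 8, 16, 0.5
Wx = rng.normal(scale=0.1, size=(n, d))            # placeholder weights
Wh = rng.normal(scale=0.1, size=(n, n))

h_prev = np.zeros(n)
for x in rng.normal(size=(10, d)):                 # a toy sequence of 10 inputs
    x = dropout(x, p, rng, train=True)             # dropout on the non-recurrent connection
    h_prev = np.tanh(Wx @ x + Wh @ h_prev)         # recurrent connection left untouched
```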

Solving the Vanishing Gradient Problem in RNNs

Author: Yulin Ling. CS224N (1.29): Vanishing Gradients, Fancy RNNs. Vanishing Gradient: The figure below gives a more vivid example. Suppose we need to predict the next word after the phrase “The writer of the books”. Due to the vanishing gradient, the influence of … Read more
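A tiny numeric illustration of that point (a generic sketch under assumed random weights, not from the CS224N notes): backpropagating through repeated tanh steps multiplies the gradient by a small Jacobian at every step, so the signal linking “writer” to a prediction made many words later decays toward zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32
Wh = 0.05 * rng.normal(size=(n, n))            # small recurrent weights (untrained placeholder)

# Frobenius norm of d h_t / d h_k as the distance t - k grows, for a tanh RNN.
grad = np.eye(n)
for step in range(1, 21):
    a = rng.normal(size=n)                     # placeholder pre-activations at this step
    jac = np.diag(1.0 - np.tanh(a) ** 2) @ Wh  # per-step Jacobian: diag(tanh') @ Wh
    grad = jac @ grad
    if step % 5 == 0:
        print(f"distance {step:2d}: ||d h_t / d h_k|| = {np.linalg.norm(grad):.2e}")

# The norm collapses with distance, so the early word ("writer") barely influences a
# prediction made many steps later; with large recurrent weights it would explode instead.
```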

Understanding Recurrent Neural Networks (RNNs)

Datawhale Insights. Focus: Neural Networks. Source: Artificial Intelligence and Algorithm Learning. Neural networks are the carriers of deep learning, and among neural network models none is more classic than the RNN. Although it is not perfect, it possesses the ability to … Read more