Revisiting Transformer: Inversion More Effective, New SOTA for Real-World Prediction

Transformers have shown strong capabilities in time series forecasting, modeling pairwise dependencies and extracting multi-level representations from sequences. However, researchers have also questioned the effectiveness of Transformer-based forecasters: these models typically embed the values of multiple variates at the same timestamp into indistinguishable channels, then attend over these time tokens to capture temporal dependencies. Considering … Read more

Time Series + Transformer: Understanding iTransformer

This article is about 3,500 words long; a 10-minute read is recommended. It will help you understand iTransformer and make better use of the attention mechanism for modeling multivariate correlation. 1 Introduction Transformers perform excellently in natural language processing and computer vision, but they do not outperform linear models in … Read more
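The inversion described in the two iTransformer teasers above boils down to how the input is tokenized: a vanilla Transformer turns each timestamp (mixing all variates) into one token, while iTransformer turns each variate's whole series into one token, so attention runs across variates. A minimal numpy sketch of just this token-shaping step, with illustrative sizes and weight names (`W_time`, `W_var`) that are not from the paper:

```python
import numpy as np

# Toy multivariate series: T = 96 timestamps, V = 7 variates.
T, V, d_model = 96, 7, 64
rng = np.random.default_rng(0)
series = rng.standard_normal((T, V))

# Vanilla view: each timestamp becomes one token, mixing all
# V variates into a single embedding (V -> d_model).
W_time = rng.standard_normal((V, d_model))
time_tokens = series @ W_time        # (T, d_model): T time tokens

# Inverted (iTransformer) view: each variate's whole series
# becomes one token (T -> d_model), so attention over these
# V tokens models multivariate correlation directly.
W_var = rng.standard_normal((T, d_model))
variate_tokens = series.T @ W_var    # (V, d_model): V variate tokens
```

Attention over `time_tokens` compares timestamps; attention over `variate_tokens` compares series, which is the paper's "inverted perspective".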

Transformer Returns! Leading Time Series Prediction Without Module Modifications

Source: New Intelligence [Introduction] Recently, researchers from Tsinghua University and Ant Group re-examined the application of the Transformer architecture to time series analysis, proposing a completely new inverted perspective that achieves comprehensive leadership in time series forecasting tasks without modifying any modules! In recent years, Transformers have made continuous breakthroughs in natural language processing and … Read more

Breakthrough LSTM! Multivariate Data Anomaly Detection with LSTM and KNN

Hello, I am Xiao Chu~ Today's topic: using LSTM and KNN for multivariate data anomaly detection. Related Principles: Multivariate anomaly detection is a very important task, especially in complex settings such as high-dimensional data and time series. Traditional anomaly detection methods (such as those based on … Read more
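The article pairs an LSTM (temporal modeling) with KNN (anomaly scoring). As a minimal sketch of the KNN half only, assuming the inputs are already feature vectors (in the article's pipeline they would come from the LSTM), each point can be scored by its mean distance to its k nearest neighbours:

```python
import numpy as np

def knn_anomaly_scores(X, k=5):
    """Score each row of X by its mean Euclidean distance to its
    k nearest neighbours; large scores flag likely anomalies."""
    diff = X[:, None, :] - X[None, :, :]      # (n, n, d) pairwise diffs
    dist = np.sqrt((diff ** 2).sum(-1))       # (n, n) distance matrix
    np.fill_diagonal(dist, np.inf)            # ignore self-distance
    knn = np.sort(dist, axis=1)[:, :k]        # k smallest per row
    return knn.mean(axis=1)                   # (n,) anomaly scores

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))             # normal points
X[0] += 8.0                                   # inject one obvious outlier
scores = knn_anomaly_scores(X)                # index 0 gets the top score
```

Thresholding `scores` (e.g. by a quantile) then yields the anomaly labels; a production version would use `sklearn.neighbors` instead of the O(n²) distance matrix.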

DA-RNN: Recurrent Neural Network Based on a Dual-Stage Attention Mechanism

Author: Occam’s Razor Personal Blog: https://blog.csdn.net/yilulvxing Paper Link: https://arxiv.org/abs/1704.02971 GitHub Code Link: https://github.com/LeronQ/DA-RNN The paper is titled “A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction”. Essentially, the article builds on the Seq2Seq model, combined with an attention mechanism, to realize time series prediction. A major highlight of the article is that it introduces … Read more
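DA-RNN's two stages are: input attention, which weights the V driving series at each encoder step, and temporal attention, which weights the T encoder hidden states at each decoder step. A minimal numpy sketch of just the two softmax weightings, with illustrative scores and sizes standing in for the paper's learned scoring networks:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(2)
V, T, h = 5, 10, 8                        # driving series, steps, hidden dim
driver_scores = rng.standard_normal(V)    # stage-1 scores (learned in DA-RNN)
enc_states = rng.standard_normal((T, h))  # encoder hidden states
dec_query = rng.standard_normal(h)        # current decoder hidden state

# Stage 1: input attention picks out informative driving series.
alpha = softmax(driver_scores)            # (V,), sums to 1

# Stage 2: temporal attention weights encoder states for decoding.
beta = softmax(enc_states @ dec_query)    # (T,), sums to 1
context = beta @ enc_states               # (h,) weighted context vector
```

In the full model, `alpha` rescales the encoder inputs at every timestep and `context` feeds the decoder's next prediction; both scoring functions are small learned networks rather than the raw dot products used here.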

Volatility Prediction: CNN-Based Image Recognition Strategy (With Code)

Author: Chuan Bai Translated by: 1+1=6 1 Introduction Financial markets mainly deal with time series problems, and numerous algorithms and tools exist for time series forecasting. Today, we use a CNN for regression-based prediction and compare it with some traditional algorithms to see how it performs. We focus on … Read more

Lag-Llama: Probabilistic Time Series Forecasting with Foundation Models

Abstract arXiv:2310.08278v3 [cs.LG], February 8, 2024. Original paper link: https://arxiv.org/pdf/2310.08278 In recent years, foundation models have caused a paradigm shift in machine learning thanks to their unprecedented zero-shot and few-shot generalization capabilities. However, despite the success of foundation models in fields such as natural language processing and computer vision, the development of … Read more