Exploring LSTM: From Basic Concepts to Internal Structures

Compiled and organized by Annie Ruo Po | QbitAI WeChat official account. Author bio: Edwin Chen has worked on mathematics and linguistics at MIT, speech recognition at Microsoft Research, quantitative trading at Clarium, advertising at Twitter, and machine learning at Google. In this article, the author first introduces the basic concepts of neural networks, RNNs, and LSTMs, then compares … Read more
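The internal structure the article explores is the standard LSTM gate mechanism. As a minimal sketch, the scalar single-unit step below follows the textbook forget/input/candidate/output gate equations; the weights are arbitrary toy values for illustration, not anything from the article.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM time step for a scalar input and scalar state.
    `w` maps each gate name to a (w_x, w_h, b) triple -- toy values here."""
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])    # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])    # input gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate value
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])    # output gate
    c = f * c_prev + i * g   # new cell state: keep part of the old, write the new
    h = o * math.tanh(c)     # new hidden state, gated view of the cell
    return h, c

# Run a short sequence through the cell with arbitrary shared weights.
w = {k: (0.5, 0.5, 0.0) for k in "figo"}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:
    h, c = lstm_step(x, h, c, w)
print(h, c)  # h stays in (-1, 1) because the output gate scales tanh(c)
```

The point of the two-state design is visible in the `c` update: the cell state changes only additively (scaled by the gates), which is what lets gradients flow across long sequences better than in a plain RNN.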

Battery Management System Based on Neural Network Algorithm

This team consists of 6 students from the School of Electrical Engineering and the School of Science at Xi’an Jiaotong University (Wang Ziqiao, Li Dong, Zhao Hongfei, Wang Ning, Zhang Ranjie, Huang Ye), who have conducted a series of studies aimed at improving the calculation accuracy of the battery state of charge (SOC) in battery … Read more

Deep Learning for Stock Pricing

Report from the Quantitative Investment and Machine Learning WeChat public account editorial department. Unauthorized reproduction is prohibited. Speech overview: The 2021 World Artificial Intelligence Conference was held from July 8 to 10, 2021, simultaneously at the Shanghai Expo Center and the Shanghai Expo Exhibition Hall. Since its inception in 2018, the World Artificial Intelligence Conference … Read more

Complete Notes on Andrew Ng’s deeplearning.ai Courses

Source: Machine Heart. This article contains 3,744 words; estimated reading time is 8 minutes. Through this article, we explain how to build models for natural language, audio, and other sequential data. Since Andrew Ng released the deeplearning.ai courses, many learners have completed all the specialized courses and meticulously created course … Read more

Further Improvements to GPT and BERT: Language Models Using Transformers

Selected from arXiv. Authors: Chenguang Wang, Mu Li, Alexander J. Smola. Compiled by Machine Heart (participation: Panda). BERT and GPT-2 are currently the two most advanced models in the field of NLP, and both adopt a Transformer-based architecture. A recent paper from Amazon Web Services proposed several new improvements to Transformers, including architectural enhancements, leveraging prior … Read more

Quantitative Assessment of VR Cybersickness Using EEG Signals

Recently, a research team proposed a method for assessing the severity of VR cybersickness using electroencephalogram (EEG) signals, published in Displays. The study, titled “Exploring quantitative assessment of cybersickness in virtual reality using EEG signals and a CNN-ECA-LSTM network”, induced varying degrees of cybersickness through visual stimulation paradigms and collected the corresponding EEG data. It extracts … Read more

Future Frame Prediction in 2D Movie MR Images Using PCA and RNN

✅Author Bio: A Matlab simulation developer passionate about research, skilled in data processing, modeling and simulation, and program design; offers complete code, paper reproduction, and scientific simulation. 🍎Personal Homepage: Matlab Research Studio 🍊Personal Motto: Seek knowledge through inquiry; feel free to message me for help. 🔥Content Introduction Abstract: Respiratory motion in medical imaging is a critical factor … Read more

RNN Transformation Mechanism and Practical Applications

Follow Hui Kuo Technology to learn more about technology. Hello everyone, I am Liu Zenghui! Today we continue the series of lectures on artificial intelligence neural networks, focusing on the transformation mechanism and practical applications of RNNs and exploring how they are used across many fields. Transformation Mechanism of RNN: In the previous … Read more

Do RNN and LSTM Have Long-Term Memory?

This article introduces the ICML 2020 paper “Do RNN and LSTM have Long Memory?”. The paper's authors are from Huawei Noah's Ark Lab and the University of Hong Kong. Author | Noah's Ark Lab Editor | Cong Mo Paper link: https://arxiv.org/abs/2006.03860 1 Introduction: To overcome the difficulties of Recurrent Neural Networks (RNNs) in … Read more

When RNN Meets NER: Bi-LSTM, CRF, and Stack LSTM

Author: David9 Address: http://nooverfit.com/ Named Entity Recognition (NER) is an important topic in semantic understanding. NER is like object detection in the field of natural language: finding the noun entities in a document D is not enough; in many cases, we also need to know whether a noun refers to a location, a person, an organization, etc. The above figure … Read more
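In a Bi-LSTM-CRF tagger, the Bi-LSTM produces per-token scores for each entity tag and the CRF layer decodes the best tag sequence under learned transition constraints. As a minimal sketch of that decoding step, the Viterbi routine below uses hypothetical emission scores (stand-ins for Bi-LSTM outputs) and hand-set BIO transition scores; a real model learns both.

```python
# Minimal sketch of CRF Viterbi decoding over BIO tags.
# Emission and transition scores are hypothetical, not from a trained model.

TAGS = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]
NEG = -1e4  # large negative score forbids invalid BIO transitions (e.g. O -> I-PER)

# transition[i][j]: score of moving from TAGS[i] to TAGS[j].
transition = [
    # to:  O    B-PER  I-PER  B-LOC  I-LOC
    [0.0, 0.0,  NEG,   0.0,   NEG],  # from O
    [0.0, 0.0,  1.0,   0.0,   NEG],  # from B-PER
    [0.0, 0.0,  1.0,   0.0,   NEG],  # from I-PER
    [0.0, 0.0,  NEG,   0.0,   1.0],  # from B-LOC
    [0.0, 0.0,  NEG,   0.0,   1.0],  # from I-LOC
]

def viterbi_decode(emissions):
    """emissions: one list of per-tag scores per token.
    Returns the highest-scoring tag sequence under emissions + transitions."""
    n_tags = len(TAGS)
    score = list(emissions[0])  # best score of any path ending in each tag
    backptr = []
    for em in emissions[1:]:
        new_score, ptrs = [], []
        for j in range(n_tags):
            best_i = max(range(n_tags), key=lambda i: score[i] + transition[i][j])
            new_score.append(score[best_i] + transition[best_i][j] + em[j])
            ptrs.append(best_i)
        backptr.append(ptrs)
        score = new_score
    best = max(range(n_tags), key=lambda j: score[j])  # best final tag
    path = [best]
    for ptrs in reversed(backptr):  # trace back through the pointers
        best = ptrs[best]
        path.append(best)
    return [TAGS[t] for t in reversed(path)]

# "John lives in Paris": hypothetical Bi-LSTM emission scores per token.
emissions = [
    [0.1, 2.0, 0.0, 0.2, 0.0],  # John
    [2.0, 0.1, 0.3, 0.1, 0.0],  # lives
    [2.0, 0.0, 0.0, 0.1, 0.0],  # in
    [0.1, 0.2, 0.0, 2.0, 0.3],  # Paris
]
print(viterbi_decode(emissions))  # -> ['B-PER', 'O', 'O', 'B-LOC']
```

The transition matrix is what the CRF layer adds over a plain Bi-LSTM: even if a token's emission scores slightly favor an invalid tag like a dangling I-PER, the forbidden-transition penalty keeps the decoded sequence well-formed.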