DeepMind’s CoBERL Agent Enhances Data Efficiency Using LSTM and Transformer

Selected from arXiv. Authors: Andrea Banino et al. Compiled by Machine Heart. Editors: Chen Ping, Du Wei.

Researchers from DeepMind have proposed CoBERL, a reinforcement learning agent that combines a new contrastive loss with a hybrid LSTM-transformer architecture to improve data efficiency. Experiments show that CoBERL consistently improves performance across the entire Atari …
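
To make the idea of a hybrid LSTM-transformer memory concrete, here is a minimal PyTorch sketch in which a standard transformer encoder processes a window of observation embeddings and an LSTM then summarizes the result over time. This is an illustrative approximation under stated assumptions, not the paper's exact module: CoBERL uses a gated, GTrXL-style transformer whose output is combined with its input through a learned gate before the LSTM, and the contrastive loss is not shown here. The class and parameter names (HybridTransformerLSTM, d_model, lstm_hidden) are hypothetical.

```python
import torch
import torch.nn as nn


class HybridTransformerLSTM(nn.Module):
    """Sketch of a hybrid transformer-LSTM memory module.

    Assumption: a plain TransformerEncoder feeding an LSTM stands in for
    CoBERL's gated (GTrXL-style) transformer + gate + LSTM arrangement.
    """

    def __init__(self, d_model: int = 256, n_heads: int = 4,
                 n_layers: int = 2, lstm_hidden: int = 256):
        super().__init__()
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(encoder_layer,
                                                 num_layers=n_layers)
        self.lstm = nn.LSTM(d_model, lstm_hidden, batch_first=True)

    def forward(self, x, lstm_state=None):
        # x: (batch, time, d_model) per-step observation embeddings.
        z = self.transformer(x)                       # attend over the window
        out, lstm_state = self.lstm(z, lstm_state)    # recurrent summary
        return out, lstm_state


if __name__ == "__main__":
    # Toy usage: 8 trajectories, 16 timesteps, 256-dim features.
    x = torch.randn(8, 16, 256)
    model = HybridTransformerLSTM()
    out, _ = model(x)
    print(out.shape)  # torch.Size([8, 16, 256])
```

The design intent, as described in the article, is to let the transformer handle attention over recent context while the LSTM carries a compact recurrent state across long horizons.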