The renormalization group is a fundamental concept in physics. It is not only a powerful tool for studying phase transitions, critical phenomena, and strongly coupled problems; it also shapes physicists' worldview: physics is an effective theory of phenomena that emerge at different scales and energies.
In practical applications of deep learning, deep neural networks have been observed to extract features layer by layer: neurons in the deeper layers often correspond to abstract, independent emergent concepts. Understanding and creatively exploiting this property of neural networks, known as representation learning, is one of the core problems of deep learning. Cross-disciplinary research between representation learning and renormalization group theory helps reveal the working principles of deep neural networks and allows successful deep learning techniques to be applied to physical problems.
Recently, doctoral student Li Shuohui and associate researcher Wang Lei of the Key Laboratory of Condensed Matter Theory and Materials Computation, Institute of Physics, Chinese Academy of Sciences, proposed a multi-scale neural network architecture based on normalizing flows. The network maps the probability distribution of microscopic physical configurations to a nearly Gaussian distribution in latent space; conversely, because the network is invertible, it can also generate physical configurations directly from Gaussian noise. Starting from the bare action of a physical problem, the authors train the network end-to-end using the variational principle. To fully exploit the learned representation, they further propose a Hamiltonian Monte Carlo update algorithm that operates in latent space. The structural design of the network is inspired by the Multiscale Entanglement Renormalization Ansatz (MERA) used in quantum many-body physics. Mathematically, the neural network renormalization group amounts to adaptively seeking a nonlinear wavelet transform that reduces the mutual information between variables.
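To make these ideas concrete, here is a minimal, self-contained sketch in PyTorch. It is not the authors' released implementation: the double-well action, the plain alternating coupling layers (standing in for the MERA-inspired architecture), the network sizes, and the HMC step sizes are all illustrative assumptions. The sketch trains a RealNVP-style normalizing flow variationally against a bare action, then runs Hamiltonian Monte Carlo on the induced latent-space distribution, as described above:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
DIM = 4  # toy number of physical degrees of freedom (an assumption)

def action(x):
    """Toy bare action S(x): independent double wells, p(x) ~ exp(-S(x))."""
    return ((x ** 2 - 1.0) ** 2).sum(dim=-1)

class Coupling(nn.Module):
    """RealNVP-style affine coupling layer: invertible, cheap log-Jacobian."""
    def __init__(self, dim, parity):
        super().__init__()
        self.register_buffer("mask", (torch.arange(dim) % 2 == parity).float())
        self.net = nn.Sequential(nn.Linear(dim, 32), nn.Tanh(),
                                 nn.Linear(32, 2 * dim))

    def _scale_shift(self, frozen):
        # The net sees only the frozen half; s, t act only on the other half.
        s, t = self.net(frozen).chunk(2, dim=-1)
        keep = 1.0 - self.mask
        return torch.tanh(s) * keep, t * keep

    def forward(self, z):  # latent -> physical
        s, t = self._scale_shift(z * self.mask)
        return z * torch.exp(s) + t, s.sum(dim=-1)  # value, log|det J|

    def inverse(self, x):  # physical -> latent (the "reversibility" above)
        s, t = self._scale_shift(x * self.mask)
        return (x - t) * torch.exp(-s), -s.sum(dim=-1)

class Flow(nn.Module):
    """Stack of coupling layers with alternating masks."""
    def __init__(self, dim, n_layers=6):
        super().__init__()
        self.layers = nn.ModuleList(Coupling(dim, k % 2) for k in range(n_layers))

    def forward(self, z):
        logdet = torch.zeros(z.shape[0])
        for layer in self.layers:
            z, ld = layer(z)
            logdet = logdet + ld
        return z, logdet

# Variational training: E_z[S(x(z)) - log|det J|] equals, up to an additive
# constant, the KL divergence between the flow distribution and exp(-S)/Z.
flow = Flow(DIM)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
for step in range(2000):
    z = torch.randn(256, DIM)  # Gaussian latent noise
    x, logdet = flow(z)
    loss = (action(x) - logdet).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

def latent_energy(z):
    """Effective energy U(z) of the target pushed back into latent space."""
    x, logdet = flow(z)
    return action(x) - logdet

def grad_U(z):
    z = z.detach().requires_grad_(True)
    return torch.autograd.grad(latent_energy(z).sum(), z)[0]

def hmc_step(z, eps=0.05, n_leap=10):
    """One leapfrog Hamiltonian Monte Carlo update in latent space."""
    p = torch.randn_like(z)
    with torch.no_grad():
        H0 = latent_energy(z) + 0.5 * (p ** 2).sum(dim=-1)
    p = p - 0.5 * eps * grad_U(z)          # half kick
    z_new = z
    for _ in range(n_leap):
        z_new = z_new + eps * p            # drift
        p = p - eps * grad_U(z_new)        # kick
    p = p + 0.5 * eps * grad_U(z_new)      # undo the extra half kick
    with torch.no_grad():
        H1 = latent_energy(z_new) + 0.5 * (p ** 2).sum(dim=-1)
        accept = (torch.rand(z.shape[0]) < torch.exp(H0 - H1)).unsqueeze(-1)
        return torch.where(accept, z_new, z)

# Refine approximate flow samples with latent HMC, then map back to physics.
z = torch.randn(100, DIM)
for _ in range(50):
    z = hmc_step(z)
with torch.no_grad():
    samples, _ = flow(z)
```

Updating in latent space is attractive because the trained flow has already flattened the target distribution toward a Gaussian, so the leapfrog integrator moves through a far less rugged energy landscape than it would among the original physical variables.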
The normalizing flow into latent space helps to automatically identify collective variables and effective theories, and is expected to find applications in statistical physics, field theory, and first-principles molecular dynamics. Moreover, the reversible renormalization flow fits naturally with a modern development of renormalization group theory: information-preserving holographic renormalization. The neural network renormalization group therefore also provides a new avenue for studying the principle of holographic duality. This work was recently published in Physical Review Letters [Phys. Rev. Lett. 121, 260601 (2018)].
This work was supported by the Ministry of Science and Technology (2016YFA0300603) and the National Natural Science Foundation of China (11774398). For a deeper look at this work, see Wang Lei's talk at the American Physical Society editorial meeting "Physics Next: Machine Learning" and the open-source implementation released by the authors.
Figure: Normalizing flows achieve a reversible mapping between physical variables and latent variables.
Editor: Cloudiiink