Progress on Neural Network Canonical Transformations

Canonical transformations are a classical method used by physicists, mechanical engineers, and astronomers to analyze Hamiltonian systems. By finding suitable changes of variables, a canonical transformation can simplify, or even completely solve, the dynamics of a Hamiltonian system. For instance, in the 19th century the French astronomer Charles Delaunay published roughly 1800 pages of analytical derivations in an attempt to simplify the Sun-Earth-Moon three-body problem using canonical transformations. Although the method is a fundamental tool of Hamiltonian mechanics, its broader application to more complex many-body problems has been limited by the intricate manual manipulations and analytical calculations it requires.
The close relationship between canonical transformations and the normalizing-flow method in modern machine learning motivated the neural canonical transformation method, proposed by PhD students Li Shuohui and Dong Chenxiao and researcher Wang Lei of the Key Laboratory of Condensed Matter Theory and Materials Computation at the Institute of Physics, Chinese Academy of Sciences, in collaboration with Dr. Zhang Linfeng of Princeton University. The method uses neural networks to realize flexible, learnable canonical transformations. By learning from a Hamiltonian function or from phase-space data, it finds a suitable transformation that maps the problem onto nearly independent motion modes with distinct characteristic frequencies.
In machine learning, normalizing flows implement transformations of data through invertible deep neural networks and are widely used in real-world tasks such as speech and image synthesis. In essence, a normalizing flow uses a change of variables to map the complex probability distribution of real data onto a simple base distribution, such as a Gaussian. A canonical transformation in physics is also a form of normalizing flow. Unlike typical machine-learning applications, however, a canonical transformation operates in phase space, which contains both coordinates and momenta. Moreover, to preserve the form and physical meaning of Hamilton's equations before and after the transformation, the transformation itself must satisfy the symplectic condition. Constructing neural networks that respect these fundamental mathematical properties of Hamiltonian systems is a key point of this work.
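To make the symplectic condition concrete, here is a minimal sketch (not the authors' implementation): a momentum-shear map q' = q, p' = p − ∇V(q), a building block in the spirit of flow coupling layers, where the "potential" V is an arbitrary stand-in for a learnable function of q. The code numerically verifies that the layer's Jacobian M satisfies the symplectic condition MᵀJM = J.

```python
import numpy as np

# Stand-in scalar function playing the role of a learnable "potential".
# (In a real flow this would be a neural network taking q alone.)
def grad_V(q):
    # Gradient of V(q) = sum(sin(q)^2) + 0.5*|q|^2
    return np.sin(2 * q) + q

def shear_layer(z, n):
    """Momentum shear: q' = q, p' = p - grad V(q). Exactly symplectic."""
    q, p = z[:n], z[n:]
    return np.concatenate([q, p - grad_V(q)])

n = 3  # degrees of freedom; phase space is 2n-dimensional
rng = np.random.default_rng(0)
z = rng.normal(size=2 * n)

# Jacobian M of the layer, by central finite differences.
eps = 1e-5
M = np.column_stack([
    (shear_layer(z + eps * e, n) - shear_layer(z - eps * e, n)) / (2 * eps)
    for e in np.eye(2 * n)
])

# Symplectic form J, and the symplectic condition M^T J M = J.
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n),       np.zeros((n, n))]])
print(np.allclose(M.T @ J @ M, J, atol=1e-6))  # → True
```

Because q is left unchanged and the momentum update depends only on q, the Jacobian is a unit lower-triangular block matrix, which satisfies the symplectic condition exactly; alternating such shears in q and p and composing them yields a deep, fully symplectic flow.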
A direct application of neural network canonical transformations is to extract nearly independent nonlinear modes in many-body problems. This helps identify low-frequency collective modes that play a crucial role in molecular dynamics and dynamic control problems. For example, the authors used neural network canonical transformations to analyze molecular dynamics simulation data of alanine dipeptide, thereby extracting the motion modes corresponding to the transitions between molecular conformations. The authors also applied neural network canonical transformations to machine learning problems, achieving latent variable extraction and conceptual compression for the MNIST dataset.
This work was supported by the Ministry of Science and Technology (2016YFA0300603) and the National Natural Science Foundation of China (11774398).
Related research results were published in Phys. Rev. X; paper link:
https://journals.aps.org/prx/abstract/10.1103/PhysRevX.10.021020
Open-source code implementation:
https://github.com/li012589/neuralCT


Editor: Kun

