Harnessing one of the world's most powerful supercomputers, scientists have achieved the most accurate simulation to date of objects consisting of tens of millions of atoms, thanks to the integration of artificial intelligence (AI) techniques. Previous simulations of how atoms behave and interact were limited to small molecules because of the immense computational power required. Methods that can handle larger numbers of atoms over time do exist, but they rely heavily on approximations and fail to capture fine molecular detail.

A team led by Boris Kozinsky at Harvard University has developed a tool named Allegro, which leverages AI to perform precise simulations of systems containing tens of millions of atoms. To demonstrate the capabilities of their approach, Kozinsky and his team employed Perlmutter, the world's eighth most powerful supercomputer, to simulate the complex interplay of 44 million atoms constituting the protein shell of HIV. They also successfully simulated other important biological structures, including cellulose, a protein associated with haemophilia, and a widespread tobacco plant virus.

Kozinsky emphasizes that this methodology can accurately simulate any atom-based object with exceptional precision and scalability. The system’s applications extend beyond biology and can be applied to a wide array of materials science problems, including investigations into batteries, catalysis, and semiconductors.

To tackle simulations involving such vast numbers of particles, the researchers turned to a particular type of AI known as a neural network. By building the physical symmetries of atomic systems, such as rotations of the system as a whole, directly into the network's design (a property called equivariance), the team achieved significant improvements in accuracy and in other crucial properties, such as simulation stability and the speed at which the model learns as more data is introduced.
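The symmetry idea can be illustrated with a toy example: an interatomic energy that depends only on pairwise distances is unchanged (invariant) when the whole system is rotated, while the forces derived from it rotate along with the atoms (equivariant). Below is a minimal sketch in Python with NumPy, using a simple hypothetical harmonic pair energy rather than Allegro's actual learned potential:

```python
import numpy as np

def pair_energy(positions):
    # Toy energy depending only on interatomic distances:
    # a harmonic term per atom pair (illustrative, not Allegro's model).
    e = 0.0
    n = positions.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            e += (r - 1.5) ** 2
    return e

def forces(positions, h=1e-5):
    # Forces as the negative numerical gradient of the energy
    # (central finite differences).
    f = np.zeros_like(positions)
    for i in range(positions.shape[0]):
        for k in range(3):
            plus = positions.copy();  plus[i, k] += h
            minus = positions.copy(); minus[i, k] -= h
            f[i, k] = -(pair_energy(plus) - pair_energy(minus)) / (2 * h)
    return f

rng = np.random.default_rng(0)
atoms = rng.normal(size=(5, 3)) * 2.0

# Random orthogonal (rotation/reflection) matrix via QR decomposition.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
rotated = atoms @ Q.T  # rotate every atom position by Q

# Invariance: rotating the whole system leaves the energy unchanged.
assert np.isclose(pair_energy(atoms), pair_energy(rotated))

# Equivariance: forces on the rotated system equal the rotated forces.
assert np.allclose(forces(rotated), forces(atoms) @ Q.T, atol=1e-6)
```

An equivariant network guarantees these relationships by construction rather than having to learn them from data, which is one reason such models train faster and behave more stably in long simulations.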

Albert Musaelian, a member of the research team, explains, “When you develop networks that inherently incorporate these symmetries… you observe substantial enhancements in accuracy and other properties that are crucial to us, such as simulation stability or the rate at which the machine learning model learns as it is trained with more data.” Gábor Csányi from the University of Cambridge describes the achievement as a remarkable feat in programming, demonstrating the scalability of machine-learned potentials.

While simulating biological molecules on such a large scale serves as a proof-of-concept for the tool’s capabilities, Csányi notes that it does not offer substantial practical benefits for biochemists who already possess faster and highly accurate simulation tools. However, he highlights the potential value of the approach in studying materials consisting of numerous atoms subjected to extreme forces and shocks within short timescales, such as planetary cores.

By Impact Lab