Physics and Chemistry Underpinning AI-Related Nobel Prizes

| By Gale Staff |

The Royal Swedish Academy of Sciences awarded the 2024 Nobel Prizes in Physics and Chemistry for advances related to the development and use of artificial intelligence (AI).

Geoffrey E. Hinton, a British-Canadian professor at the University of Toronto, shared the physics prize with Princeton professor John J. Hopfield for, in the words of the Nobel Committee, “foundational discoveries and inventions that enable machine learning with artificial neural networks.” For in-depth coverage of AI, see the topic page on Artificial Intelligence in Gale In Context: Science.

Drawing heavily on statistical techniques, Hinton invented methods that allow a machine to independently discover properties in data. Hinton, who also won the Turing Award in 2018, has become a figurehead for what is termed “doomerism,” a fear that artificial intelligence could create or facilitate catastrophic events, including human extinction. In speeches and interviews, Hinton has expressed fears that large language models could become smarter than their designers and develop behaviors contrary to the interests of humanity. Other scientists, including fellow AI scientist and Turing Award recipient Yann LeCun, dismiss Hinton’s fears.

Hopfield is the inventor of the Hopfield network, a neural network that works as an associative memory: it stores patterns in data and can reconstruct them from partial or distorted input. Hopfield networks can also be used to clean noisy data, fill holes in data, and recreate partially erased data.

The work of both Hinton and Hopfield advanced machine learning, which uses artificial neural networks to mimic functions such as memory and learning. Machine learning differs from traditional step-by-step computer programming: the system learns by example, enabling it to tackle complex problems whose inputs and outcomes are less well defined.

When an AI system is trained, the process mimics learning in the human brain’s neurons and synapses. An artificial neural network processes information while adjusting the strength of the connections between its nodes, simulating the strengthening or weakening of the synapses that connect biological neurons.
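To make the analogy concrete, here is a minimal sketch (an illustration only, not the laureates’ own models): a single artificial node sums its weighted inputs and “fires” when the sum crosses a threshold, and a simple Hebbian-style rule strengthens the connections whose input and output are active together.

```python
# Illustrative sketch of one artificial node: a weighted sum compared
# against a firing threshold, plus a simple Hebbian-style update that
# strengthens connections whose input and output are active together.
import numpy as np

def node_output(weights, inputs, threshold=0.5):
    """Return 1 ("fires") if the weighted sum of inputs reaches the threshold."""
    return 1 if np.dot(weights, inputs) >= threshold else 0

def strengthen(weights, inputs, output, rate=0.1):
    """Nudge each connection upward when its input and the output are both active."""
    return weights + rate * output * inputs

weights = np.array([0.3, 0.2, 0.3])
inputs = np.array([1, 0, 1])          # first and third inputs are active
out = node_output(weights, inputs)    # 0.6 >= 0.5, so the node fires
print(out, strengthen(weights, inputs, out))  # active connections grow; the idle one does not
```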

For example, a simple Hopfield network consists of connected nodes, each of which stores a value (e.g., 0 or 1), joined by connections of varying strength. Hopfield described the overall state of the system with an energy formula borrowed from the physics of spin systems, one that incorporates not only the value of each node but also the strength of the connections among nodes. In essence, pattern matching is based on finding the system’s lowest overall energy state rather than on node-by-node comparisons.
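As a rough sketch of the idea (an illustration using bipolar −1/+1 node values and a Hebbian storage rule, not the laureates’ own code), a few lines of Python can store a pattern, measure the spin-style energy, and recover the pattern from a corrupted copy:

```python
# Minimal sketch of a Hopfield-style associative memory, assuming
# bipolar (-1/+1) node values and a simple Hebbian storage rule.
import numpy as np

def store(patterns):
    """Build the weight matrix from the stored patterns (Hebbian rule)."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)          # no self-connections
    return w / len(patterns)

def energy(w, state):
    """Spin-glass-style energy: lower energy means a better pattern match."""
    return -0.5 * state @ w @ state

def recall(w, state, steps=20):
    """Repeatedly flip nodes toward lower energy to reconstruct a pattern."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if w[i] @ state >= 0 else -1
    return state

# Store one 8-node pattern, then recover it from a corrupted copy.
stored = np.array([[1, -1, 1, 1, -1, -1, 1, -1]])
w = store(stored)
noisy = stored[0].copy()
noisy[:2] *= -1                     # flip two nodes to simulate noise
recovered = recall(w, noisy)
print(recovered)                    # should match the stored pattern
print(energy(w, recovered))         # low energy marks a good match
```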

Hinton expanded on the capacities of the Hopfield network and, borrowing heavily from statistical physics, constructed what he termed a Boltzmann machine, built around an equation developed by the nineteenth-century physicist Ludwig Boltzmann. Hinton’s machine learned not from step-by-step programmed instruction but from training examples. By looking for similarities in data, a Boltzmann machine can recognize characteristic elements, classify new information, and integrate it with similar patterns already in its memory.
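In rough terms (a standard textbook formulation, not the prize citation), the machine assigns every possible network state an energy and a probability that falls off exponentially with that energy, in the spirit of Boltzmann’s distribution:

```latex
% Sketch of a Boltzmann machine's energy and state probability, with
% weights w_{ij}, biases \theta_i, binary node states s_i, and a
% normalizing sum Z over all states; the notation here is an assumption.
E(\mathbf{s}) = -\sum_{i<j} w_{ij}\, s_i s_j - \sum_i \theta_i s_i,
\qquad
P(\mathbf{s}) = \frac{e^{-E(\mathbf{s})}}{Z},
\qquad
Z = \sum_{\mathbf{s}'} e^{-E(\mathbf{s}')}
```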

In chemistry, Demis Hassabis and John M. Jumper shared half of the 2024 Nobel Prize for developing AI networks capable of predicting the three-dimensional structures of proteins, solving the complex protein-folding problem. David Baker was awarded the other half of the prize for creating new proteins using AI programs.

Hassabis is CEO and cofounder of Google DeepMind, where Jumper serves as a director. Baker is a professor of biochemistry at the University of Washington.

Because of the relationship of structure to function, the ability to predict protein structure is a powerful tool for molecular biology, biophysics, and pharmacological and biomedical research.

Hassabis and Jumper’s AlphaFold, first unveiled in 2020, can predict the three-dimensional structure of a protein from the protein’s underlying sequence of amino acids, the building blocks of proteins. The impact of AlphaFold (updated to AlphaFold 3 in 2024) was amplified by the release of its source code and an accompanying database of predicted structures, which other scientists are free to copy, use, and adapt.
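As a minimal sketch of how researchers tap into that resource, the snippet below fetches one predicted structure, assuming the AlphaFold Protein Structure Database’s public prediction endpoint (https://alphafold.ebi.ac.uk/api/prediction/&lt;UniProt accession&gt;) and its JSON field names are unchanged; the accession P69905 (human hemoglobin alpha) is just an example.

```python
# Sketch of retrieving an AlphaFold-predicted structure; the endpoint,
# the "pdbUrl" field name, and the example accession are assumptions
# based on the database's public documentation and may have changed.
import requests

ACCESSION = "P69905"  # example UniProt accession (human hemoglobin alpha)
API_URL = f"https://alphafold.ebi.ac.uk/api/prediction/{ACCESSION}"

response = requests.get(API_URL, timeout=30)
response.raise_for_status()
entry = response.json()[0]                     # one entry per accession

# Download the predicted 3D coordinates as a PDB file.
pdb = requests.get(entry["pdbUrl"], timeout=30)
pdb.raise_for_status()
with open(f"AF-{ACCESSION}.pdb", "wb") as f:
    f.write(pdb.content)

print(f"Saved predicted structure for {ACCESSION} to AF-{ACCESSION}.pdb")
```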

Baker also produced an open-source AI tool called ProteinMPNN, first released in 2022, alongside other AI-based predictive tools. Baker’s Rosetta tools include programs designed to customize and reverse engineer proteins, working backward from a desired three-dimensional structure to the amino acid sequence needed to produce it. Such customized proteins may allow scientists to cure protein deficiencies and to eliminate undesirable proteins with target-specific killer proteins.

Nobel Prizes are often awarded for seminal discoveries made decades earlier; that these recently developed protein-predictive AI tools were recognized so quickly underscores their sweeping power and range of application. Protein structures that might have taken millions of years to arise through evolutionary mechanisms can now be deduced and created in hours or days. That compression of time can be essential in, for example, the development of new vaccines or treatments for diseases caused by novel viruses, and it dramatically reduces the development time for other protein-based structures.
