“These artificial neural networks have been used to advance research across physics topics as diverse as particle physics, materials science, and astrophysics,” Ellen Moons, chair of the Nobel Committee for Physics, said at a press conference this morning.
The landmark Deep Belief Networks (stacked RBMs) paper appeared in Science in 2006 [1]. DBNs became obsolete quite quickly, but that doesn't diminish the immense influence of this line of research: the paper has over 23k citations, was cited by the Nobel committee, and was my introduction to deep learning, for one.
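For anyone who never worked with them: the building block of a DBN is the RBM, trained layer by layer with contrastive divergence. A minimal sketch of CD-1 training on toy binary data (all sizes, learning rate, and data here are illustrative, not from the paper):

```python
import numpy as np

# Minimal Restricted Boltzmann Machine trained with one-step
# contrastive divergence (CD-1). Toy dimensions and data for
# illustration only.
rng = np.random.default_rng(0)

n_visible, n_hidden = 6, 3
W = rng.normal(0, 0.1, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy binary data: two repeated patterns.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]] * 10, dtype=float)

lr = 0.1
for epoch in range(200):
    for v0 in data:
        # Positive phase: infer hidden units from the data.
        p_h0 = sigmoid(v0 @ W + b_h)
        h0 = (rng.random(n_hidden) < p_h0).astype(float)
        # Negative phase: one Gibbs step back to a reconstruction.
        p_v1 = sigmoid(h0 @ W.T + b_v)
        v1 = (rng.random(n_visible) < p_v1).astype(float)
        p_h1 = sigmoid(v1 @ W + b_h)
        # CD-1 update: data statistics minus reconstruction statistics.
        W += lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
        b_v += lr * (v0 - v1)
        b_h += lr * (p_h0 - p_h1)

# After training, reconstructions should be close to the data.
recon = sigmoid(sigmoid(data @ W + b_h) @ W.T + b_v)
err = np.mean((data - recon) ** 2)
```

A DBN then stacks several such RBMs, using each trained layer's hidden activations as the next layer's input; the 2006 paper used that greedy stack to initialize a deep autoencoder.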
You're completely incorrect to say RBMs were of theoretical interest only. They have had plenty of practical use in computer vision/image modelling up to at least a few years ago (I haven't followed them since). Remember the first generative models of human faces?
Edit: Wow, Hinton is still pushing forward the state of the art on RBMs for image modelling, and I am impressed with how much they've improved in the last ~5 years. Nowhere near diffusion models, sure, but "reasonably good". [2]
[1] G. E. Hinton and R. R. Salakhutdinov, "Reducing the Dimensionality of Data with Neural Networks," Science, 2006.