The actual process of computation, sure, but machine learning itself grew out of physics-based methods developed to understand complexity and disorder.
I think ML depended on math/statistics/computing much more than physics. It's much easier to see what Hopfield and Hinton did as inspired by mathematical models that were created to help study statistical mechanics. Which is fine and just shows how stochastic scientific discovery is.
Parisi won in 2021, not last year. His work was more about establishing spin glasses as a way to study complex systems. Hopfield definitely built on that, showing how those ideas could be applied to neural networks and info storage in state-space machines.
As for focusing on Hopfield networks and Boltzmann machines, I get where you're coming from. They’re just a couple of architectures among many, but they’re pretty foundational. They’re deeply rooted in statistical mechanics and have had a huge impact, finding applications across a range of fields beyond just machine learning.
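Since Hopfield networks came up, here's a minimal sketch of what "info storage in state-space machines" means in practice: patterns are stored in a weight matrix via the Hebbian rule, and recall is just letting the state dynamics settle into the nearest stored attractor. This is a hypothetical toy example, not anything from Hopfield's papers:

```python
import numpy as np

def train(patterns):
    """Hebbian storage: W = (1/N) * sum_p x_p x_p^T, with zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=10):
    """Synchronously update s <- sign(W s) until a fixed point is reached."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

if __name__ == "__main__":
    # Two stored +/-1 patterns in an 8-neuron network.
    patterns = np.array([[1, 1, 1, 1, -1, -1, -1, -1],
                         [1, -1, 1, -1, 1, -1, 1, -1]])
    W = train(patterns)
    # Corrupt the first pattern by flipping one bit, then let the network recall it.
    cue = patterns[0].copy()
    cue[0] *= -1
    print(recall(W, cue))  # converges back to the first stored pattern
```

The statistical-mechanics connection is that each update can only lower an energy function over the state space, so stored patterns sit at energy minima, exactly the spin-glass picture Parisi's line of work formalized.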