Hacker News

As a physicist, my reaction to this is: how bizarre is that? Maybe he deserves a Nobel Prize, but in physics?

Also, arguing that NNs are used in physics, so a Nobel Prize is justified, is like asking for Stephen Wolfram to be awarded a Nobel Prize for Mathematica, which is much more widely used in physics as a tool. And he is a physicist and made contributions to the field of numerical relativity (the reason he created Mathematica in the first place).

The royal science academy fucked up so much with this choice.



By this definition Claude Shannon (the father of information theory) clearly deserves a Nobel in physics. The central concept in information theory is entropy, which is defined in literally the same way as in physics. And Shannon's information theory clearly revolutionized our lives (telecommunications) much more than the Hopfield network or Hinton's Boltzmann machine.
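To make that parallel concrete, here is a minimal sketch of Shannon entropy in Python; the Gibbs entropy of statistical mechanics has the same functional form, differing only by the Boltzmann constant and the base of the logarithm:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits.

    The Gibbs entropy in physics is S = -k_B * sum(p_i * ln(p_i)):
    the same sum over probabilities, up to a constant and log base.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))  # 1.0
```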


Fun historical fact: Claude Shannon did win the Noble prize

https://en.wikipedia.org/wiki/Alfred_Noble_Prize#Recipients


In 1939 Claude Shannon won the "wrong" Nobel prize -- the Alfred Noble Prize, presented by the American Society of Civil Engineers [0]. It causes a lot of confusion. Claude Shannon never won a "real" Nobel.

[0] https://en.wikipedia.org/wiki/Alfred_Noble_Prize

EDIT: typos


That's a different, local prize by the American Society of Civil Engineers.

https://en.wikipedia.org/wiki/Alfred_Noble_Prize


Yes. That's the joke.


[party pooping] It would have been better delivery if you'd said "a Nobel prize" instead of "the".


Noble != Nobel


Ah my bad.


Still more of a Nobel Prize than the one for Economics.


It feels a bit like the field of physics claiming the invention of AI, where we all know that mathematics and/or CS deserve the honor.


Someone changed the Wikipedia article today to call Hopfield a "physicist". Previously the article called him simply a scientist, because his main work wasn't limited to physics. I changed it back now, let's see if it holds up.


It’s ‘physicist’ again now.


User "ReyHahn" has changed it back to physicist. Justification: "he defines himself as physicist but he has worked i mnay fields"

https://en.wikipedia.org/w/index.php?title=John_Hopfield&dif...


To use the local lingo…

[citation needed]

I suppose some might argue that being awarded the Nobel Prize in Physics is enough to call yourself a physicist.

…it does have the unfortunate implication, however, that nominations need not be restricted to physicists at all since any winner becomes a physicist upon receipt of the prize.

It’s sort of like the No True Scotsman but inverted, and with physicists instead of Scotsmen.


The Nobel Committee doesn’t represent the field of physics. I talked to a few former colleagues (theoretical physicists) just now and every one of them found this bizarre.


I think the Nobel prize doesn't want any scientific advance to fall outside the range of awards entirely.


>where we all know that mathematics and/or CS deserve the honor

Or semiconductor manufacturers.

All the math and CS needed for AI can fit on a napkin, and has been known for 200+ years. It's the extreme scaling enabled by semiconductor science that really makes the difference.


That's absurd. The computer science needed for AI has not been known for 200 years. For example, transformers were only invented in 2017, diffusion models in 2015.

(When the required math was invented is a different question, but I doubt all of it was known 200 years ago.)


TBF, backpropagation was introduced only in the 1970s, although in hindsight it's quite a trivial application of the chain rule.

There were also plenty of "hacks" needed to make the networks scale, such as dropout regularization, batch normalization, semi-linear activation functions (e.g. ReLU), and adaptive stochastic gradient descent methods.

The maths for basic NNs is really simple, but the practice of them is really messy.
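To illustrate the chain-rule point, here is a minimal sketch of backpropagation through a toy scalar "network" y = w2 * relu(w1 * x) with squared-error loss. This is a hand-rolled illustration of the principle, not anyone's actual implementation:

```python
def relu(z):
    return max(z, 0.0)

def forward_backward(x, t, w1, w2):
    # Forward pass.
    h = w1 * x
    a = relu(h)
    y = w2 * a
    L = (y - t) ** 2

    # Backward pass: nothing but the chain rule, applied layer by layer.
    dL_dy = 2 * (y - t)
    dL_dw2 = dL_dy * a                        # dy/dw2 = a
    dL_da = dL_dy * w2                        # dy/da  = w2
    dL_dh = dL_da * (1.0 if h > 0 else 0.0)   # relu'(h)
    dL_dw1 = dL_dh * x                        # dh/dw1 = x
    return L, dL_dw1, dL_dw2

L, g1, g2 = forward_backward(x=1.0, t=0.0, w1=2.0, w2=0.5)
print(L, g1, g2)  # 1.0 1.0 4.0
```

The "messy practice" the comment mentions (dropout, batch norm, adaptive optimizers) is all scaffolding around this same core computation.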


Residual connections are also worth mentioning as an extremely ubiquitous adaptation; one will be hard-pressed to find a modern architecture that doesn't use them at least to some extent, to the point where the original ResNet paper sits at over 200k citations according to Google Scholar [1].

[1] https://scholar.google.com/citations?view_op=view_citation&h...
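The residual idea itself is tiny: a block outputs its input plus a learned transformation of it, so gradients always have an identity path to flow through. A minimal sketch, where `f` is a hypothetical stand-in for any learned layer:

```python
def residual_block(x, f):
    """Compute y = x + f(x) elementwise: the core of a residual connection."""
    return [xi + fi for xi, fi in zip(x, f(x))]

# 'scaled' stands in for a learned transformation; even if it shrinks
# toward zero, the identity path still carries the signal through.
scaled = lambda x: [0.1 * xi for xi in x]
print(residual_block([1.0, 2.0], scaled))  # [1.1, 2.2]
```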


Highway nets introduced them in the 90s


> All the math and CS needed for AI can fit on a napkin, and had been known for 200+ years.

This isn't really true. If you read a physics textbook from the early 1900s, they didn't really have multivariate calculus and linear algebra expressed as concisely as we do now. It would take several napkins. Plus, statistical mechanics was quite rudimentary, which is important for probability theory.


> If you read a physics textbook from the early 1900s, they didn't really have multivariate calculus and linear algebra expressed as concisely as we do now.

This is completely incorrect.


I don’t think calculus existed at sufficient rigor 200 years ago.

Computer science wasn’t even a thing 100 years ago.


Calculus has been around for quite some time.

If Newton had the machinery to fit large models to data, he would have done so. No doubt.


Cauchy’s main work was under 200 years ago; and there’s been quite a lot of work since.

Again, I’m unsure that calculus existed at sufficient level 200 years ago — it didn’t appear in modern form from either Leibniz or Newton.


Right; the Nobel Committee has officially "jumped the shark".

Reminds me of the old classic Physics and Politics by Walter Bagehot.


> The royal science academy fucked up so much with this choice.

Well, you’re talking about it.


> Maybe he deserves a nobel prize but in physics?

Which category fits better?



