More to do with neuroscience than you think. Fukushima took direct inspiration from Hubel & Wiesel's Nobel Prize-winning work of the 1960s when developing the neocognitron, which turned into convolutional neural networks. Hopfield networks are a model of associative memory. And, well, then there is the perceptron. There was always a link and mutual inspiration.
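For the curious, here's a minimal sketch of what "associative memory" means for a Hopfield network (my own toy example, not taken from any particular paper): store one pattern with the Hebbian outer-product rule, corrupt some of its bits, and let the dynamics settle back onto the stored memory.

    import numpy as np

    rng = np.random.default_rng(0)

    # Store one bipolar (+1/-1) pattern with the Hebbian outer-product rule
    pattern = rng.choice([-1, 1], size=64)
    W = np.outer(pattern, pattern).astype(float)
    np.fill_diagonal(W, 0.0)          # no self-connections

    # Corrupt a quarter of the bits...
    state = pattern.copy()
    flip = rng.choice(64, size=16, replace=False)
    state[flip] *= -1

    # ...then let the network settle under the update rule sign(W @ state)
    for _ in range(10):
        state = np.sign(W @ state).astype(int)

    print(np.array_equal(state, pattern))  # True: the stored memory is recalled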
They're not identical, but they are related. There's a series of approximations and simplifications you can go through to get from biological neurons to neural nets. Essentially, the activations in the neural net end up corresponding to steady-state firing rates of populations of spiking neurons, and the weights to synaptic strengths. See for example Chapter 7 of Dayan & Abbott's Theoretical Neuroscience.
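A toy illustration of that correspondence (my sketch, not Dayan & Abbott's derivation): a leaky firing-rate unit tau dv/dt = -v + F(w . u) relaxes to the steady state v = F(w . u), which is exactly what a feedforward ANN unit computes.

    import numpy as np

    def F(x):
        # Static nonlinearity, playing the role of an ANN activation (ReLU here)
        return np.maximum(x, 0.0)

    tau, dt = 10.0, 0.1              # time constant and step, arbitrary units
    w = np.array([0.5, -0.3, 0.8])   # synaptic weights
    u = np.array([1.0, 2.0, 0.5])    # presynaptic firing rates (inputs)

    # Integrate tau * dv/dt = -v + F(w . u) until the rate settles
    v = 0.0
    for _ in range(2000):
        v += (dt / tau) * (-v + F(w @ u))

    print(v, F(w @ u))  # the steady-state rate matches the ANN unit's output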
Discussing the right levels of abstraction is a huge thing in computational biology. At what level is 'the algorithm' of natural computation implemented?
Except that the development of deep neural networks took direct inspiration from biological neuroscience, with its neurons and synapses. "Neural" is even in the name. https://en.wikipedia.org/wiki/Deep_learning
DL did not take 'direct inspiration' from neuroscience. Maybe some ideas were borrowed, such as the integrate-and-fire nature of neurons and Hebb's very vague rule, but those are very old ideas. Most neuroscience research in recent decades has been in molecular biology, and particularly in the study of neural diseases (that's where all the funding goes). Learning and biological plasticity are notoriously complex and difficult to study, still very much undeciphered, and none of that plasticity research has made its way into ANN training.
In fact it is the reverse: the recent success of deep learning has sparked a race in neuroscience to find processes in the nervous system that might mimic deep learning, and in particular to build biologically plausible models of how the brain might implement gradient descent or, more generally, credit assignment.
People always repeat these stupid things like they're lore. Ok let's suppose this is true. What else is true is that neurology itself was inspired by phrenology and the practice of exorcisms. Should we now start recognizing and exalting those connections given how divorced modern (useful!) neurology is?
Hinton's most recent paper, on the forward-forward algorithm, explicitly acknowledges Peter Dayan for his feedback on the paper and cites a paper they co-wrote back in the 90s. Dayan is the author of the canonical textbook on theoretical neuroscience.
Major distinction, given that those practices have been abandoned as pseudoscience or even worse, so they aren't fields of science still being developed in which further useful connections might be found.
In psychiatry, we do continue to study social standards of normalcy in other (including historical) societies, partly to determine what should count as a mental disorder, but more to make sure we aren't doing the 21st-century equivalent of labeling something demon possession because it contrasts with our current deeply held social norms.
So what is the meaning of "to do with" and "nothing to do with"? Inspiration seems to be a relationship.
Consider a different relationship, between cellular biology and the Cells at Work anime. Clearly the relationship is unidirectional: cellular biology learns nothing from the anime, but the anime wouldn't exist without cellular biology.
Do we say the show has nothing to do with cellular biology? That doesn't seem right to me, given it depends upon it despite taking an amazing degree of artistic freedom.
The downvotes are very unusual, to say the least. All the historical material on the subject unambiguously points to neural networks emerging from work done to formalize actual brain neurons, starting with McCulloch & Pitts in 1943. That formalism turned out not to be a great way to explain biological brains, but the abstraction it provided proved highly effective for tasks like pattern recognition, classification, and decision making.
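As a concrete example of that formalism (a sketch of the textbook version, not their original notation): a McCulloch-Pitts unit fires iff a weighted sum of its inputs reaches a threshold, which is already enough to compute logic functions.

    # A McCulloch-Pitts unit: output 1 iff the weighted input sum reaches the threshold
    def mp_neuron(inputs, weights, threshold):
        return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

    # With weights (1, 1): threshold 2 gives AND, threshold 1 gives OR
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, mp_neuron((a, b), (1, 1), 2), mp_neuron((a, b), (1, 1), 1))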
So much of computer science has been inspired by other fields such as biology: polymorphism and object-oriented programming, reification, neural networks and in particular convolutional neural networks, genetic algorithms...
If anything, it teaches the value in learning a topic and then applying it directly within computer science. The strength of computer science lies in its ability to adapt and incorporate ideas from other domains to push the boundaries of technology.
There are a lot of downvotes going around because a large contingent thinks the Nobel Prize in "Physics" should not go to something involving computer science, and that awarding it as it was, was an error.
Seemingly because, even if the math or algorithms came from a physicist solving physics problems, it didn't involve some theoretical particles, so it isn't physics-y enough to get a Nobel in Physics.
At the very least, neuroscience provides an "existence proof": somehow this stuff must be possible using some sort of trained machine comprising a large number of simple components...