PaulHoule on June 22, 2023 | on: Any Deep ReLU Network Is Shallow
It doesn’t surprise me. It’s been known for a long time that a 3-layer network (one hidden layer) with logistic activation can approximate arbitrary continuous functions.
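For readers unfamiliar with ML, here is a minimal sketch of the kind of network meant here (not taken from the thread or the paper linked below): a single hidden layer of logistic units with a linear output, trained by plain gradient descent to fit sin(x). The hidden width, learning rate, and step count are arbitrary choices for illustration.

    # Toy single-hidden-layer network with logistic activation (NumPy only).
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Target: approximate sin(x) on [-pi, pi].
    x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
    y = np.sin(x)

    H = 20                              # hidden units (illustrative choice)
    W1 = rng.normal(size=(1, H))        # input -> hidden weights
    b1 = np.zeros(H)
    W2 = rng.normal(size=(H, 1)) * 0.1  # hidden -> output weights
    b2 = np.zeros(1)

    lr = 0.05
    for step in range(20000):
        # Forward pass: one layer of logistic units, then a linear readout.
        h = sigmoid(x @ W1 + b1)        # shape (200, H)
        pred = h @ W2 + b2              # shape (200, 1)

        # Mean-squared error and its gradients via backpropagation.
        err = pred - y
        loss = np.mean(err ** 2)
        g_pred = 2 * err / len(x)
        g_W2 = h.T @ g_pred
        g_b2 = g_pred.sum(axis=0)
        g_h = g_pred @ W2.T
        g_z = g_h * h * (1 - h)         # derivative of the logistic function
        g_W1 = x.T @ g_z
        g_b1 = g_z.sum(axis=0)

        # Gradient descent update (in place).
        for p, g in ((W1, g_W1), (b1, g_b1), (W2, g_W2), (b2, g_b2)):
            p -= lr * g

    print(f"final MSE: {loss:.5f}")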
RobbieGM on June 22, 2023
Do you have any further information or a source for this? As someone unfamiliar with ML, this sounds crazy to me.
hazrmard on June 22, 2023
It was first shown by Cybenko (1989) in "Approximation by superpositions of a sigmoidal function" [1].
[1]: https://link.springer.com/content/pdf/10.1007/BF02551274.pdf
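For reference, the result in that paper states that finite sums of the form

    G(x) = \sum_{j=1}^{N} \alpha_j \, \sigma\left(y_j^{\top} x + \theta_j\right)

are dense in C(I_n), the continuous functions on the n-dimensional unit cube, for any continuous sigmoidal function σ. In network terms, that is exactly one hidden layer of sigmoidal units feeding a linear output.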