WithinReason on Oct 4, 2024 | on: Were RNNs all we needed?
If you have spent some time actually training networks, you know that's not true; that's why batch norm, dropout, and regularization are so successful. They don't increase the network's capacity (parameter count), but they do increase its ability to learn.
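
A minimal sketch of the point, assuming PyTorch (the comment names no framework): batch norm and dropout add little or no learnable capacity compared to the linear layers, yet they are standard tools for making training work better.

    # Illustrative only: count how few parameters the regularization layers add.
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(784, 256),
        nn.BatchNorm1d(256),   # adds only 2*256 learnable params (scale and shift)
        nn.ReLU(),
        nn.Dropout(p=0.5),     # adds zero learnable params
        nn.Linear(256, 10),
    )

    total = sum(p.numel() for p in model.parameters())
    bn = sum(p.numel() for m in model if isinstance(m, nn.BatchNorm1d)
             for p in m.parameters())
    # Parameter count is dominated by the linear layers; batch norm and
    # dropout barely change it, but they change how well the network trains.
    print(f"total params: {total}, of which batch norm: {bn}")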