pkAbstract's comments | Hacker News

Exactly. The smaller bit widths from quantization might marginally decrease the compute required for each operation, but they don't reduce the number of operations performed. So quantization generally matters more for memory use than for compute.
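A rough back-of-the-envelope sketch of that point in Python (the 7B parameter count is just my own illustrative assumption): weight-only quantization shrinks the bytes you have to store and move, but the op count per token doesn't budge.

    # Weight-only quantization: memory scales with bit width,
    # but the number of operations per token does not.
    PARAMS = 7e9  # hypothetical 7B-parameter model

    def weight_memory_gb(bits_per_weight):
        """GB needed just to hold the weights at a given precision."""
        return PARAMS * bits_per_weight / 8 / 1e9

    def flops_per_token():
        """Roughly 2 FLOPs (one multiply-add) per parameter per token, regardless of precision."""
        return 2 * PARAMS

    for bits in (16, 8, 4):
        print(f"{bits:>2}-bit weights: {weight_memory_gb(bits):5.1f} GB, "
              f"{flops_per_token():.1e} FLOPs/token")

The memory column drops from 14 GB to 3.5 GB while the FLOPs column stays at 1.4e+10, which is why the win shows up mostly in memory (and memory bandwidth) rather than raw compute.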


Except in this case they quantized both the parameters and the activations, which reduces compute time too.
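For anyone curious what that buys you, here's a minimal numpy sketch of the idea (a toy example of mine, not the actual kernel from the paper): once both the weights and the activations are int8, the matrix multiply itself can run on integer units, which is where the compute savings come from.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.standard_normal((256, 256)).astype(np.float32)  # toy weights
    x = rng.standard_normal(256).astype(np.float32)         # toy activations

    def quantize_int8(t):
        """Symmetric per-tensor int8 quantization: returns (int8 values, scale)."""
        scale = np.abs(t).max() / 127.0
        return np.round(t / scale).astype(np.int8), scale

    Wq, w_scale = quantize_int8(W)
    xq, x_scale = quantize_int8(x)

    # The heavy lifting happens in int8/int32; one float rescale per output element.
    y = (Wq.astype(np.int32) @ xq.astype(np.int32)).astype(np.float32) * (w_scale * x_scale)

    print("max abs error vs fp32 matmul:", np.abs(y - W @ x).max())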


Kanye has done this. Multiple times I believe, but notably with different versions of "The Life of Pablo" on music streaming services.


These days, there are more audio plugins for adding hiss and crackle than for removing it; but iZotope RX is scarily good and is the best noise-removal plugin I've ever tried. ML has enabled a lot of pretty revolutionary capabilities in audio, songwriting, and music production over just the past couple of years.


Agreed, small sample size; but in defense of the researchers, I don't think the general public appreciates how difficult it is to find research participants - never mind a lot of them, and ones who stay adherent to the study protocol for its entirety so their data points can be included.

It's also extremely expensive, and unless you're a god at writing grant proposals, most uni research labs are cash poor.

There's a reason most studies don't have lots of participants, and in most cases it's not laziness on the part of the researchers.


Yes. It's asinine to run a large N study before having run a small one. So small N doesn't mean bad, it means preliminary.

Internet comment sections, however, are not great at nuance, and the difference between a bad study and a study that leaves unresolved uncertainty is often beyond them.
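To put a number on how fast N blows up, here's a quick power-calculation sketch (figures are my own illustration: two-sample t-test via the normal approximation, alpha = 0.05, 80% power):

    from scipy.stats import norm

    def n_per_group(d, alpha=0.05, power=0.8):
        """Approximate participants per arm for a two-sample t-test (normal approximation)."""
        z_alpha = norm.ppf(1 - alpha / 2)
        z_beta = norm.ppf(power)
        return 2 * ((z_alpha + z_beta) / d) ** 2

    for d in (0.8, 0.5, 0.2):  # large, medium, small effect sizes (Cohen's d)
        print(f"d={d}: ~{n_per_group(d):.0f} participants per group")

Halving the detectable effect size roughly quadruples the required N (about 25, 63, and 392 per group here), which, together with dropout and per-participant cost, is why large cohorts get expensive so quickly.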


Lidar would increase car cost so much that most people would die of sticker shock. The strongest candidate for proving out lidar AV tech is Waymo, which to me means it has a relatively low chance in the consumer space in the 2020s; unit cost needs to come down dramatically first. The cost and complexity start making more sense in vehicles with very high uptime.

In addition to being prohibitively expensive, lidar units are large, ugly, have moving parts (which should worry any engineer), and increase drag.

Tesla's current Autopilot tech is a bet on the promise of computer vision, and the intuition that if sight is good enough for humans, it's good enough for Autopilot. I think it's a smart bet.


That was true when Tesla wanted to start selling “fsd-ready” cars, but it's getting cheaper all the time. Luminar has a $500 lidar, and Huawei says they'll be selling $100 sensors soon.


I suppose what one considers the near future is relative.


You may be right, but it's also commonly believed in these communities that hardware is the part that's already been solved. Computer hardware already vastly outstrips human capacity in many domains.

To me, it seems more likely that we're missing something (or several things) on the software side. AGI could probably run on present-day hardware, or even older hardware.


It probably will be, just extremely computationally expensive. But 30 years is also further out than when most experts predict the singularity will arrive, after which all bets are off.


As the neighbor had just moved in, I'm guessing they were unaware of your friend's liver transplant. If that was indeed the case, I find your friend's situation difficult to sympathize with.

If you did the slightest bit of digging, I think you'd find that 99% of people mean no offense, and would be deeply apologetic if their "transgression" were brought to their attention.

If you're sensitive enough, you'll find yourself offended all the time.


Quite a saga, and entertaining writing; but all of this was easily avoidable with a little bit of research.


The entire saga was almost entirely "research".

