Hacker News | breezeTrowel's comments

Despite the headline, this is not a toothbrush. This is a "toothbrush-shaped ultrasound transducer". Mind you, I don't know why this wouldn't "increase dentist margin". This is an analysis tool that makes dentistry easier (just like dental X-rays).


I think the biggest change with external defibrillators has been placement. It's now front and back instead of two on the front.


I just did a training course, and for the ones we used it was still two on the front. Only for children is it front and back.


Likewise in the UK: two on the front, at least for adults. There's less disruption to CPR if you leave the patient on their back.


I should redo my CPR training, then. I learned two on the front in high school in NJ. We were also taught to read the instructions, though I'm sure when seconds count you don't.


Modern AEDs have voice guidance telling the person what to do. So you can follow the instructions as you do it.

Also, you should call the emergency number in your region and (at least in Australia) they'll transfer you to someone who can coach you through using the defib and performing CPR until professional help arrives.

Don't let that stop anyone from getting their CPR training up to date, though. The more experience you have, the better equipped you'll be if you need to use it.


I see AEDs at work. If I have a heart attack, I have no confidence in my team being able to use one. I've seen how they handle requirements and documentation in user stories.


> Not sure if we have time for learning CPR in the current sprint, let's put it in the backlog


I was going to say: you need to make sure to open a ticket and bring it to the refinement meeting.


Well, I thought it was one on the front and one close to the ribs.


How is Llama a product? It's more like React, PyTorch, GraphQL, etc.


Meta’s AI offerings are products. They’re featured in Instagram, Facebook and Messenger.

They’re part of the Ray-Ban partnership. IMHO, definitely a product.


Yes and Facebook uses React and GraphQL. Meta AI uses PyTorch. Does that make them all products?


No, because they’re not exposed to users as features that they interact with. From a user's perspective, they never see whether it was made using those tools. They see the AI front and center.

I feel like that’s a pretty strong distinction.


They see AI features front and center. They don’t see the backend (which may or may not be running Llama). The same goes for React, although with React, the framework is what the end user directly interacts with. So, by that logic, React is more of a product than Llama.


Let me flip this around.

If you are a user, what feature is React giving you? Pre- and post-React, Facebook is largely the same to a user.

Pre- and post-GraphQL, Facebook is also the same.

To the user, neither of those are expressed as features.

Llama / Meta AI is a feature that changes how a user interacts with the system. When it was added, it was noticeable. If it were taken away, it would be noticeable. Nobody was boycotting React.

That is the difference between an implementation detail and a product/feature.


I'm not sure I understand this analogy. If Meta changes which underlying model they use, it'll still be called Meta AI. Making the criterion "what new features does it offer to users of the site" means that things like databases, servers, etc. are not products. That is objectively false.


Regarding Kim Dotcom, the government's allegations aren't about what users do. You can read them here:

https://www.justice.gov/opa/pr/justice-department-charges-le...

Granted, he's moved on to being a Kremlin propagandist and is now shilling anti-Semitism. See:

https://x.com/KimDotcom/status/1825187568834753021


Yeah, he has become a Russian shill and likes to bootlick Elon Musk now that Musk is one as well. (Hell, Elon is funded by Russia now!)


He wasn't having trouble with his WiFi, since he was connected directly to the device. Although it might have helped to specify an Ethernet link. Still, chatbots are generally a terrible user experience.


Totally agree with you on chatbots, but you just committed the same "sin" as him! To most people, and probably to that chatbot, WiFi == Internet == Ethernet. Actually, I doubt specifying Ethernet would have helped him at all. When it asked him if he was having trouble with his WiFi, he should have just said yes, and then "spammed 0" if that didn't help. Or maybe he should have just immediately started spamming 0. Now that I think about it, for anything more technical than unplugging the device, you should probably start by asking for a human.


Are you a rhinoceros?


Yes. That's why fundamental particles are called the "building blocks of the universe". It's buildings all the way down!


Developers! Of course!



Superconductors do not generate heat when in their superconducting state. Otherwise they wouldn't be superconductors.


Superconductors do not generate heat for a constant DC current. Computers are very, very AC, and you do get heat production anytime the current changes.

It is, IMO, a bit dubious whether or not anything is truly flowing in a superconductor at constant current. Electrons don't have identity, so the 'constant flow of electrons' can be rephrased as 'the physical system isn't changing'... and the degree to which you can tell that there are electrons moving about is also the degree to which the superconductor isn't truly zero-resistance.


You can tell electrons are flowing in a superconductor by measuring the magnetic field induced around them.


You can tell there's a magnetic field, certainly. My argument is essentially one of nomenclature; I don't feel a constant electron-field should count as 'flowing'.

Of course it isn't actually constant -- there are multiple electrons, and you can tell that the electron field is quantized. But the degree to which that is visible, is the exact degree to which the superconductor nevertheless doesn't superconduct!


Yeah, but just keeping everything in a static superconducting state wouldn't be a computer either.


These are open-weight models released under the Apache 2.0 license. There's nothing to buy.


IBM is a sales and services org.

Their customers aren’t going to build their own RAG and agent frameworks, vector DBs, data ingest pipelines, finetunes, high scale inference serving solutions, etc, etc.

There’s an incredible amount of stuff to buy.


Right, but they can just use Llama/Mistral for free, instead of their inferior models, which I'm sure take quite a bit of resources to train in the first place.


Yes but using someone else's models doesn't make them an "AI company".


Who is going to host them?

