Hacker News | xmddmx's comments

> Sheldon Brown, a beloved iconoclast bicycle tech guru, died Sunday from a heart attack. He was 68 63.

Curious, what does "He was 68 63" mean? Is it a bicycle gear joke about his age at death?


Probably just a typo. He was 63.

Surprisingly young

On Mac Safari, holding shift and using the magic mouse to scroll up or down reverses the zoom direction.

This is both right (Shift-X is the reverse of X, by convention) but also wrong (Shift-Scroll is the macOS gesture for zooming on maps, where Scroll alone doesn't zoom in or out).

TLDR: I really wish Apple would adopt the "scroll up to zoom in" convention used by the rest of the free world.


My memory is that it was named "Aero Glass" which heightens the irony of "Liquid Glass" sucking.

I see many references to it being called just "Aero", but some call it "Aero Glass" [1].

Does anyone know the truth?

[1] https://www.pcmag.com/archive/rip-aero-glass-windows-8-stick...


Microsoft marketing may have had a specific preference, but "Aero" and "Aero Glass" are interchangeable. From the same article you linked:

> "Rest in peace, Aero. I liked you, a lot. Still do. And I'll miss you," Thurrott writes


Right, this annoyed me too - it was stated w/o attribution as if novel.

What is the name of the law for when someone writes a "stuff I've learned" think piece and fails to connect any of it to existing knowledge?

Makes me wonder if (A) they do know it's not their idea, but they are just cool with plagiarism or (B) they don't know it's not their idea.


I don't know if there's a named law, but the word for not knowing and believing that something remembered is a novel idea is "cryptomnesia".

Testing whether you really know something by teaching it is Feynman's method of understanding. Basically, on scanning, I don't particularly disagree with the content of the post. However, treating these things (many of which regularly show up here on HN) as insights from "14 years at Google" is a little misplaced.

But, hey, it's 2026, CES is starting, and the hyperbole will just keep rocketing up and out.


You mean 1990. Someone graduating college in 1990 would have been about 21. That was 35 years ago, so they would be about 56 in 2025.

Math is hard.


Weird flex of pedantry even for HN.

Says who? I did a gap year service project and graduated at age 23. My business partner did a 3-2 program and graduated at 23.

Plus, anyone working as an engineer then has an 8-figure net worth, and the overwhelming majority moved on long ago.


C'mon man, it's a comment, not a research paper. Off by one isn't worth follow-up snark.


Off by 10 (plus the 1). Someone who graduated college in 2000 would be 22 at graduation (four years of college from age 18) plus 25 years = 47 in 2025, not 57, and nowhere close to retirement age. It might be pedantry, but the original comment should have said 1990, not 2000.


Their main point was it is off by 10; then they introduced an additional confusing question of “is it off by 10, 11, or 12?”
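The competing numbers in this sub-thread reduce to two lines of arithmetic (a sketch assuming graduation at age 22; with graduation at 21, the 1990 cohort comes out at 56, hence the off-by-one):

```python
# Hypothetical ages-in-2025 for the graduation years debated above.
def age_in(current_year, grad_year, grad_age=22):
    """Age in current_year for someone who graduated college at grad_age."""
    return grad_age + (current_year - grad_year)

print(age_in(2025, 2000))  # 47: nowhere near retirement
print(age_in(2025, 1990))  # 57: the cohort the parent comment meant
```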


Turns out that under the USA Code of Federal Regulations, there's a pretty big exemption to IRB for research on pedagogy:

CFR 46.104 (Exempt Research):

46.104.d.1 "Research, conducted in established or commonly accepted educational settings, that specifically involves normal educational practices that are not likely to adversely impact students' opportunity to learn required educational content or the assessment of educators who provide instruction. This includes most research on regular and special education instructional strategies, and research on the effectiveness of or the comparison among instructional techniques, curricula, or classroom management methods."

https://www.ecfr.gov/current/title-45/subtitle-A/subchapter-...

So while this may have been a dick move by the instructors, it was probably legal.


I'm afraid you misunderstand what it means to be "exempt" under the IRB. It doesn't mean "you don't have to talk to the IRB", it means "there's a little less oversight but you still need to file all the paperwork". Here's one university's explanation[1]:

> Exempt human subjects research is a specific sub-set of “research involving human subjects” that does not require ongoing IRB oversight. Research can qualify for an exemption if it is no more than minimal risk and all of the research procedures fit within one or more of the exemption categories in the federal IRB regulations. *Studies that qualify for exemption must be submitted to the IRB for review before starting the research. Pursuant to NU policy, investigators do not make their own determination as to whether a research study qualifies for an exemption — the IRB issues exemption determinations.* There is not a separate IRB application form for studies that could qualify for exemption – the appropriate protocol template for human subjects research should be filled out and submitted to the IRB in the eIRB+ system.

Most of my research is in CS Education, and I have often been able to get my studies under the Exempt status. This makes my life easier, but it's still a long arduous paperwork process. Often there are a few rounds to get the protocol right. I usually have to plan studies a whole semester in advance. The IRB does NOT like it when you decide, "Hey I just realized I collected a bunch of data, I wonder what I can do with it?" They want you to have a plan going in.

[1] https://irb.northwestern.edu/submitting-to-the-irb/types-of-...


The CFR is pretty clear, and I have experience with this (being an IRB reviewer, a faculty member, and a researcher). When it says "is exempt" it means "is exempt".

Imagine otherwise: a teacher who wants to change their final exam from a 50-item Scantron using A-D choices to a 50-item Scantron using A-E choices, because they think having 5 choices per item is better than 4, would need to ask for IRB approval. That's not feasible, and is not what happens in the real world of academia.

It is true that local IRBs may try to add additional rules, but the NU policy you quote talks about "studies". Most IRBs would disagree that "professor playing around with grading procedures and policies" constitutes a "study".

It would be presumed exempted.

Are you a teacher or a student? If you are a teacher, you have wide latitude that a student researcher does not.

Also, if you are a teacher, doing "research about your teaching style", that's exempted.

By contrast, if you are a student, or a teacher "doing research" that's probably not exempt and must go through IRB.


You would be correct, except that this is a published blog post. It may not be in an academic journal, but this person has still conducted human subjects research that led to a published artifact. It was just "playing around" until they started posting their students' (summarized, anonymized) data to the internet.


I was impressed by the lack of dominance of Thunderbolt:

"Next I tested llama.cpp running AI models over 2.5 gigabit Ethernet versus Thunderbolt 5"

Results from that graph showed only a ~10% benefit from TB5 vs. Ethernet.

Note: the M3 Studios support 10 Gbps Ethernet, but that wasn't tested; the test used 2.5 Gbps Ethernet instead.

If 2.5G Ethernet was only 10% slower than TB, how would 10G Ethernet have fared?

Also, TB5 has to be wired so that every CPU is connected to every other over TB, limiting you to 4 Macs.

By comparison, with Ethernet you could use a hub-and-spoke configuration with an Ethernet switch, theoretically letting you use more than 4 machines.
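For a rough sense of scale, nominal line rates alone give transfer times like the following (a sketch; it ignores protocol overhead and the layer-split latency that actually dominates llama.cpp RPC, and the 80 Gbps TB5 figure is the headline number, not the usable data rate):

```python
# Back-of-envelope: seconds to move 1 GB of activations over each link,
# using nominal line rates only (real-world throughput is lower).
links_gbps = {"2.5G Ethernet": 2.5, "10G Ethernet": 10.0, "Thunderbolt 5": 80.0}
payload_gbits = 8.0  # 1 GB expressed in gigabits

for name, rate in links_gbps.items():
    print(f"{name:>14}: {payload_gbits / rate:.2f} s per GB")
```

If raw bandwidth dominated, 10G Ethernet would be 4x faster than 2.5G; the ~10% observed gap from TB5 suggests it doesn't.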


10G Ethernet would only marginally speed things up based on past experience with llama RPC; latency is much more helpful but still, diminishing returns with that layer split.


This video tests the setup using 10 Gbps Ethernet: https://www.youtube.com/watch?v=4l4UWZGxvoc


That’s llama, which didn’t scale nearly as well in the tests, presumably because it’s not optimized yet.

RDMA is always going to have lower overhead than Ethernet, isn't it?


Possibly for RDMA over Thunderbolt. But for RoCE (RDMA over Converged Ethernet), obviously not, because it's sitting on top of Ethernet. It could still have higher throughput once you factor in the CPU time to run custom protocols that smart NICs could simply DMA instead, but the protocol overhead is still definitively higher.


What do you think "Ethernet's overhead" is?


Header and FCS, interpacket gap, and preamble. What do you think "Ethernet overhead" is?
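In bytes-on-the-wire terms, those components are easy to total up for a standard MTU (a sketch; jumbo frames shrink the percentage further):

```python
# Fixed per-frame Ethernet cost: preamble+SFD, header, FCS, inter-packet gap.
PREAMBLE_SFD, HEADER, FCS, IPG = 8, 14, 4, 12
payload = 1500  # standard MTU

overhead = PREAMBLE_SFD + HEADER + FCS + IPG   # 38 bytes per frame
efficiency = payload / (payload + overhead)    # ~97.5% wire efficiency
print(f"{overhead} bytes overhead, {efficiency:.1%} efficiency")
```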


I meant in µsec, sorry if that wasn't clear, given that the discussion I replied to was about RPC latency.


That's a very nebulous metric. Microseconds of overhead depend on a lot of runtime factors and hardware options and design choices that I'm just not privy to.


I really hope the user was running Time Machine: with default settings, Time Machine does hourly snapshot backups of your whole Mac. Restoring is super easy.


There is something wrong with this article, possibly just copyediting mistakes but it makes me question the whole thing.

For example, check out this mess:

> “Unfortunately, there is one significant issue with the aforementioned data: schooling. Seeing as the majority of work to date includes only aggregate data, it is impossible to account. The first concerns small N: seeing as most publish studies only include a handful of TRA data, there is a lot of room for error and over.

Unfortunately, there is a largely unaccounted for confound in this aggregate data which may make generalized analysis questionable: schooling.”


Good catch. Additionally, one of the authors on this is just a student at UWisc, and the other author is also not a professional researcher but instead an author of popular books.

This is not an ad hominem, but it does call into question whether either author has the statistical training to accurately assess the data.


If not ad hominem, what is it then? You did not provide any substantiated reason why their research would be false, but went straight to pointing out their experience, or lack thereof.

FWIW I find this research aligns with my thoughts about IQ: IQ is not a constant but a function of multiple variables, one of which is most likely education.

For instance, I am pretty sure that drilling through abstract mathematical and hard engineering problems, to some extent during high school but much more during and after university, develops your brain so that you become not only more knowledgeable in terms of memorizing things, but also able to reason about and anticipate things you couldn't possibly do before.


> but does put into question the statistical training backgrounds

This is true of virtually all university research. Statistics is far more nuanced than what a semester course can teach you. And the incentives to publish can cause bad actors to use poorly defined surveys or p hack or whatever.


> and the other author is also not a professional researcher but instead an author of popular books.

This makes the awkward wording even more confusing. I don't understand how a professional author who appears to speak English very well would write so poorly and not follow up with edits.


The language is consistent with ESL writing, in my experience.

The strange thing is that the corresponding author and the co-author appear to be English speakers, as far as I can tell. I googled the primary author and found a YouTube channel where someone by the same name speaks clearly about neuroscience. Maybe I'm looking at another person with the same name and middle initial who also happens to speak about neuroscience and brain development?


Why not do an empirical A/B test: Set up two honeypots (or perhaps 2000 for statistical significance). A gets zero updates, B gets all updates immediately. See which ones get pwned faster.
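With a fleet that size, even modest rate differences would be decisive. A sketch using a stdlib-only two-proportion z-test, with entirely hypothetical compromise rates (10% of unpatched vs 2% of patched honeypots compromised over the observation window):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Normal-approximation z statistic for comparing two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 1000 honeypots per arm; hypothetical counts: 100 vs 20 compromised.
z = two_proportion_z(100, 1000, 20, 1000)
print(f"z = {z:.1f}")  # far beyond the 1.96 cutoff at alpha = 0.05
```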

