
How about comparing it to something sensible like osquery, instead of doing silly strawman ps pipelines?

> Among other problems, UTC runs slightly slower or faster depending on how far the Earth is from the Sun. UTC does not run uniformly (apart from Earth-at-sealevel), instead the length of 1 second will slightly grow or shrink depending on the current configuration of the Solar system.

That is completely wrong. UTC seconds are exactly SI seconds, which are all the same uniform length (defined by a quorum of atomic clocks).


At sea level on Earth, UTC seconds are all exactly the same, yes. That's the definition of UTC.

The trick is, when you're working with things on the scale of the Solar system and larger, it no longer makes sense to assume your frame is "at sea level on Earth." Your relativistic reference frame has shifted, so (thanks Einstein!) time literally changes underneath your feet.

The primary mechanism (but not the only one) is that a clock on Earth will tick slower when Earth is closer to the Sun, due to the effects of gravitational time dilation.[0]

So yes, a clock on Earth always runs at a uniform rate in its own frame. But because the universe is fundamentally Einsteinian, that still means that if you're working with, e.g., the orbit of Jupiter or a millisecond pulsar, you will see small timing errors introduced if you try to use UTC (or even TAI, which is UTC without the leap seconds) instead of TDB.
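
To put rough numbers on it, here's a back-of-the-envelope Python sketch of the potential term alone; the constants are approximate textbook values I'm assuming, not an authoritative ephemeris:

    import math

    # Rough sketch: how much an Earth clock's rate varies over the year
    # from the Sun's gravitational potential alone.
    GM_SUN = 1.327e20          # Sun's gravitational parameter, m^3/s^2
    C = 2.998e8                # speed of light, m/s
    R_PERIHELION = 1.471e11    # Earth-Sun distance at perihelion, m
    R_APHELION = 1.521e11      # Earth-Sun distance at aphelion, m

    # d(tau)/dt ~ 1 + Phi/c^2 with Phi = -GM/r, so the perihelion-to-
    # aphelion clock-rate difference is GM/c^2 * (1/r_peri - 1/r_aph)
    rate_shift = GM_SUN / C**2 * (1 / R_PERIHELION - 1 / R_APHELION)
    print(f"peak-to-peak fractional rate change: {rate_shift:.2e}")  # ~3.3e-10

    # Integrating that (roughly sinusoidal) rate variation over a year
    # gives the amplitude of the accumulated periodic offset:
    YEAR = 3.156e7             # seconds in a year
    amplitude = (rate_shift / 2) * YEAR / (2 * math.pi)
    print(f"annual offset amplitude: {amplitude * 1e3:.2f} ms")  # ~0.8 ms

The velocity time-dilation term contributes a comparable periodic amount, which is how you get to the well-known ~1.7 ms annual wobble between TDB and Terrestrial Time.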

[0] https://en.wikipedia.org/wiki/Gravitational_time_dilation


But it's all relative; all reference frames are different relative to each other, and there is no one reference frame that is somehow special. TDB runs unevenly relative to UTC just as much as UTC runs unevenly relative to TDB.

Absolutely, I completely agree. Einstein has taught you well. ;) The only thing that matters is choosing the right reference frame for the job.

When high accuracy is required, UTC is not the right reference frame for the job of astronomical calculations. That's all.


I'm not a Mac dev, but wasn't Apple all-in on Objective-C back then, whereas these days it's more Swift? That is a pretty big shift, I'd assume for the better for the most part.

I prefer Swift as a language, but Apple's developer documentation back then was clear, detailed, and overall excellent. Occasionally I felt like I was reading a classic CS text rather than a manual. I could always find the guide on the particular facet I was looking for within a few clicks.

I also think the Snow Leopard-era (unibody) MacBook Pro design was peak Mac. It was really full-featured while also having a clean, intentional design.

Tiger on a G4 TiBook was peak Apple.

But Red Delicious is still definitely an apple. I can definitely understand why someone would not prefer Red Delicious apples or iceberg lettuce, but that is quite different from claiming that iceberg lettuce isn't really a lettuce.

Which is why I put "(facetiously)" in my original comment. I'm perfectly aware that iceberg is really lettuce; what I'm doing is making a "no true Scotsman" argument. It's all for fun. If I were being serious, I would define a new category, let's call it "tasty lettuce", a subset of lettuce in which iceberg, IMHO, should not be included. But it's more fun to say something obviously incorrect that still conveys my meaning in ordinary conversation.

> Current game devs, especially in the AAA space, spend a lot of time and effort looking for hyper realism and embracing new tech to achieve accurate PBR. I wonder whether the limitations of the older hardware force a more artistic stance on everyone, even down to technical artists, to embrace an art style and art direction and work to achieve attractiveness vs realism. Or I could just be seeing my early 20’s through rose-tinted glasses

I'd argue that during Max Payne's time (early to mid '00s) gaming was far more graphics-tech-driven than it is these days, especially on PC. It is far more common to see heavily stylized or non-photorealistic games today than back then, IMHO. When I think of early-'00s PC games, a lot of them are shooters pushing the tech envelope very hard: HL2, Far Cry, Doom 3, etc. I don't think we really see that sort of game very often anymore.


Early to mid 00's was peak Moore's Law. Every year left last year's computers on the curb. Graphics hardware acceleration and programmable shaders were expanding their capabilities in ungainly leaps and bounds. Every new piece of popular hardware wasn't just the marketing number going up, it was Christmas for software. You didn't need 100+ devs to compete in the big leagues. And yeah there were a lot of moody FPS games, but I can't describe the delight of first seeing a Katamari Damacy or a Portal. It was a time of untilled earth and unplucked fruit.

I remember the GeForce 3 and then the GeForce 4 Ti whatever. It curb-stomped my GeForce2 MX.

For comparison, it was like stepping from slightly upgraded Xbox graphics to Xbox 360 graphics in two or three years.

For Gen Z-ers: kinda like the gap from the PS3 (and even high-end PSP games) to the first PS5 games, in two years. Insane.


> I haven’t had anti-malware software on Windows for over 10 years.

Unless you have jumped through hoops to remove it, you were most likely still running Windows Defender Antivirus.


Fair. I was thinking of third party anti-malware that I have to go out and get, not something that is tightly integrated into Windows itself and enabled by default.

Also, I don’t need the antivirus portion of Defender since I don’t click on executables or install apps willy-nilly. It’s very light and silent, so I wouldn’t go out of my way to turn it off. Plus, the firewall is also part of Defender.


The thing that is unintuitive to me is the timeline and scale. The age of the universe is 13.8B years, and the age of Earth is 4.5B years. And yet Earth has many of these elements in abundance, even ones produced by complex chains and in trace quantities. The elements first need to be produced in stars, then ejected, then accumulated into protoplanetary dust, then aggregated into planets. It feels wild to me that this whole process took only about twice as long as Earth has existed.

Much like there is a water cycle on Earth, we are discovering there are element-transportation cycles in galaxies. Anton Petrov did an episode on this: https://www.youtube.com/watch?v=SjToE8XJaL4

The early stars were huge and exploded extremely fast, within a few million years. It's likely this happened in rapid succession many times, priming the universe with a lot of building blocks. The early universe was wildly energetic.


This is another thing that feeds into the Fermi paradox. Previous generations of stars and planets might have had too low a metallicity to give rise to very complex intelligent life. We might be part of the first crop to evolve as the metallicity of the cosmos reaches a threshold.

Life on Earth is mostly C, H, O, and N, but it makes use of many heavier elements to conduct complex chemical synthesis processes. Some are only used in trace amounts but are still necessary. Then there’s technology which could not have developed to this level without most of the periodic table. Low metallicity is likely to put a ceiling on what can evolve.

You’re not getting spacefaring aliens until you have the building blocks. Then it takes billions of years, and on top of that stable nurseries like Earth are probably rare.

So TL;DR my guess is that we are early and rare.

In a few billion years the galaxy might resemble Star Wars with aliens all over the place, albeit without FTL unless we are very wrong about core physics or there’s some huge aspect of reality we haven’t found yet.


I don’t buy this hypothesis, because we’ve had complex life on land for 250 million years. Evolution is not a steady upwards path (especially when you take into account mass extinctions). There’s no reason an intelligent species couldn’t have evolved on Earth at any time in the Mesozoic. Even a single million-year head start would be huge for a civilization.

Seems highly unlikely that the resolution to the Fermi paradox is just that we’re the first intelligent species in the galaxy.


I agree. My own suspicion for the resolution of the Fermi paradox is that there are four steps we have seen occur only once (or zero times), so it is difficult to measure their probability. If some of those probabilities are significantly lower than we think, that could resolve the "paradox".

Those four steps are: (1) formation of life; (2) formation of multicellular life; (3) formation of "runaway intelligence", where a species evolves enough intelligence to manipulate its environment in ways that supersede evolution (parrots and dolphins, for instance, are both quite smart, but don't look close to getting a runaway benefit that would let them invent technology like agriculture and overpower all other species); (4) a technical civilization managing to manipulate its planet, or spread beyond one star system, in a way that would be detectable from elsewhere in the galaxy (hasn't happened yet).
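
Just to illustrate how fast one-shot steps multiply out, here's a toy Drake-style calculation in Python; every number in it is a made-up placeholder, not an estimate:

    # Toy illustration: a handful of one-shot evolutionary steps multiply
    # together, so even modestly small per-step odds empty out a galaxy.
    # All probabilities below are placeholders, not measurements.
    HABITABLE_PLANETS = 1e10   # order-of-magnitude guess for the Milky Way

    steps = {
        "abiogenesis": 1e-3,
        "multicellular life": 1e-2,
        "runaway intelligence": 1e-4,
        "detectable technology": 1e-2,
    }

    p_total = 1.0
    for p in steps.values():
        p_total *= p

    expected = HABITABLE_PLANETS * p_total
    print(f"combined probability per planet: {p_total:.0e}")     # 1e-11
    print(f"expected detectable civilizations: {expected:.1f}")  # 0.1

    # With these placeholders the whole galaxy yields ~0.1 detectable
    # civilizations, no exotic physics needed. Drop any single step by
    # one more order of magnitude and the "paradox" evaporates.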


I don’t think these hypotheses are mutually exclusive. Low metallicity in the past might explain how we could be first even though the galaxy is very old. The conditions did not exist until recently, on cosmic time scales. If the probabilities are also low then you’re not going to get many even when conditions are right.

There is a sampling bias. Earth-like planets are too small to retain hydrogen and helium, so even if the proportion of heavier elements in the initial gas cloud were 10 times lower, the planets would still be rich in heavy elements.

Peak brightness is most likely to suffer.

Not creep, but thermal expansion is definitely noticeable with interferometry: https://www.youtube.com/watch?v=vupIq4epCQA

Yeah, even breathing near this thing (or putting a soldering iron near the beam paths) will show a visible change!
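
To put rough numbers on that, here's a back-of-the-envelope Python sketch; the aluminum arm, its length, and the temperature change are assumptions for illustration, not values from the video:

    # Why a breath visibly shifts fringes on a Michelson interferometer.
    ALPHA_AL = 23e-6       # thermal expansion coefficient of aluminum, 1/K
    ARM_LENGTH = 0.10      # interferometer arm length, m (assumed)
    DELTA_T = 0.5          # warming from a breath, K (assumed)
    WAVELENGTH = 632.8e-9  # HeNe laser wavelength, m

    delta_L = ALPHA_AL * ARM_LENGTH * DELTA_T  # arm length change, m
    fringes = 2 * delta_L / WAVELENGTH         # beam traverses the arm twice
    print(f"arm grows by {delta_L * 1e9:.0f} nm -> ~{fringes:.1f} fringes")

A few fringes from half a kelvin of warming on a 10 cm arm, so a breath is plenty.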
