I agree with some of their sentiment but disagree that it is secularism specifically.
If anything, my observation has been that social media provides better avenues for exploitation by bad actors and, for lack of a better term, people unwilling to do 'self work'.
It used to be a lot harder to 'grift'; historically, a community would eventually suss out bad actors, leading to shunning/etc.
But, when your 'community' is an entire country or a large area of the planet, the signal/noise ratio changes along with the behavior of the bad actors.
As an example not directly related to Amazon, I've worked with more than one person who would be a decent programmer if they worked on their job skills as much as they worked on their job-hopping skills. Online job postings (at least for a while) made it way easier for someone to hop from job to job collecting a paycheck, leaving just before the 'well, now they should be onboarded and productive' red line is crossed and they are found out.
I've seen it with more than one person that is happy to screw over multiple 'friends' because they just use the internet to find the right groups to make new friends [0].
I've seen it with acquaintances where they just keep burning through 'matches' on dating sites without any introspection as to their own toxic behavior[1].
And sure, in all these cases bad actors can still get 'outed'. However, the bad actors are also happy to be dishonest in their own messaging, which again messes with the SNR. They'll just try to drag you through the mud and drain your endurance fighting their lies if you try to speak up, and unless you've really got time to burn... everyone stays quiet.
And, well, society is worse as a result.
[0] - They'll even pick up new interests in the process, once they've sufficiently burned themselves in a given community.
[1] - My favorite example was two narcissists that -both- were looking to replace the other before they broke up with each other...
I can't even get GitHub Copilot's plugin to stop randomly trashing files with a zero-width no-break space (U+FEFF) at the beginning, let alone follow formatting rules consistently...
I am the last person to say anything good about Copilot. I used Copilot for a minute, mostly used raw ChatGPT until last month, and now use Codex with my personal ChatGPT subscription plus my personal (but company-reimbursed) subscription to Claude.
Yep. Short of a warrant canary getting snuffed (and even those are fallible at best), it is very possible that a company could set up a subsidiary for appearances.
Or just buy a controlling interest outright (Crypto AG?)
Still more predictable than GPU buys in the current climate. Power-connector melting aside, GPUs in most cases get replaced less frequently than cell phones, unless of course you have lots of capital/profit infusion and, for whatever reason, need to stay ahead of the game.
Heck, if Apple wanted to be super cheeky, they could probably still pivot the reserved capacity to something useful (e.g. a revised older design on whatever node they reserved, getting more chips per wafer for cheaper models.)
NVDA, on the other hand, is burning a lot of goodwill in its consumer space, and if a competitor somehow manages to outdo them it could be catastrophic.
Yeah, it’s anecdata, but I only replaced my 1080 Ti about 1.5 years ago.
Graphical fidelity is at the point that unless some new technology comes out to take advantage of GPUs, I don’t see myself ever upgrading the part. Only replacing it whenever it dies.
And that 1080 Ti isn’t dead either; I passed the rig on to someone who wanted to get into PC gaming and it’s still going strong. I mostly upgraded because I needed more RAM and my motherboard’s empty slots were full of drywall dust.
The phone I’m more liable to upgrade solely due to battery life degradation.
I replaced my 1080 Ti recently too (early 2025). I had kept it as my daily GPU since 2017, and it was still viable, not in urgent need of a replacement. My 1080 Ti is an AIO liquid-cooled model from EVGA, so I'm surprised it hasn't leaked yet. It's been put through a lot of stress from overclocking too, and now it lives on inside a homelab server.
The 5090 I replaced it with has not been entirely worth it. Expensive GPUs for gaming have had more diminishing returns on improving the gaming experience than ever before, at least in my lifetime.
286 protected mode gave you 16 MB of memory space, but it still had mandatory segments. Usable but it needed hard work and careful thought.
IIRC, that is why the 286 version of Mark Williams Company's Coherent OS still only allowed a maximum program size of 64 kB of code plus 64 kB of data. Very limiting, but OK for early-Unix-style programs (say, up to UNIX V6, of Lions Book and "you are not expected to understand this" fame.)
386 protected mode gave you 4GB of space, and you could choose the segment size, which meant everyone chose a segment size of ALL THE RAM PLS KTHXBYE. That meant you got flat addressing: all pointers could be flat 32-bit INTs. Simples!
The 386 also added the ability to seamlessly jump from 8086 mode to 386 mode and back.
The 286 intended that to be a one-way trip. DR and others found a loophole for exiting 286 mode again (IIRC by forcing a full CPU reset via the keyboard controller, and later a triple fault), but since it meant resetting the whole CPU, it was slow.
Not only did the 386 make it easy to get in and out of 8086 mode for DOS binaries, it natively supported multiple concurrent hardware-assisted 8086 VMs, plus memory management so you could keep most of DOS's 640kB free. So, win for protected-mode code, win for DOS code.
Intel aimed to please the DOS folks, and to alleviate jealousy of the 680x0 and its nice flat memory space, and make it easy to port UNIX and other memory-managed OSes over from minis and workstations.
TL;DR
The 286's protected mode was meant for 286-native OSes: it was not easy to run 8086 stuff or to change modes. Plus, segments, which everyone hated.
The 386's protected mode optionally got rid of all the stuff that people disliked about 16-bit x86 and offered alternatives that rivalled non-x86 platforms.
Biggest change until x86-64 came along (spec in 2000, silicon in 2003).
- In May 1990, Windows 3.0 running on an Intel 80386 could run multiple DOS VMs simultaneously
- In April 1992, OS/2 2.0 finally supported multiple DOS VMs. OS/2 1.x supported running only one DOS app at a time.
While MS Excel had been available on Windows for a few years before Win3.0, Word for Windows had been released only a year before Win3.0.
So many PC users would still have been using the DOS version of WordPerfect (and maybe the DOS version of Lotus 123). Windows 3.0 allowed the user to run many of their DOS apps at the same time, which plain DOS and OS/2 1.x couldn't do.
Windows/386 could do that in 1987, but not many people bought it. Power users had DESQview/386 or something similar.
If IBM had listened to Microsoft, OS/2 1.0 could have done that in 1987 too... and then I think OS/2 could have been a hit and Windows 3.0 might never have happened.
C# has been favored by a lot of game devs for some time. You've got Godot and Unity, and I think you can do -some- things in Unreal Engine with C#...
In contrast to Java, it has added a lot of helpful constructs for high-performance code like game dev: things like `Span<T>` and `Memory<T>` plus ref structs make it easier to write code that avoids heap allocation (and thus reduces GC pauses, which are a concern for most types of game dev).
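A minimal sketch of the allocation-free pattern that enables (the `Accumulator` type here is hypothetical, purely for illustration, not from any engine):

```csharp
using System;

class Program
{
    // A ref struct can only live on the stack, so the compiler
    // guarantees it never ends up on the heap or creates GC pressure.
    ref struct Accumulator
    {
        public Span<float> Values;

        public float Sum()
        {
            float total = 0f;
            foreach (var v in Values) total += v;
            return total;
        }
    }

    static void Main()
    {
        // stackalloc puts the buffer on the stack; Span<float> wraps it
        // without boxing or allocating, unlike a float[] would.
        Span<float> frameData = stackalloc float[] { 1f, 2f, 3f, 4f };
        var acc = new Accumulator { Values = frameData };
        Console.WriteLine(acc.Sum()); // prints 10
    }
}
```

Since `Span<T>` can wrap stack memory, arrays, or native buffers behind one type, per-frame scratch work like this never touches the GC at all.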
At least for now I'd rather trust Microsoft than Oracle, especially since both CoreCLR and Mono have been under more permissive licenses than Java for some time.
I don't think you're off base, FWIW. Unity has long lagged behind, and it also just makes things weird at times.
I think the biggest hurdle a Unity contender has to overcome is providing 'similar enough' primitives as well as a way to easily and consistently handle the graphics/sound pipeline (i.e. a simple way to handle different platform bindings), while also making sure all of that infrastructure is AOT-friendly.
Capcom doing their own environment still sounds a bit extreme to me in its own right (with the shitpost comment of: I bet they have a slick dispatcher in there somewhere vs. 'trust the threadpool').
But then I remember they have to deal with IL2CPP, because lots of mobile/console platforms do not allow JIT as policy.
.NET does now have 'full AOT' as a thing at least, and I believe Streets of Rage 4 used CoreRT for at least one of the platforms it was released on.
What's more hopeful about that is that you can see backers of many different architectures contributing to the ecosystem.
I'd be willing to give you access to the experiment I mentioned in a separate reply (it's a GitHub repo), so you can see the output you can get for a complex app buildout.
I'll admit it's not great (probably not even good), but it definitely has throughput despite my absolute lack of caring that much [0]. Once I get past a certain stage I'm thinking of doing an A/B test where I take an earlier commit and try again while paying more attention... (But I at least want to get to where there is a full suite of UOW cases before I do that, for comparison's sake.)
> Those twenty engineers must not have produced much.
I've been considered a 'very fast' engineer at most shops (e.g. at multiple shops, stories assigned to me would get a <1 multiplier on points[1])
Twenty is a bit bloated, unless we are talking WITCH tier. I can definitely get done in 2-3 hours what would otherwise take me a day. I say it that way because at best it's 1-2 hours but other times it's longer, and some folks remember the 'best' rather than the median.
[0] - It started as 'prompt only', although after a certain point I did start being more aggressive with personal edits.
[1] - IDK why they did it that way instead of capacity, OTOH that saved me when it came to being assigned Manual Testing stories...