A lot of programming language introductions use "foo" and "bar" without actually explaining why "foo" and why "bar". Before the age of Google and the Internet, it was perhaps one of the most common questions from non-native English speakers.
This was one of the biggest hurdles I had to overcome when I was a wee lad combing through "Professional PHP Programming." All of the examples it gave were foo/bar, and I couldn't make the intellectual leap to understand what the real-world use cases would be.
It wasn't until I tried building something (mad libs) that things "clicked."
Even many decades later I remember the frustration in university 100-level CS courses of every new concept being demonstrated with a mess of foo(), int* bar, void** baz scribbled on an overhead projector.
Descriptive names are helpful, people! Even back in the 90s, C89 guaranteed at least 31 significant characters in internal identifiers.
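To make the contrast concrete, here is a small hypothetical C sketch (the names and the score-summing example are mine, not from any of the textbooks mentioned): the same function written lecture-slide style with foo/bar, then again with descriptive names.

    #include <stdio.h>

    /* Lecture-slide style: the code is correct, but the names say nothing. */
    int foo(int *bar, int baz) {
        int qux = 0;
        int i;
        for (i = 0; i < baz; i++)
            qux += bar[i];
        return qux;
    }

    /* The same function with descriptive names: the intent is obvious at a glance. */
    int sum_scores(const int *scores, int score_count) {
        int total = 0;
        int i;
        for (i = 0; i < score_count; i++)
            total += scores[i];
        return total;
    }

    int main(void) {
        int scores[] = {90, 75, 88};
        printf("%d\n", sum_scores(scores, 3)); /* prints 253 */
        return 0;
    }

Both versions compile to the same thing; only the second tells the reader what it is for.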
>If you use an Apple silicon Mac I’m sure you have been impressed by its performance.
This article couldn't have come at a better time, because frankly speaking I am not that impressed after testing Omarchy Linux. Everything there was snappy; it was like being back in the DOS or Windows 3.11 era (not quite, but close). It makes me wonder why the Mac couldn't be like that.
Apple Silicon is fast, no doubt about it. It isn't just benchmarks: even under emulation, compiling, or other workloads it is fast, if not the fastest. So there is plenty of evidence it isn't benchmark-specific, despite some people claiming Apple is only fast on Geekbench. The problem is that macOS is slow, and for whatever reason it hasn't improved much. I am hoping that dropping support for x86 in the next macOS means they have the time and the excuse to do a lot of work on macOS under the hood, especially on OOM handling and paging.
I have a ThinkPad besides my main MacBook. I recently switched to KDE, a full desktop environment, and it is just insane how much faster everything renders than on macOS. And that's on a relatively underpowered integrated Ryzen GPU. Window dragging is butter smooth on a 120Hz screen, which I cannot say of macOS (it's outright terrible with the recent Electron issue).
Apple Silicon is awesome and was a game changer when it came out. Still very impressive that they have been able to keep the MacBook Air passively cooled since the first M1. But yeah, macOS is holding it back.
Modern machines are so insanely fast, the UI should almost react before you do an action.
My 5900X machine with relatively slow RAM, running CachyOS, actually feels almost as instant as DOS machines, with incredibly low latency.
Instead, we get tons of layers of abstraction (Electron etc.), combined with endpoint security software in the enterprise world that brings any machine to its knees by inspecting every I/O action.
Zed is an awesome editor where every interaction is just “snappy”.
I feel all the animations and transitions have been introduced to macOS just to paper over slowness. I’ve disabled them via the accessibility settings, which helps perceived snappiness a bit, but it also makes the jank visible where performance is not up to par.
Even when I have disabled everything on macOS, it is still nowhere near the speed of CachyOS / Arch. I have been complaining about latency for more than 15 years and macOS is still nowhere near as fast. Not to mention my original post was downvoted, suggesting people disagree.
I think that is primarily due to people assigning different meanings to "fast" and "speed."
Yes, macOS on Apple hardware is crazy fast in terms of I/O and throughput. The CPU is amazing, and its unified memory model allows for great things with LLMs. But keypress-to-screen latency is still laggy, especially compared to CachyOS.
I am wondering if this may finally be the missing piece for widespread NAS adoption. Instead of storing everything in the cloud, an AI server (NAS + AI) would hold all your data and run the model locally. Files don't even need to be tagged or sorted; the AI could find it all.
And Apple is best positioned to sell this "Server".
If ChatGPT was the iPhone moment for AI, then Claude Code may be the iPhone 5 stage, and as the article points out there will be another inflection point, likely the iPhone 6 Plus stage, where usage really explodes.
And to think we are still limited by compute even when we throw money at it, with so many improvements in the pipeline and so much low-hanging fruit. Scary to think what the future will be like, especially in terms of jobs. I have already seen some of our departments cut back 10-15% simply because something like a dashboard now takes far less time to build.
HN often focuses on whether LLMs can take over or replace the job of programmers, but it is actually the world outside tech that is getting hit the hardest.
>If ChatGPT was the iPhone moment for AI, then Claude Code may be the iPhone 5 stage, and as the article points out there will be another inflection point, likely the iPhone 6 Plus stage, where usage really explodes.
Could anyone translate this for someone who has never used an iPhone or followed the particular iPhone versions?
The launch of the iPhone set off the smartphone revolution, although it might be better to call it the computer-in-a-pocket revolution. There were plenty of smartphones before the iPhone launched, Windows Mobile, Palm, etc., but it was the iPhone that made the difference and drove consumers into the smartphone era.
The iPhone 5 (and its Android counterparts of that era) was the first smartphone many people used, and here we are talking about normal consumers, not the HN audience. Even many of the so-called professional YouTubers and reviewers only got their first smartphone during the iPhone 4 - 5 era. Four years in, there were many who thought it was hype, many who didn't understand why internet on your phone was a big deal, many who didn't know why you would want an app. It was also the first iPhone designed with supply chain and operations in mind, because previously supply could not keep up with demand. I believe it was the first to break 100M units sold in a year. The iPhone 4 might have been a luxury, but the iPhone 5 was when the majority woke up and finally said, we need a smartphone. To put this in AI terms, Claude Code is the point where it just works (i.e. the PR commit percentage in the article).
The iPhone 6 Plus was the first iPhone with a big 5.5" (16:9) screen, breaking 200M yearly sales, and finally a data point showing the world was moving to big screens, called phablets at the time. The smartphone was replacing the personal computer; your smartphone became your PC, replacing desktop and laptop. The Internet was happening on the phone, not the PC. From that point on it was just consistent growth, with 200M units a year as the standard, until things started to level off around 2020.
One of them is a one-shot project and the other is an infrastructure project used by billions. It makes a nice headline but not a decent comparison.
On another note, I wish there were some breakdown of where all the spending goes: GPU, CPU, RAM, network, switches, cables, power, datacenters. The AI ecosystem and business is larger than the smartphone revolution kick-started by Apple, but unlike smartphones or consumer products, hyperscaler and AI spending right now is very opaque.
The moon landing led to a huge amount of infrastructure and inventions which now power the modern world. The moon landing was the marketing moment. The buildout of space tech and creation of industry was the actual goal.
I have been using Safari full time for ~13 years already, and 2025 was the first time I felt Safari finally didn't have any quirks or rendering errors that Chrome and Firefox don't encounter.
It is still the slowest browser, though, if you have more than 30 tabs. I hope they won't cut back resources for Safari development and will continue to work on it.
>At what point did the old Apple cross the threshold to "modern" Apple?
The simple answer would be when SJ passed away. The long answer is that there wasn't a single turning point, but a long period of cultural shift under Tim Cook as CEO.
Tim Cook not immediately taking a stand as CEO, leaving a power vacuum, was a mistake. He said himself he thought everything would continue as normal, which obviously did not happen. Firing Scott Forstall was a mistake. Ive taking over software design was a mistake. Not listening to the advice of Katie Cotton and managing a new PR direction was a mistake. Following Phil Schiller's advice to fire Apple's long-time marketing firm was a mistake. Tim Cook not understanding his weakness, his judgement of character, was a big, big mistake, as it led to the ex-Dixons CEO and the ex-Burberry CEO taking the helm of Apple Retail, ultimately stopping, if not reversing, the momentum of Apple Retail's improvement and expansion by 10 years. Giving Ive the power to play around with retail design because the Apple Retail Store is somehow a "social place" was a big mistake. Prioritising operational and supply chain decisions over design after around the iPhone 8 Plus was a mistake. Being too focused on sales metrics and the bottom line was a big mistake. Shifting to services revenue, which should have meant AppleCare, iCloud, or even an iPhone subscription model, but instead became Apple TV+, was in my opinion a mistake; they were too scared to hurt the relationship with carriers. Eddy Cue taking over a lot of decisions? Apple going to Davos? Merging the different iOS and macOS teams, where it used to be teams per product but later became a functions-per-team structure. Trusting China and not diversifying their production when Trump was first in office (they said they would, but they didn't, and literally every single media outlet lied on Apple's behalf). I mean, the list goes on and on.
I really like what someone on HN said about Apple: ever since Steve Jobs passed away, Apple has been left on autopilot most of the time.
I wouldn't be surprised if it is closer to 10 Watts vs 100 Megawatts. We are 10 million times more efficient.
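Spelling out the arithmetic behind that factor, taking the 10 W and 100 MW figures at face value:

    \frac{100\ \text{MW}}{10\ \text{W}} = \frac{10^{8}\ \text{W}}{10^{1}\ \text{W}} = 10^{7} = 10{,}000{,}000

i.e. ten million times.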