That's silly. Programmers are being browbeaten into radically changing their programming style to accommodate a 90-degree turn in the evolution of hardware, from general-purpose serial to parallel and SIMD. Look at this rasterizer, for instance: it's comically inefficient on most hardware of the last decade. If this keeps up, optimal serial algorithms will become purely academic ideals, like Turing machines, never to be implemented.
You could say there's a back-and-forth between hardware and software, but to me it looks like software is playing catch-up. "Oh, I've got 16-wide vectors now? OK, let me go back and completely change everything." "What, I can't load a vector from an odd-numbered array position? Hmm, there goes THAT algorithm." The electrical engineers are running us.
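To make that second complaint concrete, here's a minimal C sketch using x86 SSE intrinsics (just one example of the pattern; the aligned-array attribute syntax assumes GCC/Clang). The fast aligned load only works at 16-byte boundaries, so starting at an odd element means falling back to an unaligned load, which was historically much slower:

    #include <stdio.h>
    #include <xmmintrin.h> /* SSE intrinsics */

    int main(void) {
        /* 16-byte-aligned array, so element 0 sits on a vector boundary. */
        float data[8] __attribute__((aligned(16))) =
            {0, 1, 2, 3, 4, 5, 6, 7};

        __m128 a = _mm_load_ps(&data[0]);  /* aligned load: fine */
        /* _mm_load_ps(&data[1]) would be misaligned and may fault. */

        /* The escape hatch is an unaligned load, historically slower: */
        __m128 b = _mm_loadu_ps(&data[1]);

        float out[4];
        _mm_storeu_ps(out, _mm_add_ps(a, b)); /* out[i] = data[i] + data[i+1] */
        printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
        return 0;
    }

Wider vectors don't fix this; they just make the boundaries coarser. The algorithm still has to be reshaped around whatever the load unit happens to be this generation.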
The silicon just won't go any faster. What do you want the electrical engineers to do? Moore got you hooked on his junk, and now his supply's been cut off. People tried to switch you to something more sustainable, i.e., anything but x86 and C, but you couldn't wait for your fix.
A real hacker should be happy that there are fresh new problems to solve. Or, if you just want to get shit done, use a dynamic language and be glad that CPU speed is rarely an issue anymore.