1. Overproduction. Even liberal arts colleges have 15-20% of students majoring in computer science. “Learn to code” ceases to be good advice if too many people do.
2. AI… sort of. It’s a lousy replacement for serious engineering talent, but the bosses are so enticed by reduced labor costs (and reduced employee leverage) that they will keep trying even if the stuff doesn’t work. Expectations are going up, teams are shrinking, and junior roles are vanishing.
3. Reputation collapse. Remember how we dismissed Michael O. Church as a crank? His writing style was grating (and has improved immensely) but he was right about everything, five years before anyone else. In 2009, we were “good rich people” in contrast to Wall Street. Now we’re Public Enemy #1 and, while we don’t all deserve it, our industry’s leadership does. This doesn’t hurt big tech companies because they’re invincible monopolies, but it has ended the era in which even non-tech companies wanted three or four “data scientists.”
> “Learn to code” ceases to be good advice if too many people do.
I believe "learn to code" is a great advice, nonetheless; the skill is highly applicable. The bad idea is thinking that alone will land you a cushy job.
I'll observe that, at a top-rate tech school I'm pretty familiar with, "major + computing" is a very prevalent option in a lot of the majors. As an undergrad (pre-PCs), I graduated with one computer programming course in FORTRAN, and that was pretty much the only time I touched a computer keyboard as an undergrad. You can't really do that today in engineering/sciences.
Anecdotally, from talking to a lot of people who really have their ears to the ground, the junior-roles thing seems to be very real. It probably isn't just AI--with more senior folks more available, why hire juniors?--but it seems to be pretty pronounced (with the likely corollary that bootcamps are a bad idea these days). Which isn't a great trend if real.
Both things can be true. It can be tough for juniors and tough for seniors who haven’t kept up and are just trying to cruise. Not that age discrimination isn’t a thing.
'Learn to code' is great advice for anybody. If you're a biology major and need to check the world molecules database (forgot the name, sorry), being able to write your own query goes a long way despite the no-code solutions.
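For instance, "write your own query" can be just a few lines of Python. This is only a sketch against PubChem's public PUG REST API (one such database; not necessarily the one meant above), and the compound name is arbitrary:

```python
# Sketch: look up a molecular property from PubChem's public REST API.
# PubChem is used here only as an example of a queryable molecule database.
import json
import urllib.parse
import urllib.request


def molecular_weight(compound_name: str) -> float:
    """Return a compound's molecular weight, looked up by name."""
    url = (
        "https://pubchem.ncbi.nlm.nih.gov/rest/pug/compound/name/"
        f"{urllib.parse.quote(compound_name)}/property/MolecularWeight/JSON"
    )
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return float(data["PropertyTable"]["Properties"][0]["MolecularWeight"])


print(molecular_weight("caffeine"))  # roughly 194.19
```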
It's mainly #1. For 20 years now we've been hearing non-stop about how computer science is this magical major where anyone can sleepwalk out of college into a $150k job. Parents have been pushing their kids into it whether they are interested or not. Colleges have been taking advantage by pushing sub-par programs and boosting graduation rates. The end result is a large number of CS graduates who can't write a for loop in an interview (and will then loudly complain about how the interview process is unfair).
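For concreteness, the level of question at issue is roughly this (a hypothetical warm-up screener, not anyone's actual interview question):

```python
# Hypothetical interview warm-up: "sum the even numbers from 1 to n."
def sum_of_evens(n: int) -> int:
    total = 0
    for i in range(1, n + 1):
        if i % 2 == 0:
            total += i
    return total


assert sum_of_evens(10) == 30  # 2 + 4 + 6 + 8 + 10
```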
I just spoke with a chem prof who said that a lot of PhD students sign up not because they want to do science but because of the salary bump the degree provides in industry.
I guess that is a natural dynamic in our economic/belief system, in which all central planning must be inherently bad, so we always pay the on-demand price instead of the bulk price, and every mis-timing mistake costs a lifetime of being wrong afterwards…
> 1. Overproduction. Even liberal arts colleges have 15-20% of students majoring in computer science. “Learn to code” ceases to be good advice if too many people do.
"Learn to code" was the scam to address the so-called "skills shortage" BS in programming. Even worse, the skills that was pushed were also the most automatable: HTML, CSS and especially Javascript just to get $250k roles which was the most unsustainable ZIRP era to happen.
Now you won't see the influencers screaming about web developer roles, given the current massive flush of those who joined for the $$$ just to rearrange a <div> or add accessibility styling for six figures.
‘Skills shortage’ is similar to complaining about STEM shortages. It’s mostly BS.
The complaint isn’t about n people not being available; it’s about n people not being available at some low price x, or under some terrible working conditions z.
No matter how cheap or how widely available, some folks will still complain because for some folks, even if they had to pay $0, it still would be ‘too much’ if people also demanded human rights.
It’s similar to the ‘where have all the good men gone’, or ‘why don’t people want to work anymore?’, etc. complaints.
STEM is fairly meaningless in an employment context because biology/chemistry/math undergrads are generally in a different category than at least some engineering grads. And it's actually reasonable to think that those engineering grad salary expectations should be roughly in the ballpark of other professionals. They certainly used to be.
Do you have a link to the post (or posts) from Michael O. Church? I have a vague recollection of the idea but I would like to reread it with what I know today!
I actually think some of big tech cough Apple cough is a decent short right now. I wanted to do it back in December but it's hard to bring yourself to short the largest companies like that.
Tesla, both the company and the stock, is pretty complicated. I certainly wouldn't short it right now.
The problem with many of these tech companies is that they've been so successful at abusing their users that they've quit putting energy into developing their products. HP and Sonos are two good recent examples of how this ends.
Tesla doesn't seem to be doing that right now. The big thing you'd be betting on (long or short) is how successful the robotaxi and Optimus will be. I'm not optimistic about either of those (robotaxi seems like it should be practical; it's more about the particular execution), but I also wouldn't be willing to bet against them.
> “Learn to code” ceases to be good advice if too many people do.
Completely disagree. No matter what job you end up with, you will almost certainly be able to do it a bit better if you know how to code. Knowing how to code is basically always a plus when applying for a job. However, "just learn to code a little bit, and nothing else" is probably bad advice.
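The "do it a bit better" payoff is often just this small. A sketch, with a made-up file name and column name, of replacing a by-hand tally with a few lines:

```python
# Hypothetical example of "a little code helps in almost any job":
# totaling a column from a CSV export instead of adding it up by hand.
import csv


def column_total(path: str, column: str) -> float:
    with open(path, newline="") as f:
        return sum(float(row[column]) for row in csv.DictReader(f))


print(column_total("monthly_report.csv", "hours"))  # file/column are illustrative
```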
Everyone has finite amounts of ‘shits’ to give (albeit some activities multiply instead of subtract on that front!). If they spend it on coding instead of something else, hopefully it was worth it eh?
The open question is who’s worse: one major tyrant three thousand miles away or three thousand minor tyrants a mile away. If we consent to live under capitalism, then we’re destined to live in a world of lies. We always have; it’s all we’ve ever known.
What we don’t know is whether we’ll be worse or better off when the technology of forgery is available to random broke assholes as easily as it is to governments and companies. More slop seems bad, but if our immunity against bullshit improves, people might redevelop critical thinking skills and capitalism could end.