An underrated aspect of Matlab is its call-by-value semantics: function arguments are effectively copied by default, so a function can't silently modify the caller's data. Python+NumPy passes arrays by reference, so mutations to array arguments are visible to the caller. This creates a whole class of bugs that non-programmers find hard to understand.
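A minimal sketch of the difference in Python (zero_first is just a made-up name for illustration):

    import numpy as np

    def zero_first(a):
        # 'a' is a reference to the caller's array, so this in-place
        # assignment is visible outside the function.
        a[0] = 0.0
        return a

    x = np.array([1.0, 2.0, 3.0])
    zero_first(x)
    print(x)  # [0. 2. 3.]  (the caller's array was mutated)

    # Matlab's copy-on-write semantics mean the equivalent function
    # leaves the caller's variable untouched unless you assign the
    # return value back:
    #   function a = zero_first(a)
    #       a(1) = 0;
    #   end
    #   x = [1 2 3]; zero_first(x);  % x is still [1 2 3]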
Clarity in writing comes mostly from the logical structure of ideas presented. Writing can have grammar/style errors but still be clear. If the structure is bad after translation, then it was bad before translation too.
The (non-tech) industry I am in generates an enormous amount of text that, it's fairly safe to say, nobody reads past the executive summary.
My workmates love it. Amongst the tech community, I see a divide very similar to the crypto one - everybody who has a stake in it succeeding is very optimistic. Everybody working in other areas seems dubious at best.
Generating huge amounts of stylistically awful ultra-verbose nonsense isn’t actually useful in any industry other than the ultra-low-end journalism/blogspam space. Either no-one reads it anyway, in which case it’s more or less a wash, or someone actually has to read it, in which case it’s a productivity drain.
It's a productivity boost for the people who have to generate the text that nobody reads. At an organisational level it's a wash, caused by requiring that text in the first place.
I don't think any tech has ever been pushed so hard in front of people so fast. People hated Facebook for ages, but you could just not use it, while these AI features are shoved in front of your face constantly: Google sticks them at the top of every search, Reddit puts generated slop on every page, and every SaaS has rebranded itself as an AI tool with constant popups telling you to use the new AI feature.
Most other crappy tech you could just choose to not use.
Not that I can think of, but the reason I think it happened for AI so quickly:
AI enshittified way, WAY too quickly.
The thing is that all these tech companies are really just innovating new ways to scam consumers into adopting something that's worse for them. They subsidize it heavily at first and, eventually, have to start bleeding consumers dry.
Uber is now more expensive than taxis, AirBNB is more expensive than hotels, placing an order online is more inconvenient than calling, and on and on. But it took decades for this to transpire. For a long time, these new things were actually better.
But AI was pushed so hard, so severely, that it became enshittified way too quickly. And consumers are already on guard after seeing tech A-Z slowly make their lives worse.
I don't think that this is true. I'm a high school student, and I've overheard quite a few conversations between our admins about whether they prefer Microsoft Copilot or ChatGPT.
That may not be because they like it but because they're required to use it. The teachers in my area, at least, are mandated to use AI themselves and integrate it into their curriculum.
What do you mean by tech and compsci bubble? Many of the software engineers I interact with don't seem all that optimistic or positive about the AI tools. There are bubbles on either side, I think.
But I'm one of those who hasn't had great experiences using them for anything beyond toy projects, so maybe my bubble falls more on the AI-skeptic side.
I think it's the money-generating part of "tech": they see this "glorified spellcheck", assume endless possibilities that everyone will want to buy, and are busy placing "buy AI now" buttons everywhere (well, more like "try it, try it, you'll be amazed, and if you give us money for the premium version you'll be even more amazed!" buttons).
The definition of big O notation is pure math; there is nothing in it specific to the analysis of algorithms.
For example: "the function x^2 is O(x^3)" is a valid sentence in big-O notation, and is true.
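For reference, the formal definition (stated here for x \to \infty; the same idea applies as x \to 0 or n \to \infty):

    f(x) = O(g(x)) \iff \exists\, C > 0,\ x_0 \text{ such that } |f(x)| \le C\,|g(x)| \text{ for all } x \ge x_0

The example above then checks directly: x^2 \le 1 \cdot x^3 for all x \ge 1, so x^2 = O(x^3) with C = 1 and x_0 = 1. Big O is only an upper bound, which is why the statement is true even though x^3 grows strictly faster.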
Big O is commonly used in other places besides analysis of algorithms, such as when truncating the higher-order terms in a Taylor series approximation.
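For instance, the standard second-order expansion of the exponential near zero:

    e^x = 1 + x + \frac{x^2}{2} + O(x^3) \quad \text{as } x \to 0

Here O(x^3) just says the remainder is bounded by C\,|x|^3 for all sufficiently small |x|; no algorithm anywhere in sight.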
Another example is in statistics and learning theory, where we see claims like "if we fit the model with N samples from the population, then the expected error is O(1/sqrt(N))." Notice the word expected - this is an average-case, not worst-case, analysis.
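A standard instance of that kind of claim, using the sample mean as the estimator (an illustration, not necessarily the exact result the parent has in mind): if X_1, \dots, X_N are i.i.d. with mean \mu and variance \sigma^2, then the sample mean \bar{X}_N = \frac{1}{N}\sum_{i=1}^N X_i satisfies

    E\big[\,|\bar{X}_N - \mu|\,\big] \le \sqrt{E\big[(\bar{X}_N - \mu)^2\big]} = \frac{\sigma}{\sqrt{N}} = O(1/\sqrt{N})

The expectation is over the random sample, so it says nothing about the worst possible draw; that's exactly the average-case flavour being pointed at.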
tangentially, does anyone know a good way to limit web searches to the "low-background" era that integrates with address bar, OS right-click menus, etc? I often add a pre-2022 filter on searches manually in reaction to LLM junk results, but I'd prefer to have it on every search by default.
IDK, I think Apple creating its own laptop/desktop-class CPU was a pretty bold move with a huge payoff. It's less sexy than introducing an entirely new category of product, but it's not exactly risk-averse either.