Good old fashion ML is dying
20 points by gdiamos on Dec 2, 2023 | 16 comments
LLMs popularized zero-shot learning, or “prompt engineering,” which is drastically easier to use and more effective than labeling data.

You can also retrofit “prompt engineering” onto good old fashion ML like text classifiers. I wrote a library to do just that here: https://github.com/lamini-ai/llm-classifier
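
The core trick is replacing a trained classifier with a prompt. A minimal sketch of the idea in Python, assuming a generic `llm` completion callable and a hypothetical label set (this is an illustration, not the lamini-ai/llm-classifier API):

    from typing import Callable

    LABELS = ["positive", "negative", "neutral"]  # hypothetical label set

    def classify(text: str, llm: Callable[[str], str], labels=LABELS) -> str:
        # Zero-shot: the "training data" is just instructions in the prompt.
        prompt = (
            "Classify the following text into exactly one of these labels: "
            + ", ".join(labels) + ".\n"
            "Text: " + text + "\n"
            "Label:"
        )
        answer = llm(prompt).strip().lower()
        # Guard against the model answering off-script.
        return answer if answer in labels else "unknown"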

IMO, it’s only a matter of time before this takes over all of what used to be called “deep learning”. Supervised learning, which required Herculean efforts to label big datasets, was important to prove that deep learning worked, but we have been engineering it to be easier and more effective ever since, and there is no going back.



Before I clicked I did not think "good old fashion ML" would refer to deep learning. :-)


I must be stuck in 2015 still. I thought it would be about the death of SVMs, or ensemble models like XGBoost or something. I'm still catching up on RNNs, GANs and god knows what else lol.


Hell, plain old linear regression is still going strong. The implication in the title of OP (that classical ML techniques are dying) is incorrect, though the body (that unsupervised learning is taking over from supervised learning) is spot-on.


Yes, linear regression is often quite good, especially if you have a good sense of what transformations you already need to do to the data to make linear regression work well. I think that many people unfortunately think that advanced techniques will somehow make up for deficient data (either poor quality data or too little data or both). Or worse, they think that advanced techniques will reduce the amount of thinking they have to do. Advanced techniques usually don't do either. Garbage in, garbage out as the saying goes. Simple techniques can often get good answers in much less time.
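
To make that concrete, here's a small sketch on synthetic data (values assumed for illustration) where the response is linear in log(x): a fit on raw x would be poor, but one transform lets plain linear regression recover the true parameters.

    import numpy as np

    # Hypothetical data: y is linear in log(x), plus noise.
    rng = np.random.default_rng(0)
    x = np.linspace(1, 100, 200)
    y = 3.0 * np.log(x) + 1.5 + rng.normal(0.0, 0.2, x.size)

    # Plain linear regression on the transformed feature log(x).
    slope, intercept = np.polyfit(np.log(x), y, 1)
    print(slope, intercept)  # roughly 3.0 and 1.5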


Any books that explain linear regression well?


ISLR (aka "An Introduction to Statistical Learning: With Applications in R") is a great book on the principles of machine learning (including regression). And you get practical experience using R to actually implement such applications.


How beginner friendly do you think the book is? Asking as someone who's completely new to R (and data-related fields in general) who thinks the book might be interesting.


If you know how to program in other languages, the concepts and practical examples will make perfect sense (even if the math occasionally doesn't).


OP, the term is “old-fashioned” not “old fashion”.


OP is obviously concerned about poodle skirts


> more effective than labeling data.

This is a BIG (and fashionable) claim, with very little by way of rigorous evidence. It's likely less work, but it's not at all clear whether this can match the necessary accuracy/robustness.


Good old-fashioned ML is a big collection of techniques applied to a wide range of problems in many fields.

You're only talking about supervised learning in the field of NLP/text processing, right?


https://news.ycombinator.com/item?id=38499723

A thread below has the contrary opinion :P


Is there a length limitation to the prompts you can send to it? Or the actual data you want to classify?


Confidence is all you need.


Long live meta language



