Most ML models have been obliterated by LLMs, but I do hope the vast diversity of ML techniques remains relevant. There's so much mathematical ingenuity in all the pre-LSTM approaches: SVMs, Gaussian models, random forests, network algorithms, ...
Do you think the dominance of LLMs might push some of these older methods to evolve or find new niches?
There's so much untapped potential in combining the strengths of both worlds.
Couldn't you kinda argue that building LLMs requires some ML-type work in the first place? How are LLMs created? I'm pretty sure it's some sort of voodoo.
LLMs are just for text mangling; they're a lossy compression algorithm plus a recomposition mechanism.
they can't replace fuzzy logic, for a start
Data scientist here. I can assure you that LLMs are not replacing many existing ML applications, mostly new ones. Tree-based models still dominate with tabular data structures (I worked in credit fraud and we just used boosted tree algos). If LLMs are used in these applications to generate insights, it's usually hallucinations. If they're useful at all here, it's as orchestrators that train purpose-built classifiers, regressors, or networks.
After 3 years of R&D with all types of network structures and alternative algos, nothing beat a boosted tree on our client's credit card fraud data sets.
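For anyone curious what that looks like in practice, here's a minimal sketch of a boosted-tree classifier on fraud-style tabular data, using scikit-learn's HistGradientBoostingClassifier (assumes scikit-learn >= 1.0). The features and labels are synthetic stand-ins, not a real fraud schema:

```python
# Minimal sketch: boosted trees on tabular fraud-style data.
# All features/labels here are synthetic stand-ins, not a real fraud schema.
import numpy as np
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import average_precision_score

rng = np.random.default_rng(0)
n = 20_000
X = np.column_stack([
    rng.lognormal(3.0, 1.0, n),   # transaction amount
    rng.integers(0, 24, n),       # hour of day
    rng.random(n),                # merchant risk score
])
# Rare positive class, as in fraud: roughly 1-2% of rows.
y = (rng.random(n) < 0.01 * (1 + X[:, 2])).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Gradient-boosted trees handle mixed-scale tabular features without scaling.
clf = HistGradientBoostingClassifier(max_iter=300, learning_rate=0.1)
clf.fit(X_tr, y_tr)

# Average precision is more informative than accuracy at ~1% prevalence.
scores = clf.predict_proba(X_te)[:, 1]
print("test AP:", average_precision_score(y_te, scores))
```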
random forest maxi here
Agreed. Random forest works out of the box with no param tuning 99.99% of the time, with very acceptable results.
The extra 1-2% performance you get from lgbm, xgboost, or catboost is usually not worth it, and it makes your model more fragile and prone to drift quickly.
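To illustrate the out-of-the-box point, a minimal sketch with scikit-learn's RandomForestClassifier at stock defaults, no tuning at all (the dataset is just a convenient built-in, not anything from the comments above):

```python
# Minimal sketch: random forest with stock defaults, no hyperparameter tuning.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# No scaling, no grid search: default n_estimators=100, unrestricted depth.
forest = RandomForestClassifier(random_state=0)
print("5-fold accuracy:", cross_val_score(forest, X, y, cv=5).mean())
```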
Few.
We found isolation using nearest-neighbor ensembles (iNNE) works best for our anomaly detection application. Lots of use cases still for ML models outside of LLMs.
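A sketch of what using iNNE might look like, assuming the pyod package's INNE implementation (pyod.models.inne); the import path, parameters, and data here are illustrative assumptions, not the commenter's actual setup:

```python
# Sketch of iNNE anomaly detection, assuming the pyod package's INNE
# implementation (pip install pyod). API details are an assumption here.
import numpy as np
from pyod.models.inne import INNE

rng = np.random.default_rng(0)
# Mostly-normal 2-D data with a handful of far-out anomalies mixed in.
normal = rng.normal(0.0, 1.0, size=(500, 2))
outliers = rng.uniform(6.0, 8.0, size=(10, 2))
X = np.vstack([normal, outliers])

# Ensemble of hypersphere-based isolation models built from small subsamples.
detector = INNE(n_estimators=200, contamination=0.02)
detector.fit(X)

scores = detector.decision_function(X)   # higher = more anomalous
labels = detector.predict(X)             # 1 = flagged as outlier
print("flagged outliers:", labels.sum())
```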
ML still outperforms LLMs at some tasks. Risk prediction, for example.
Can't replace singular value decomposition with a language model, but it could probably enhance the way we use language models.
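As a concrete example of the kind of linear-algebra workhorse an LLM doesn't replace, here's a minimal truncated-SVD sketch in plain numpy; the embedding matrix is a hypothetical stand-in:

```python
# Minimal sketch: truncated SVD for a rank-k approximation of a matrix,
# e.g. compressing a (hypothetical) embedding table. Pure numpy.
import numpy as np

rng = np.random.default_rng(0)
E = rng.normal(size=(1000, 256))  # stand-in for an embedding matrix

U, s, Vt = np.linalg.svd(E, full_matrices=False)

k = 32
# Best rank-k approximation in Frobenius norm (Eckart-Young theorem).
E_k = U[:, :k] * s[:k] @ Vt[:k, :]

err = np.linalg.norm(E - E_k) / np.linalg.norm(E)
print(f"relative Frobenius error at rank {k}: {err:.3f}")
```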
Boosted decision treeeeeeeees