
251 points slyall | 4 comments
1. hyperific No.42067859
The article mentions Support Vector Machines being the hot topic in 2008. Is anyone still using/researching these?

I often wonder how many useful technologies could exist if trends had gone a different way. Where would we be if neural nets hadn't caught on, and SVMs and expert systems had?

replies(3): >>42068207 #>>42071734 #>>42072615 #
2. spencerchubb No.42068207
in insurance we use older statistical methods that are easily interpretable, because we are required to have rates approved by departments of insurance
3. bob1029 No.42071734
I've been looking at SVMs for use with a binary classification experiment. Training and operating these models is quite cheap. The tooling is ubiquitous and will run on a toaster. A binary decision made well can be extremely powerful. Multiple binary decisions underlie... gestures broadly.

Obvious contextual/attention caveats aside, a few thousand binary classifiers operating bitwise over the training & inference sets would get you enough bandwidth for a half-assed language model. 2^N becomes a very big place very quickly.
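The cheap-binary-classifier idea above can be sketched in a few lines. This is a minimal linear SVM trained with a Pegasos-style subgradient method on made-up, linearly separable toy data; the hyperparameters and dataset are assumptions for illustration, not bob1029's actual setup.

```python
# Sketch: a linear SVM (hinge loss + L2 regularization) trained with the
# Pegasos subgradient method. Toy data and hyperparameters are illustrative.
import random

def train_svm(data, labels, lam=0.01, epochs=200):
    """Return weights w and bias b for a linear SVM."""
    dim = len(data[0])
    w = [0.0] * dim
    b = 0.0
    t = 0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            t += 1
            eta = 1.0 / (lam * t)  # decaying step size
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            # Shrink weights (regularization), then push on margin violations.
            w = [(1 - eta * lam) * wi for wi in w]
            if margin < 1:
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                b += eta * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Toy data: label is the sign of the first coordinate.
random.seed(0)
data = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(200)]
labels = [1 if x[0] > 0 else -1 for x in data]
w, b = train_svm(data, labels)
acc = sum(predict(w, b, x) == y for x, y in zip(data, labels)) / len(data)
print(f"training accuracy: {acc:.2f}")
```

The whole thing is plain Python with no dependencies, which is roughly the "runs on a toaster" point: each classifier is just a dot product at inference time.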

4. Legend2440 No.42072615
Expert systems did catch on and do see widespread use - they're just not called AI anymore. It's 'business logic' or 'rules engine' now.

The issue with SVMs is that they get intractably expensive for large datasets. The cost of training a neural network scales roughly linearly with dataset size, while kernel SVM training scales quadratically: the kernel trick requires pairwise kernel evaluations over the training set. You could never train an LLM-sized SVM.