
625 points by lukebennett | 5 comments
nerdypirate No.42139075
"We will have better and better models," wrote OpenAI CEO Sam Altman in a recent Reddit AMA. "But I think the thing that will feel like the next giant breakthrough will be agents."

Is this certain? Are agents the right path to AGI?

replies(7): >>42139134 >>42139151 >>42139155 >>42139574 >>42139637 >>42139896 >>42144173
xanderlewis No.42139151
If by agents you mean systems composed of individual (perhaps LLM-powered) agents interacting with each other, probably not. I get the vague impression that researchers haven’t yet found any advantage to such systems: anything you can do with a group of AI agents can be emulated with a single one. It’s like chaining up perceptrons hoping to get more expressive power for free.
replies(2): >>42139320 >>42139568
1. falcor84 No.42139568
> It’s like chaining up perceptrons hoping to get more expressive power for free.

Isn't that literally what made deep learning succeed? It's not quite "free", but as I understand it, the big breakthrough of AlexNet (and much of what came after) was that running a larger CNN on a larger dataset made the model dramatically more effective without any major change in architecture.

replies(1): >>42139912
2. david2ndaccount No.42139912
Without a non-linear activation function, chaining perceptrons together is equivalent to one large perceptron.
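A minimal numpy sketch of that collapse (illustrative only; the shapes and weights here are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)

    # Two stacked linear ("perceptron") layers with no activation in between:
    # y = W2 @ (W1 @ x + b1) + b2
    W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
    W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)
    x = rng.normal(size=3)
    two_layers = W2 @ (W1 @ x + b1) + b2

    # The same function rewritten as a single linear layer:
    # y = (W2 @ W1) @ x + (W2 @ b1 + b2)
    one_layer = (W2 @ W1) @ x + (W2 @ b1 + b2)

    print(np.allclose(two_layers, one_layer))  # True: stacking bought nothing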
replies(1): >>42141849
3. xanderlewis No.42141849
Yep. falcor84: you’re thinking of the so-called ‘multilayer perceptron’, which is basically an archaic name for a (densely connected?) neural network. I was referring to traditional perceptrons.
replies(1): >>42142074
4. falcor84 No.42142074
While ReLU is relatively new, AI researchers have recognized the need for nonlinear activation functions since the late 1960s and have been building multilayer perceptrons with them ever since, so I had assumed that's what you meant.
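The classic late-1960s example of why the nonlinearity matters is XOR: it isn't linearly separable, so no single perceptron computes it, but two layers with a nonlinearity in between do. A small sketch of that textbook construction (hand-picked weights, using ReLU as the nonlinearity for brevity):

    import numpy as np

    def relu(z):
        return np.maximum(z, 0.0)

    # Hand-crafted two-layer network computing XOR:
    # XOR(x1, x2) = relu(x1 + x2) - 2 * relu(x1 + x2 - 1)
    W1 = np.array([[1.0, 1.0],
                   [1.0, 1.0]])
    b1 = np.array([0.0, -1.0])
    w2 = np.array([1.0, -2.0])

    for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
        h = relu(W1 @ np.array(x) + b1)  # without relu, this collapses as above
        print(x, int(w2 @ h))            # prints 0, 1, 1, 0: XOR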
replies(1): >>42142428
5. xanderlewis No.42142428
It was a deliberately historical example.