
Getting 50% (SoTA) on ARC-AGI with GPT-4o

(redwoodresearch.substack.com)
394 points | tomduncalf | 1 comment | source
atleastoptimal ◴[] No.40714152[source]
I'll say what a lot of people seem to be denying. GPT-4 is an AGI, just a very bad one. Even GPT-1 was an AGI. There isn't a hard boundary between non-AGI and AGI. A lot of people wish there was, so they imagine absolutes regarding LLMs like "they cannot create anything new" or something like that. Just think: we consider humans a general intelligence, but obviously wouldn't consider an embryo or infant a general intelligence. So at what point does a human go from not generally intelligent to generally intelligent? And I don't mean an age or brain size, I mean a suite of testable abilities.

Intelligence is an ability that is naturally gradual and emerges over many domains. It is a collection of tools via which general abstractive principles can be applied, not a singular, universally applicable ability to think in abstractions. GPT-4, compared to a human, is a very, very small brain trained for the single purpose of textual thinking, with some image capabilities. Claiming that ARC is the absolute marker of general intelligence fails to account for the big picture of what intelligence is.

replies(7): >>40714189 #>>40714191 #>>40714565 #>>40715248 #>>40715346 #>>40715384 #>>40716518 #
surfingdino ◴[] No.40714565[source]
> GPT-4 is an AGI, just a very bad one.

Then stop selling it as a tool to replace humans. A fast-moving car breaking through a barrier and flying off a cliff could be called "an airborne means of transportation, just a very bad one," yet nobody is suggesting it should replace school buses if only we could add longer wings to it. What the LLM community refuses to see is that there is a limit to the patience and the financing the rest of the world will grant you before you're told, "it doesn't work, mate."

> So at what point does a human go from not generally intelligent to generally intelligent?

Developmental psychology would be a good place to start looking for answers to this question. Also, setting aside the scientific approach and going with common sense, we do not allow young humans to operate complex machinery, decide who is allowed to become a doctor, or go to jail. Intelligence is not equally distributed across the human population, and some of us never have much of it, yet we function and have a role in society. Our behaviour, choices, preferences, and opinions are not based on our intelligence alone, but often on our past experiences and circumstances. It is also not the sole quality we use to compare ourselves against each other. A not-very-intelligent person is capable of making the right choices (an otherwise obedient soldier refusing to press the button and blow up a building full of children); similarly, a highly intelligent person can become a hard-to-find serial criminal (a gynecologist impregnating his patients).

What intelligent and creative people hold against LLMs is not that they replace them, but that they replace them with a shit version of them, relegating thousands of years of human progress and creativity to the dustbin of models and layers of tweaks to the output that still produce unreliable crap. I think the person who wrote this sign summed it up best: https://x.com/gvanrossum/status/1802378022361911711

replies(3): >>40714760 #>>40714937 #>>40718593 #
bongodongobob ◴[] No.40714760[source]
In response to the sign: then learn to code or make art that is better than AI art.

It's an existential complaint. "Why won't the nerds make something for meeeee." Do it yourself. Make that robot.

Sucks to think that you're not that special. Most art isn't. Most music isn't. Any honest artist will agree. Most professional artists are graphic designers, not brilliant once-in-a-generation visionaries. It's the new excuse for starving artists. AI or no, they'd still be unsuccessful. That's the way it's always been.

replies(2): >>40714852 #>>40714951 #
earthnail ◴[] No.40714852[source]
While that is 100% true, the real problem is that the risk of finding out whether you can make special art has significantly increased. Previously, if you didn't make it as an artist, you could still earn money with other art-related tasks: in graphics, many became illustrators; in music, people made music for ads.

That plan B is now going away, and a music career will be much more like a sports career: either you make it in football, or you need to find another career where your football skills won’t be very useful.

That is obviously scary for many.

replies(1): >>40714927 #
surfingdino ◴[] No.40714927[source]
Artists who make it usually have a legend, a story to tell or to be told by their friends, associates, agents, publishers, gallerists, etc. That story has a human dimension that touches the rest of us, and we somehow connect to it. Van Gogh cut off his ear; we still keep talking about it and wondering why. There is nothing AI can tell us about itself or its "art". The artistic struggle with AI is not about expressing your vision on a canvas in a way that makes others feel what you want them to feel, but about forcing it to generate something it is incapable of generating or programmed not to generate. We got to the point where we are given crayons programmed not to draw the things others do not want them to draw, or to draw an HR-approved version of what the artist wants to draw. The future is now and it's shit.
replies(1): >>40715859 #
latexr ◴[] No.40715859{3}[source]
Using van Gogh as an example of “artists who make it” is insane.

Which I guess is appropriate, because he was literally crazy. He suffered from psychotic episodes and delusions and died by suicide, depressed and in poverty.

That’s the opposite of “making it”. It’s zero consolation that people like his work now; he never even knew.

replies(1): >>40716017 #
surfingdino ◴[] No.40716017{4}[source]
> Using van Gogh as an example of “artists who make it” is insane.

So is building a tool that will only generate "approved" art. We need to be able to express our ideas, feelings, and perception of the world in ways that do not fit corporate standards of text, audio, or visual communication. It's part of being human.

replies(2): >>40716142 #>>40716922 #
latexr ◴[] No.40716142{5}[source]
> So is building a tool that will only generate "approved" art.

And so is eating ice cream with your forehead. Are we just doing non sequiturs now? I didn’t defend image generation tools in the slightest.

> We need to be able to express our idea, feelings, our perception of the world in ways that do not fit corporate standards of text, audio, or visual communication.

I agree. My point started and ended with “van Gogh is an awful example when talking about artists who ‘made it’”. That’s it. There is nothing in there to be extrapolated to AI or any other subject.