
174 points by Philpax | 1 comment
EliRivers No.43719892
Would we even recognise it if it arrived? We'd recognise human-level intelligence, probably, but that's specialised. What would general intelligence even look like?
replies(8): >>43719970 #>>43719984 #>>43720087 #>>43720130 #>>43720153 #>>43720195 #>>43720300 #>>43725034 #
dingnuts No.43719984
You'd be able to give them a novel problem and have them generalize from known concepts to solve it. Here's an example:

1. Write a specification for a language in natural language.

2. Write an example program in that language.

Can you feed the spec from step 1 into a model and have it produce a compiler that handles the program from step 2 as reliably as a classically built one?

I think that's a low bar, and one that hasn't been approached yet. Until then, I don't see evidence that language models can reason.
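
For concreteness, a harness for this test might look like the sketch below. Here ask_model is a stand-in for whatever LLM call you'd use, and the assumption that the model emits a single self-contained Python file is mine; this illustrates the shape of the test, not a real benchmark.

    import subprocess
    import tempfile
    from pathlib import Path

    def build_compiler(ask_model, spec: str) -> Path:
        # Step 1: feed the natural-language spec to the model and save
        # whatever compiler source it emits (assumed: one Python file).
        src = ask_model(f"Write a compiler for this language:\n{spec}")
        out = Path(tempfile.mkdtemp()) / "compiler.py"
        out.write_text(src)
        return out

    def passes(compiler: Path, program: str, expected: str) -> bool:
        # Step 2: run the example program through the generated compiler
        # and check it behaves the way a classically built one would.
        result = subprocess.run(
            ["python", str(compiler)],
            input=program, capture_output=True, text=True, timeout=60,
        )
        return result.returncode == 0 and result.stdout == expected

"As reliably as a classically built one" would mean passes holds across a large suite of programs, not just a single example.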

replies(2): >>43720029 #>>43720149 #
logicchains No.43720149
You could ask Gemini 2.5 to do that today, and it would be well within its capabilities, as long as you also let it write and run unit tests, just as a human developer would.
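
The write-and-run-tests loop described here could be sketched roughly as follows; again, ask_model is a placeholder rather than the real Gemini API, and the prompts and retry budget are made up for illustration:

    import subprocess
    import tempfile
    from pathlib import Path

    def iterate_until_green(ask_model, spec: str, max_rounds: int = 5):
        work = Path(tempfile.mkdtemp())
        prompt = f"Write a compiler (compiler.py) for this language:\n{spec}"
        for _ in range(max_rounds):
            (work / "compiler.py").write_text(ask_model(prompt))
            (work / "test_compiler.py").write_text(
                ask_model("Write pytest tests for the compiler you just wrote."))
            run = subprocess.run(["pytest", str(work)],
                                 capture_output=True, text=True)
            if run.returncode == 0:
                return work / "compiler.py"  # all self-written tests pass
            # Feed the failures back, as a human developer would.
            prompt = f"These tests failed; fix the compiler:\n{run.stdout}"
        return None  # didn't converge within the retry budget

Of course, this loop only shows the compiler satisfies tests the model itself wrote; checking those tests actually cover the spec is the hard part of the original challenge.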