
174 points by Philpax | 1 comment
EliRivers ◴[] No.43719892[source]
Would we even recognise it if it arrived? We'd recognise human-level intelligence, probably, but that's specialised. What would general intelligence even look like?
replies(8): >>43719970 #>>43719984 #>>43720087 #>>43720130 #>>43720153 #>>43720195 #>>43720300 #>>43725034 #
dingnuts ◴[] No.43719984[source]
You'd be able to give them a novel problem and have them generalize from known concepts to solve it. Here's an example:

1. Write a specification for a language in natural language.

2. Write an example program in that language.

Can you feed (1) into a model and have it produce a compiler that handles (2) as reliably as a classically built one?

I think that's a low bar that hasn't been approached yet. Until then, I don't see evidence of language models' ability to reason.
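As a rough illustration, here is a minimal sketch (in Python) of what a harness for that test could look like. Everything here is an assumption about how to operationalize the idea: `generate` is a stand-in for whatever model call you'd use (not a real library API), and "works as reliably as a classically built one" is approximated by checking the generated compiler's output against expected output on the example programs.

    import subprocess
    import sys
    import tempfile
    from pathlib import Path
    from typing import Callable

    def spec_to_compiler_test(
        spec: str,                          # step 1: natural-language spec of the language
        examples: list[tuple[str, str]],    # step 2: (source program, expected stdout) pairs
        generate: Callable[[str], str],     # hypothetical model call: prompt -> Python source
    ) -> float:
        """Ask the model for a compiler/interpreter for the spec, then score it
        on the example programs. Returns the fraction that produce the expected output."""
        prompt = (
            "Here is a specification for a programming language:\n\n"
            + spec
            + "\n\nWrite a complete Python program that takes the path to a source "
              "file as its only command-line argument, runs that program according "
              "to the spec, and prints its output to stdout. Reply with code only."
        )
        compiler_src = generate(prompt)

        with tempfile.TemporaryDirectory() as tmp:
            compiler_path = Path(tmp) / "compiler.py"
            compiler_path.write_text(compiler_src)

            passed = 0
            for i, (source, expected) in enumerate(examples):
                prog_path = Path(tmp) / f"example_{i}.src"
                prog_path.write_text(source)
                try:
                    result = subprocess.run(
                        [sys.executable, str(compiler_path), str(prog_path)],
                        capture_output=True, text=True, timeout=30,
                    )
                except subprocess.TimeoutExpired:
                    continue  # a hang counts as a failure
                if result.returncode == 0 and result.stdout.strip() == expected.strip():
                    passed += 1

        return passed / len(examples) if examples else 0.0

A real benchmark would need many held-out programs and edge cases, not just the examples shown here; the sketch only captures the shape of the test, where passing means the generated compiler behaves like a classically built one on programs it wasn't shown.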

replies(2): >>43720029 #>>43720149 #
1. EliRivers ◴[] No.43720029[source]
I'd accept that as a human kind of intelligence, but I'm really hoping AGI would be a bit more general: clever human thinking would be a subset of what it could do.