
625 points by lukebennett | 1 comment
aaroninsf No.42139331
It's easy to be snarky about ill-informed and hyperbolic takes, but it's also pretty clear that large multi-modal models trained with the data we already have are eventually going to give us AGI.

IMO this will require not just much more expansive multi-modal training but also novel architectures, specifically recurrent approaches, plus a well-known set of capabilities most systems currently lack, e.g. the integration of short-term memory (the context window, if you like) into long-term memory, episodic or otherwise.

But these are, as we say, mere matters of engineering.

replies(2): >>42139463 #>>42139929 #
tartoran No.42139463
> pretty clear

Pretty clear?

replies(1): >>42139864 #
falcor84 No.42139864
Not the parent, but on prediction markets such as Metaculus[0] and Manifold[1], the median prediction is AGI within 5 years.

[0] https://www.metaculus.com/questions/5121/date-of-artificial-...

[1] https://manifold.markets/ai

replies(2): >>42140155 #>>42140214 #
dbbk No.42140214
What is this supposed to be evidence of? People believing hype?