
265 points by ctoth | 1 comment | source
plaidfuji ◴[] No.43748358[source]
Gemini 2.5 Pro is certainly a tipping point for me. Previous LLMs have been very impressive, especially on coding tasks (unsurprising, given the preponderance of publicly available data for such tasks). But outside of a coding assistant, LLMs until now have felt like an extra helpful and less garbage-filled Google search.

I just used 2.5 Pro to help write a large research proposal (with significant funding on the line). Without going into detail, it felt to me like the only reason it couldn’t write the entire thing itself is because I didn’t ask it to. And by “ask it”, I mean: enter into the laughably small chat box the entire grant solicitation + instructions, a paragraph of general direction for what I want to explore, and a bunch of unstructured artifacts from prior work, and turn it loose. I just wasn’t audacious enough to try that from the start.

But as the deadline approached, I got more and more unconstrained in how far back I would step and let it take the reins - doing essentially what’s described above but on isolated sections. It would do pretty ridiculously complex stuff, like generate project plans and timelines, cross-reference them correctly with other sections of text, etc. I can safely say it was a 10x force multiplier, and that’s being conservative.

For scientific questions (ones that should have publicly available data, not ones relying on internal data), I have started going to 2.5 Pro over senior experts on my own team. And I’m convinced at this point if I were to connect our entire research data corpus to Gemini, that balance would shift even further. Why? Because I can trust it to be objective - not inject its own political or career goals into its answers.

I’m at the point where I feel the main thing holding back “AGI” is people’s audacity to push its limits, plus maybe context windows and compute availability. I say this as someone who’s been a major skeptic up until this point.

replies(9): >>43748425 #>>43749118 #>>43749224 #>>43751750 #>>43753576 #>>43755736 #>>43756318 #>>43756466 #>>43812541 #
MoonGhost ◴[] No.43749224[source]
LLMs at this point are stateless calculators without personal experience, life goals, obligations, etc. Until recently, people expected AI to arrive as a character like the Terminator or HAL. Now we have intelligence separate from 'soul'. Can a calculator be AGI? It can be Artificial, General, and Intelligence. We may need another word for a 'creature' with some features of a living being.
replies(1): >>43750519 #
dcow ◴[] No.43750519[source]
The term AI has always bothered me for this reason. If the thing is intelligent, then there’s nothing artificial about it… it’s almost an oxymoron.

There are two subtly different definitions in use: (1) “like intelligence in useful ways, but not actually”, and (2) “actually intelligent, but not of human wetware”. I take the A in AGI to be of type (2).

LLMs are doing (1), right now. They may have the “neurological structure” required for (2), but to make a being General and Intelligent, it needs to compress its context window and persist it to storage every night as it sleeps. It needs memory and agency. It needs to be able to learn in real time and self-adjust its own weights. And if it’s doing all that, then who is to say it doesn't have a soul?

replies(1): >>43750620 #
Jensson ◴[] No.43750620[source]
> If the thing is intelligent, then there’s nothing artificial about it… it’s almost an oxymoron.

Artificial means human-made: if we made a thing that is intelligent, then it is artificial intelligence.

It is like "artificial insemination", which means a human-designed way to inseminate rather than the natural way. It is still a proper insemination; artificial doesn't mean "fake", it just means unnatural/human-made.

replies(2): >>43750727 #>>43754511 #
europeanNyan ◴[] No.43750727[source]
> Artificial means human-made: if we made a thing that is intelligent, then it is artificial intelligence.

Aren't humans themselves essentially human made?

Maybe a better definition would be non-human (or inorganic, if we want to include intelligence like that of dolphins)?

replies(3): >>43750798 #>>43752307 #>>43754375 #
butlike ◴[] No.43752307[source]
"ii" (inorganic intelligence) has a better ring to it than AI and can also be stylized as "||" which means OR.