197 points baylearn | 2 comments
empiko ◴[] No.44471933[source]
Observe what the AI companies are doing, not what they are saying. If they expected to achieve AGI soon, their behaviour would be completely different. Why bother developing chatbots or doing sales when you will be operating AGI in a few short years? Surely all resources should go towards that goal, since it is supposed to usher humanity into a new prosperous age (somehow).
replies(9): >>44471988 #>>44471991 #>>44472148 #>>44472874 #>>44473259 #>>44473640 #>>44474131 #>>44475570 #>>44476315 #
1. delusional ◴[] No.44471988[source]
Continuing in the same vein: why would they force their super valuable, highly desirable, profit-maximizing chatbots down your throat?

Observed reality is more consistent with company FOMO than with actual usefulness.

replies(1): >>44472094 #
2. Touche ◴[] No.44472094[source]
Because it's valuable training data. Like how having Google Maps on everyone's phone made their map data better.

Personally, I think AGI is ill-defined and won't arrive as a single new model release. Instead, the thing to look for is how LLMs are being used in AI research itself, and there are some advances happening there.