
625 points | lukebennett
Timber-6539 ◴[] No.42141687[source]
Direct quote from the article: "The companies are facing several challenges. It’s become increasingly difficult to find new, untapped sources of high-quality, human-made training data that can be used to build more advanced AI systems."

The irony here is astounding.

replies(2): >>42142698 #>>42145740 #
rapjr9 ◴[] No.42142698[source]
Indeed, if you're thinking about AI polluting the data and replacing humans. However, it also seems likely in the near term that training will go to the source because of this: increasingly, humans will directly train AIs, as the robotics and self-driving car systems are doing, instead of the AIs training off the indirect data people create (watching someone paint rather than scanning paintings). So in essence we'll be training our replacements to take over our tasks/jobs. Small tasks at first, but increasing in complexity over time. Someday no one may know how to drive a car anymore (or be allowed to, for safety). Later on no one may know how to write computer code (or be allowed to, for security reasons). Learning in each area mastered by AI will stop and never progress further, unless AI can truly become creative. Or perhaps (fewer and fewer) people will only work on new problems that require creativity. There are long-term risks to humanity's adaptability in this scenario. People would probably take those risks for the short-term gains.
replies(1): >>42144469 #
Timber-6539 ◴[] No.42144469[source]
You are correct to state that over-reliance on AI as a data source will probably lead to society's intellectual atrophy. One could argue we have been on this path with other things, but the whole thing looks more and more to me like eating your own vomit and forcing a smile on your face.

AI will always have a specific narrow focus and will never ever be creative; the best AI proponents can hope for is that the hallucinations will drop to a less noticeable level.