Direct quote from the article: "The companies are facing several challenges. It’s become increasingly difficult to find new, untapped sources of high-quality, human-made training data that can be used to build more advanced AI systems."
The irony here is astounding.
Indeed, if you're thinking about AI polluting the data and replacing humans. But it also seems likely that, because of this, training will go to the source: increasingly, humans will directly train AIs, as robotics and self-driving car systems already do, instead of AIs training on the indirect data people create (watching someone paint rather than scanning paintings). So in essence we'll be training our replacements to take over our tasks and jobs. Small tasks at first, but increasing in complexity over time.

Someday no one may know how to drive a car anymore (or be allowed to, for safety). Later, no one may know how to write computer code (or be allowed to, for security reasons). Human learning in each area mastered by AI will stop and never progress further, unless AI can become truly creative. Or perhaps (fewer and fewer) people will work only on new problems that require creativity.

There are long-term risks to humanity's adaptability in this scenario. People would probably take those risks for the short-term gains.