If infringement is happening, it arguably doesn't happen when an infringing work product is generated (or regurgitated, or whatever you want to call it), much less when the model is trained. It's when the output is used commercially -- by a human -- that liability should rightfully attach.
And it should attach to the human, not the tool.