No, this wasn't the case. While there were never comprehensive studies, various tech media purchased these cards to run testing and found that, scams aside, they all performed to expectation.
It can still take hours per drive though, which is why a lot of people skip it.
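To get a rough sense of why it takes hours: a full-surface read test is bounded by sequential throughput. A minimal back-of-the-envelope sketch (the capacities and speeds below are made-up round numbers, not measurements, and real drives slow down toward the inner tracks):

```python
def full_read_hours(capacity_tb: float, throughput_mb_s: float) -> float:
    """Hours to read every sector once at a constant sequential throughput."""
    total_mb = capacity_tb * 1_000_000  # TB -> MB (decimal, as drives are marketed)
    return total_mb / throughput_mb_s / 3600

# Hypothetical examples: a 4 TB drive at 180 MB/s, an 18 TB drive at 220 MB/s
for tb, mbps in [(4, 180), (18, 220)]:
    print(f"{tb} TB @ {mbps} MB/s: ~{full_read_hours(tb, mbps):.1f} h")
```

Even under these optimistic constant-speed assumptions, the big drive needs the better part of a day, so multiply that by a stack of used drives and it's easy to see why people skip the test.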
GPUs? No way. The datacenter cards don't even have video output ports, and I think the chips destined for AI / ML training also have everything video/render related removed from the silicon, which makes for better yield.
And the other way around, using (cheap) consumer GPUs in servers, is something I think at least NVDA tries to prevent with driver-based DRM, so there won't be any flooding coming from that direction either.
Even if it, say, halved the lifespan of the chips, that is still far longer than most people would ever use them for.