You already need point a) in place to comply with EU laws and directives (DSA, anti-terrorism [1]) anyway, and I think the UK has anti-terrorism laws with similar wording, as does the US with its CSAM laws.
Point b) is already required if you operate in Germany: there have been a number of court rulings holding that platforms must take down repeat uploads of banned content [2].
Point c) simply makes sense; it's time to crack down hard on "nudifiers" and similar apps.
Point d) is the one I have the most issues with, although that's nothing new either: unmasking users via barely fleshed-out subpoenas or dragnet orders has been a thing for many, many years now.
This targets gatekeepers, i.e. not your small mom-and-pop startup but billion-dollar companies. They can afford to hire proper moderation staff to handle such complaints; they just don't want to because it impacts their bottom line, at the cost of everyone affected by AI slop.
[1] https://eucrim.eu/news/rules-on-removing-terrorist-content-o...
[2] https://www.lto.de/recht/nachrichten/n/vizr6424-bgh-renate-k...