If you don't do this, the third-party cookie blocking that strict Enhanced Tracking Protection enables will completely destroy your ability to access websites hosted behind CloudFlare, because it is impossible for CloudFlare to know that you have solved the CAPTCHA.
This is what causes the infinite CAPTCHA loops. It doesn't matter how many of them you solve: Firefox won't let CloudFlare make a note that you solved it, so when the page reloads, as far as CloudFlare can tell you've simply tried to load the page again without solving anything.
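Roughly the shape of what's going on, sketched as a cookie-gated check. This is a minimal illustration, not CloudFlare's actual implementation: the cookie name, return values, and function names are made up, but the general pattern of "solved challenge is remembered only via a cookie the browser must store" is what strict tracking protection breaks.

```python
# Illustrative sketch only; cookie name and handlers are hypothetical.

def handle_request(cookies: dict) -> str:
    """Edge-side check: only pass traffic that carries a clearance cookie."""
    if cookies.get("clearance") == "ok":
        return "origin content"
    return "captcha page"  # no proof of a solved challenge -> challenge again

def solve_captcha() -> dict:
    """Solving the challenge is acknowledged via a Set-Cookie response header."""
    return {"Set-Cookie": "clearance=ok"}

# Normal browser: the cookie is stored, so the next request gets through.
cookies = {}
assert handle_request(cookies) == "captcha page"
response = solve_captcha()
cookies["clearance"] = response["Set-Cookie"].split("=", 1)[1]  # browser keeps the cookie
assert handle_request(cookies) == "origin content"

# Strict ETP treating the cookie as third-party: the Set-Cookie is discarded,
# so every later request still carries no clearance and the challenge repeats.
cookies = {}
for _ in range(3):
    assert handle_request(cookies) == "captcha page"  # the infinite CAPTCHA loop
    solve_captcha()  # response cookie is never stored
```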
What is not within your rights is to require the site owner to build their own solution to your specs to solve those problems, or to require them to simply live with those problems, because you want to view the content.
It usually becomes reasonable to object to the status quo long before the legislature is compelled to move to fix things.
I do think it's reasonable for the service to provide alternative methods of interacting with it when possible. Phone lines, mail, and email could all be potential escape hatches. But if a site is on the internet, it is going to need protecting eventually.
I don't know that "3rd party session cookies" or "JS" are reasonable objections, but I definitely have privacy concerns. And I have encountered situations where I wasn't presented with a CAPTCHA but was instead unconditionally blocked. That's frustrating, but legally acceptable if it's a small-time operator. When it's a contracted tech giant, though, I think it deserves scrutiny: their practices have an outsized footprint.
> service to provide alternative methods of interacting with it when possible
One of the most obvious alternative methods is logging in with an existing account, but on many websites I've found the login portal itself barricaded behind a screening measure, which defeats that option entirely.
> if a site is on the internet it is going to need protecting eventually
Ah yes, it needs "protection" from "bots" to ensure that your page visit is "secure". Preventing DoS is understandable, but many operators simply don't want their content scraped, for reasons entirely unrelated to service uptime. Yet they mislead the visitor about the real reason for the inconvenience.
Or worse, the government operations that don't care but are blindly implementing a compliance checklist. They sometimes stick captchas in the most nonsensical places.