If you block those internal subdomains from search with robots.txt, does Google still whine?
replies(2):
Possible scenario:
- A self-hosted project has a demo instance with a default login page (demo.immich.app, demo.jellyfin.org, demo1.nextcloud.com) that Google's algorithms classify as the "primary" site
- Any self-hosted instance with the same login page (branding, title, logo, HTML meta tags) then becomes a candidate for a deceptive/phishing classification, and immich.cloud has a lot of preview environments falling into that category.
BUT in Immich's case, its _demo_ login page has its own big banner, so it already looks quite different from the others. Maybe there is no "original" at all: the algorithm/AI just got lost among thousands of identical-looking login pages and now flags every other instance as deceptive...
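To answer the robots.txt question above: blocking the preview subdomains from crawling would look roughly like this (hostname is hypothetical). Worth noting, though, that robots.txt only governs search crawling and indexing; Safe Browsing's deceptive-site classifier is a separate system and isn't guaranteed to honor it:

```
# robots.txt served on each preview environment
# (e.g. pr-1234.preview.immich.cloud -- hypothetical hostname)
User-agent: *
Disallow: /
```

A stronger "don't index this" signal is the `X-Robots-Tag: noindex` response header, but neither mechanism clears an existing deceptive-site flag; that typically requires requesting a review through Search Console's Security Issues report.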