Malicious bot writers have exactly zero concern for robots.txt, and most bots are malicious. Most bots don't bother setting most of the usual TCP/IP options; their only concern is speed. I block about 99% of port-scanning bots by simply dropping any TCP SYN packet that is missing the MSS option or uses a strange value. The most popular port-scanning tool is masscan, which does not set MSS at all, and some of the malicious user-agents also set odd MSS values, if they set one at all.
    -A PREROUTING -i eth0 -p tcp -m tcp -d $INTERNET_IP --syn -m tcpmss ! --mss 1280:1460 -j DROP
Example rule from the netfilter raw table (it runs before conntrack, so dropping there is cheap). This will not help against headless Chrome, which rides on the normal kernel TCP stack and therefore sends a perfectly ordinary MSS.
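Before turning a rule like that on, it is worth watching what MSS values legitimate SYNs on the link actually carry so the range fits; a quick way to eyeball it (eth0 is just a placeholder interface):

    tcpdump -nn -i eth0 'tcp[tcpflags] & (tcp-syn|tcp-ack) == tcp-syn'

Each SYN prints its options, e.g. "options [mss 1460,sackOK,...]"; SYNs with no mss entry at all are the ones the rule above drops.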
The reason this is useful is that many bots first scan for port 443 and then try to enumerate whatever answers. The bots that look up domain names to scan will still try, and many of those get their targets from new certs being created in Let's Encrypt, which show up in the public Certificate Transparency logs. That is one of the reasons I use the DNS method (the DNS-01 challenge): get a wildcard and sit on it for a while.
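In case it helps, issuing the wildcard via the DNS-01 challenge looks roughly like this with certbot (certbot itself and example.com are assumptions, I'm not saying which ACME client is in use above):

    certbot certonly --manual --preferred-challenges dns -d 'example.com' -d '*.example.com'

Only *.example.com ends up in the Certificate Transparency logs, so the individual hostnames behind it never show up in the feed the scanners watch.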
Another thing that helps is setting a default host in one's load balancer or web server: a simple static page, served from a ram disk, that says something like "It Worked!", with logging disabled for that default site. In HAProxy, look up the bind option "strict-sni". Very old API clients can get blocked if they do not support SNI, but along that line, most bots are really old unsupported code that the botter could not update if their life depended on it.
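A minimal HAProxy sketch of both ideas; the cert path, hostname, and backend name are placeholders, and http-request return needs HAProxy 2.2 or newer:

    frontend https-in
        bind :443 ssl crt /etc/haproxy/certs/
        # stricter alternative: refuse the TLS handshake outright when the SNI
        # matches no loaded certificate (this is what breaks non-SNI clients):
        # bind :443 ssl crt /etc/haproxy/certs/ strict-sni
        acl known_host hdr(host) -i www.example.com
        # unknown hosts get a cheap canned answer and are never logged
        http-request set-log-level silent if !known_host
        http-request return status 200 content-type text/plain string "It Worked!" if !known_host
        default_backend app

http-request return answers from HAProxy itself rather than from a ram-disk file, which is the same idea taken one step further: the request never touches a backend at all.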