The best way to mitigate the load from diffuse, unidentifiable, grey-area participants is to have a fast and well-engineered web product. This is good news, because your actual human customers would really enjoy this too.
In that regard, reading my logs has sometimes led me to interesting articles about cyber security. Also, log flooding may cause your journaling service to truncate the log, so you miss something important.
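(For anyone wondering how that truncation happens: a minimal sketch assuming systemd's journald, whose built-in rate limiting silently drops messages from a service that floods the journal. The values below are illustrative, not recommendations.)

```ini
# /etc/systemd/journald.conf
# If one service emits more than RateLimitBurst messages within
# RateLimitIntervalSec, journald drops the excess and only logs a
# "Suppressed N messages" notice -- which is how flooding loses you data.
[Journal]
RateLimitIntervalSec=30s
RateLimitBurst=10000
# Cap total disk use; rotation past this quietly discards older entries.
SystemMaxUse=1G
```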
Yeah, this is beyond irresponsible. You know the moment you're pwned, __you__ become the new interesting story?
For everyone else, use a password manager to pick a random password for everything.
So unless you're not logging your request path/query string, you're doing something very, very wrong by your own logic :). I can't imagine diagnosing issues with web requests without being given the path + query string. You can diagnose without it, but you're sure not making things easier.
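(One way to square that circle — keep the path and query string in your logs, but redact parameter names that plausibly carry secrets. A minimal sketch; the function name and the denylist here are hypothetical, extend them for your own app.)

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical denylist of query parameter names that may hold secrets.
SENSITIVE_PARAMS = {"password", "passwd", "pwd", "token", "api_key", "secret"}

def redact_url_for_logging(url: str) -> str:
    """Return the URL with the values of sensitive query params masked."""
    parts = urlsplit(url)
    query = [
        (k, "REDACTED" if k.lower() in SENSITIVE_PARAMS else v)
        for k, v in parse_qsl(parts.query, keep_blank_values=True)
    ]
    return urlunsplit(parts._replace(query=urlencode(query)))

# redact_url_for_logging("/login?user=bob&password=hunter2")
#   -> "/login?user=bob&password=REDACTED"
```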
Until AI crawlers chased me off of the web, I ran a couple of fairly popular websites. I so rarely see anybody include passwords in URLs anymore that I didn't really consider that to be what the commenter was talking about.
You're absolutely right. That's my mistake — you are requesting a specific version of WordPress, but I had written a Rails app. I've rewritten the app as a WordPress plugin and deployed it. Let me know if there's anything else I can do for you.