
412 points xfeeefeee | 2 comments
godelski ◴[] No.43748662[source]
This seems like quite a lot of work to hide the code. What would the legitimate reasons for this be? It looks like it would make the program less optimized, and the added complexity just leads to more errors.

I understand the desire to make it harder for bots, but 1) it doesn't seem to be effective, since bots appear to be taking a very different route, and 2) there have got to be better ways that are more effective. It's not like you're going to stop clones this way, because clones can replicate the behavior just by observing how things work and reverse engineering it black-box style.

replies(8): >>43748681 #>>43748712 #>>43748741 #>>43748839 #>>43749167 #>>43749282 #>>43750130 #>>43752385 #
1. davidsojevic ◴[] No.43748681[source]
Making it harder for bots usually drives up the cost for the bots to operate; if they need to run a headless browser to get around the anti-bot measures, a request might take, for example, 1.5 seconds instead of the 0.1 seconds it would take without them in place.

On top of that 1.5 seconds, there is a much larger CPU and memory cost from having to run that browser, compared to a simple direct HTTP request, which is nearly negligible.

So while you'll never truly defeat a sufficiently motivated actor, you may be able to drive their costs high enough that it becomes difficult to enter the space, or to turn a profit if they're so inclined.
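To make the economics concrete, here is a back-of-envelope sketch of how per-request overhead scales a bot's compute bill. The 0.1 s and 1.5 s latencies come from the comment above; the CPU-hour price is an illustrative assumption, not a measured figure, and the model deliberately ignores memory and proxy costs.

```python
def cost_per_million_requests(seconds_per_request: float,
                              usd_per_cpu_hour: float = 0.05) -> float:
    """Rough compute cost (USD) to issue 1M requests, assuming the
    worker is busy for the full duration of each request."""
    total_cpu_seconds = 1_000_000 * seconds_per_request
    return total_cpu_seconds / 3600 * usd_per_cpu_hour

plain_http = cost_per_million_requests(0.1)   # direct HTTP request
headless   = cost_per_million_requests(1.5)   # full browser per request

print(f"direct HTTP: ${plain_http:.2f} per 1M requests")
print(f"headless:    ${headless:.2f} per 1M requests")
print(f"multiplier:  {headless / plain_http:.0f}x")
```

Even under these toy assumptions the headless path is 15x more expensive per request before counting the extra memory a browser needs, which is the whole point of the speed bump.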

replies(1): >>43756558 #
2. godelski ◴[] No.43756558[source]
I understand the argument. You can't have a perfect defense, and speed bumps are quite effective. I'm not trying to disagree with that.

But it does not seem like this solution is effective at mitigating bots. Presumably bots are going a different route, considering how prolific they are, which warrants another solution. If they are going through this route, then it clearly isn't effective either, and that also warrants another solution.

It seems like this obfuscation requires a fair amount of work, especially since you need to frequently update the code to rescramble it. The added complexity also increases the risk of bugs and vulnerabilities, which ultimately undermines the whole endeavor.
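For a sense of what "rescrambling" means mechanically, here is a toy sketch of seed-based identifier renaming: each build derives fresh names from a build seed, so scrapers keyed to specific identifiers break on every release. The identifier list and the regex-based renaming are hypothetical simplifications; a real obfuscator works on a parsed AST, not on raw source text.

```python
import hashlib
import re

# Hypothetical identifiers a site might want to hide; a real tool
# would discover these from the AST rather than a hardcoded list.
SCRAMBLE_NAMES = ["fetchData", "renderList", "apiToken"]

def scramble_identifiers(src: str, build_seed: str) -> str:
    """Rename each listed identifier to a seed-derived name, so that
    every build ships with different names for the same functions."""
    out = src
    for name in SCRAMBLE_NAMES:
        digest = hashlib.sha256((build_seed + name).encode()).hexdigest()[:8]
        out = re.sub(rf"\b{re.escape(name)}\b", f"_{digest}", out)
    return out

code = "fetchData().then(renderList);"
print(scramble_identifiers(code, build_seed="release-2024-01"))
print(scramble_identifiers(code, build_seed="release-2024-02"))
```

The same seed always yields the same output, so a build is reproducible, but changing the seed per release forces any bot that pattern-matches on names to re-reverse-engineer the code, which is exactly the recurring maintenance cost being questioned here.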

I'm trying to understand why this level of effort is worth the cost. (Other than nefarious reasons; those are rather obvious.)