
646 points | blendergeek | 1 comment
1. RamblingCTO No.42734829
Why wouldn't a max depth (which I always implement when I write crawlers) prevent the issues you'd run into here? Am I overlooking something? Or does this rely on the assumption that the crawlers being targeted are so greedy that they don't enforce a max depth or a maximum number of pages per domain?
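
For context, a minimal sketch of the two guards the comment describes: a per-link depth cap and a per-domain page cap in a simple breadth-first crawler. This is not from the thread or any particular crawler; the function names, limits, and seed URL are illustrative assumptions.

    # Minimal BFS crawler sketch with a depth cap and a per-domain page cap.
    # Limits and the seed URL are placeholders; error handling is kept small.
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen


    class LinkExtractor(HTMLParser):
        """Collects href values from <a> tags."""

        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)


    def crawl(seed, max_depth=5, max_pages_per_domain=1000):
        seen = set()
        pages_per_domain = {}
        queue = deque([(seed, 0)])  # (url, depth)

        while queue:
            url, depth = queue.popleft()
            if url in seen or depth > max_depth:
                continue  # depth cap: skip links nested too deep

            domain = urlparse(url).netloc
            if pages_per_domain.get(domain, 0) >= max_pages_per_domain:
                continue  # domain cap: stop fetching from an endless site

            seen.add(url)
            pages_per_domain[domain] = pages_per_domain.get(domain, 0) + 1

            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except Exception:
                continue

            parser = LinkExtractor()
            parser.feed(html)
            for href in parser.links:
                queue.append((urljoin(url, href), depth + 1))

        return seen


    if __name__ == "__main__":
        # Hypothetical seed URL purely for illustration.
        print(len(crawl("https://example.com/", max_depth=3, max_pages_per_domain=50)))

With both caps in place, a site that generates endless pages and links can only consume a bounded number of requests from this crawler, which is the behavior the question assumes well-behaved crawlers already have.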