211 points | CrankyBear | 1 comment
giancarlostoro No.45106227
I'm not sure why they don't just cache the websites and avoid going back for at least 24 hours, at least for most sites. I swear it's like we're re-learning software engineering basics with LLMs / AI, and it kills me.
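(A minimal sketch of the 24-hour cache the comment describes — assuming Python with requests and a SQLite-backed store; the table name and helper are illustrative, not taken from any real crawler:)

    import sqlite3
    import time

    import requests

    TTL_SECONDS = 24 * 60 * 60  # re-fetch a URL at most once per day

    def fetch_cached(url: str, db: sqlite3.Connection) -> bytes:
        """Return the cached body for `url` if under 24h old; else fetch and store it."""
        row = db.execute(
            "SELECT body, fetched_at FROM pages WHERE url = ?", (url,)
        ).fetchone()
        if row and time.time() - row[1] < TTL_SECONDS:
            return row[0]  # cache hit: no network traffic at all
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        db.execute(
            "INSERT OR REPLACE INTO pages (url, body, fetched_at) VALUES (?, ?, ?)",
            (url, resp.content, time.time()),
        )
        db.commit()
        return resp.content

    # One-time setup:
    # db = sqlite3.connect("crawl_cache.db")
    # db.execute("CREATE TABLE IF NOT EXISTS pages "
    #            "(url TEXT PRIMARY KEY, body BLOB, fetched_at REAL)")

(In practice a polite crawler would also honor Cache-Control headers and send If-Modified-Since / ETag revalidation, so even the daily re-fetch is often a cheap 304 rather than a full download.)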
kpw94 No.45106404
Yeah, the landscape back when there were many more search engines must have been exactly the same...

I think the eng teams behind those were just more competent / more frugal with their processing.

And since there was no AWS equivalent, they had to be better citizens: crawled websites could trivially ban their well-known IP range.

acdha No.45107746
Bandwidth cost more then, so the early search engines had an incentive not to massively increase their own costs, if nothing else.