127 points | Brajeshwar | 1 comment
underseacables ◴[] No.42479808[source]
I suppose it comes down to what the purpose of such archiving is.

I think it's the preservation of information, though I also believe 90% of it is absolutely pointless. Still, data storage is so cheap that it makes more sense to just save everything than to spend effort sorting out the valuable 10%.

replies(3): >>42479956 #>>42479985 #>>42480107 #
sigio ◴[] No.42479956[source]
Well... storage is cheap, but not cheap enough to save everything; Usenet alone is in the 400 TB/day range these days. Sure, it's cheap enough to save every webpage you visit during your life, but probably not cheap enough to save every video you click on YouTube or watch on a streaming service, plus all the music you listen to all day.

Though just the music, compressed in Opus at 128 kbit/s, might work OK: 60 years of 24/7 audio at 128 kbit/s is about 30 TB, which currently fits on one large HDD.
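The back-of-envelope figure above checks out; a quick sketch of the arithmetic:

```python
# Verify the "60 years of 128 kbit/s audio fits on one HDD" estimate.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
bitrate_bps = 128_000            # 128 kbit/s Opus stream
years = 60

total_bits = bitrate_bps * SECONDS_PER_YEAR * years
total_tb = total_bits / 8 / 1e12  # bits -> bytes -> terabytes

print(f"{total_tb:.1f} TB")       # ~30 TB
```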

replies(2): >>42480571 #>>42481375 #
add-sub-mul-div ◴[] No.42481375[source]
If that much data comes across Usenet daily, how do services afford the storage to offer years of retention?

You can't dedupe the large binary files, because they're encoded and split into small parts, likely differently every time they're posted.
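A toy illustration of why part-level dedupe finds nothing to merge: the same payload split at different part boundaries yields entirely different chunk hashes. The part sizes below are arbitrary, and the random payload stands in for an encoded binary post.

```python
import hashlib
import os

payload = os.urandom(2_000_000)  # stand-in for a large binary file

def part_hashes(data: bytes, part_size: int) -> set:
    """Hash each fixed-size part, as a dedupe index over posted parts would."""
    return {hashlib.sha256(data[i:i + part_size]).hexdigest()
            for i in range(0, len(data), part_size)}

# The same file posted twice with different part sizes shares no part hashes,
# so a hash-based dedupe index sees two unrelated sets of articles.
a = part_hashes(payload, 500_000)
b = part_hashes(payload, 640_000)
print(len(a & b))  # 0
```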