
430 points | tambourine_man | 1 comment | source
chrisshroba No.41879813:
71 bits of entropy feels rather low...

It seems like many recommendations are to use at least 75-100, or even 128. Being fairly conservative, if you had 10k hosts hashing 1B passwords a second, it would take 7.5 years worst case to crack [1]. If a particular site neglects to use a slow hash function and salting, it's easy to imagine bad actors precomputing rainbow tables that would make attacks relatively easy.

You could counter that that's still a huge amount of computation, but since the work is reusable across targets, I find it easy to believe it's already being done. For comparison, if the passwords have 100 bits of entropy, those same 10k servers would need over 4 billion years to crack one.

[1]: 2^71 / 1e9 / 10000 / (60*60*24*365) ≈ 7.5 years
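The worst-case estimate in [1] can be sketched in a few lines of Python. The attacker parameters (10,000 hosts, 1B guesses per second each) are the comment's assumptions, not measured figures:

```python
def crack_time_years(entropy_bits: int,
                     hosts: int = 10_000,
                     guesses_per_host_per_sec: float = 1e9) -> float:
    """Worst-case brute-force time: try every one of 2^n keys."""
    total_guesses = 2 ** entropy_bits
    seconds = total_guesses / (hosts * guesses_per_host_per_sec)
    return seconds / (60 * 60 * 24 * 365)

print(round(crack_time_years(71), 1))    # ~7.5 years
print(f"{crack_time_years(100):.2e}")    # ~4e9 years
```

The average case is half the worst case, which doesn't change the conclusion at these magnitudes.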

lordofmoria No.41879897:
I think the assumption is that this is going into a reasonably modern password-hashing algorithm like Argon2, bcrypt (created in 1999, a quarter-century ago), or scrypt, with a per-password salt. Under those assumptions the attacker's work isn't reusable across sites, and nowhere near 1B passwords/second is achievable.
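A minimal sketch of that defense, using Python's standard-library `hashlib.scrypt` (the cost parameters here are illustrative choices, not a recommendation): the random per-password salt defeats precomputed rainbow tables, and scrypt's memory-hard key derivation slows each individual guess.

```python
import hashlib
import hmac
import os

# Illustrative scrypt cost parameters (CPU/memory cost, block size, parallelism).
N, R, P = 2**14, 8, 1

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # unique per password: rainbow tables become useless
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=N, r=R, p=P, dklen=32)
    return salt, digest

def verify(password: str, salt: bytes, expected: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt,
                               n=N, r=R, p=P, dklen=32)
    return hmac.compare_digest(candidate, expected)  # constant-time comparison
```

Because the salt differs per user, an attacker must rerun the expensive derivation for every guess against every account, which is exactly what makes the precomputation argument above moot.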

If that's not true and the password is being stored using MD5 (something that's been NIST-banned at this point for over a decade), then honestly all bets are off, and even 128 bits of entropy might not be enough.