brinks56 wrote:Can someone answer a question for me though? At the end of the article they talk about bcrypt, PBKDF2, and another algorithm, and how the cluster would be able to perform far fewer guesses on them. Is this because of additional iterations performed (and other things done that I am not familiar with yet), or do they also increase the size of the hash? I tried to look around to find out, but I could not be sure.
bcrypt, PBKDF2, and SHA-512 are far more resource-intensive than, say, SHA-1 or MD5. They require more calculations, more cycles, and more memory, so of course they take longer to compute. As a result, you get fewer full completions in the same amount of time than with a lighter algorithm such as MD5.
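Here's a rough timing sketch using only the Python standard library that shows the gap. The iteration count of 100,000 is just an illustrative choice for the demo, not a recommendation, and exact numbers will vary by machine:

```python
import hashlib
import os
import time

password = b"correct horse battery staple"
salt = os.urandom(16)

def cost_per_guess(fn, n):
    """Average wall-clock time of n calls to fn, in seconds."""
    start = time.perf_counter()
    for _ in range(n):
        fn()
    return (time.perf_counter() - start) / n

# One raw MD5 hash vs. one PBKDF2 derivation at 100,000 rounds of HMAC-SHA256.
md5_cost = cost_per_guess(lambda: hashlib.md5(password).digest(), n=10_000)
pbkdf2_cost = cost_per_guess(
    lambda: hashlib.pbkdf2_hmac("sha256", password, salt, 100_000),
    n=5,
)

print(f"MD5:    ~{md5_cost * 1e6:.2f} microseconds per guess")
print(f"PBKDF2: ~{pbkdf2_cost * 1e3:.1f} milliseconds per guess")
print(f"PBKDF2 is roughly {pbkdf2_cost / md5_cost:,.0f}x slower per guess")
```

Whatever the absolute numbers on your box, the ratio is what matters: every candidate password a cracker tries has to pay that full PBKDF2 cost.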
There are additional factors to consider. bcrypt, for example, is not considered "GPU-friendly" and will actually run slower on GPUs than on CPUs. This is because, although GPUs are good at some things, they currently suffer in other areas, like 64-bit operations, data-dependent branching, and large memory operations.
Just as a note, simply increasing iterations will not always give you the results you expect. bcrypt (and to some degree PBKDF2) is considered an adaptive algorithm and can cycle through rounds (iterations) without adversely affecting the algorithm. MD5, on the other hand, is not an adaptive algorithm, and as you further iterate it you actually shrink the output space and increase the chance of collisions, weakening the hash a little more with each additional iteration.
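To make the "adaptive" part concrete, here's a small sketch of PBKDF2's cost knob. Note that it also answers the original question: raising the iteration count raises the work, but the digest stays the same size.

```python
import hashlib

password = b"hunter2"
salt = b"0123456789abcdef"  # fixed salt for the demo only; use os.urandom(16) in practice

# Same password, same salt, two very different iteration counts.
cheap = hashlib.pbkdf2_hmac("sha256", password, salt, 1_000)
costly = hashlib.pbkdf2_hmac("sha256", password, salt, 500_000)

# The digest length does not change with the iteration count (32 bytes for SHA-256):
print(len(cheap), len(costly))

# But the iteration count is part of the computation, so the digests differ --
# a verifier has to store the count alongside the hash and salt:
print(cheap == costly)
```

That's the adaptive property in a nutshell: when hardware gets faster, you bump the iteration count and re-hash on the user's next login, without touching the rest of your storage format.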