security - password hashing - php random salt
Merely chaining the output of a hash function back into its input is not sufficient for security. Iteration should take place in the context of an algorithm that preserves the entropy of the password. Fortunately, there are several published algorithms that have had enough scrutiny to give confidence in their design.
A good key-derivation algorithm like PBKDF2 feeds the password into each round of hashing, mitigating concerns about collisions in the hash output. PBKDF2 can be used for password verification as-is. Bcrypt follows the key derivation with an encryption step; that way, if a fast way to reverse the key derivation is discovered, an attacker still has to complete a known-plaintext attack.
Stored passwords need to be protected against offline attacks. If passwords are not salted, they can be broken with a pre-computed dictionary attack (for example, using a rainbow table). Otherwise, the attacker must spend time computing a hash for each password and checking whether it matches the stored hash.
Not all passwords are equally likely. Attackers might exhaustively search all short passwords, but they know that their chances of brute-force success drop sharply with each additional character. Instead, they use an ordered list of the most likely passwords. They start with "password123" and work down to less frequently used passwords.
Suppose an attacker's list is long, with 10 billion candidates; suppose also that a desktop system can compute 1 million hashes per second. With only one iteration, the attacker can test the entire list in less than three hours. But with just 2000 iterations, that time stretches to nearly eight months. To defeat a more sophisticated attacker, one able to download a program that can tap GPU power for example, you need more iterations.
The number of iterations to use is a compromise between security and user experience. Specialized hardware available to attackers is cheap, yet it can still perform hundreds of millions of iterations per second. The performance of the attacker's system determines how long it takes to crack a password given a number of iterations. Your application, however, is unlikely to use that specialized hardware. How many iterations you can perform without aggravating your users depends on your system.
Read PKCS #5 for authoritative information on the role of salt and iterations in hashing. Even though PBKDF2 was meant for generating encryption keys from passwords, it works well as a one-way hash for password verification. Each iteration of bcrypt is more expensive than a SHA-2 hash, so you can use fewer iterations, but the idea is the same. Bcrypt also goes a step beyond most PBKDF2-based solutions by using the derived key to encrypt a well-known plaintext. The resulting ciphertext is stored as the "hash", along with some metadata. However, nothing stops you from doing the same thing with PBKDF2.
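The arithmetic behind those figures can be checked directly (a sketch in Python, using the attacker's list size and hash rate assumed above):

```python
# Attacker model assumed in the text above.
candidates = 10_000_000_000      # 10 billion candidate passwords
hashes_per_second = 1_000_000    # desktop computing 1 million hashes/second

# One iteration per candidate: the whole list falls in under three hours.
one_iter_hours = candidates / hashes_per_second / 3600
print(f"{one_iter_hours:.1f} hours")     # ≈ 2.8 hours

# 2000 iterations per candidate: nearly eight months.
many_iter_months = candidates * 2000 / hashes_per_second / (3600 * 24 * 30)
print(f"{many_iter_months:.1f} months")  # ≈ 7.7 months
```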
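A minimal sketch of salted, iterated PBKDF2 password verification, here in Python's `hashlib` rather than PHP (the iteration count and salt length are illustrative choices, not recommendations):

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 200_000):
    """Derive a salted PBKDF2 digest; store salt, digest and iteration count."""
    salt = os.urandom(16)  # unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest, iterations

def verify_password(password: str, salt: bytes, digest: bytes, iterations: int) -> bool:
    """Re-derive with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)

salt, digest, rounds = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest, rounds)
assert not verify_password("password123", salt, digest, rounds)
```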
$hashed_password = hash(hash($plaintext_password)); // double hashing
$hashed_password = hash($plaintext_password);       // single hashing
Absolutely do not use multiple iterations of a conventional hash function, like md5(md5(md5(password))). At best you will get a marginal increase in security (a scheme like this offers hardly any protection against a GPU attack; just pipeline it). At worst, you're reducing your hash space (and thus security) with every iteration you add. In security, it's wise to assume the worst.
Do use a password hash that's been designed by a competent cryptographer to be an effective password hash, resistant to both brute-force and time-space attacks. These include bcrypt, scrypt, and in some situations PBKDF2. The glibc SHA-256-based hash is also acceptable.
Double hashing is ugly because it's more than likely an attacker has built a table to come up with most hashes. Better is to salt your hashes and mix hashes together. There are also new schemes to "sign" hashes (basically salting), but in a more secure manner.
Double hashing makes sense to me only if I hash the password on the client, and then save the hash (with a different salt) of that hash on the server.
That way even if someone hacks their way into the server (thereby bypassing the safety SSL provides), they still can't get to the clear passwords.
Yes, they will have the data required to breach the system, but they wouldn't be able to use that data to compromise outside accounts the user has. And people are known to use the same password for virtually anything.
The only way they could get to the clear passwords is by installing a keylogger on the client, and that's not your problem anymore.
- The first hashing on the client protects your users in a 'server breach' scenario.
- The second hashing on the server serves to protect your system if someone got a hold of your database backup, so he can't use those passwords to connect to your services.
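The two-layer scheme sketched above could look like this (a Python illustration; the site-wide salt value, the specific algorithms, and the iteration count are all hypothetical choices):

```python
import hashlib
import hmac
import os

SITE_SALT = b"example.com-v1"  # fixed, site-specific salt (hypothetical value)

def client_side_hash(password: str) -> bytes:
    # Runs on the client, so the clear password never leaves the machine.
    return hashlib.sha256(SITE_SALT + password.encode()).digest()

def server_side_store(client_hash: bytes):
    # The server salts and hashes again before storing, so a stolen
    # database backup reveals neither the password nor the login token.
    user_salt = os.urandom(16)
    stored = hashlib.pbkdf2_hmac("sha256", client_hash, user_salt, 100_000)
    return user_salt, stored

def server_side_verify(client_hash: bytes, user_salt: bytes, stored: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", client_hash, user_salt, 100_000)
    return hmac.compare_digest(candidate, stored)

user_salt, stored = server_side_store(client_side_hash("hunter2"))
assert server_side_verify(client_side_hash("hunter2"), user_salt, stored)
```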
I just look at this from a practical standpoint. What is the hacker after? Why, the combination of characters that, when put through the hash function, generates the desired hash.
You are only saving the last hash, therefore the hacker only has to brute-force one hash. Assuming you have roughly the same odds of stumbling across the desired hash with each brute-force step, the number of hashes is irrelevant. You could do a million hash iterations, and it would not increase or reduce security one bit, since at the end of the line there's still only one hash to break, and the odds of breaking it are the same as for any hash.
Maybe the previous posters think that the input is relevant; it's not. As long as whatever you put into the hash function generates the desired hash, it will get you through, correct input or incorrect input.
Now, rainbow tables are another story. Since a rainbow table only carries raw passwords, hashing twice may be a good security measure, since a rainbow table that contains every hash of every hash would be too large.
Of course, I'm only considering the example the OP gave, where it's just a plain-text password being hashed. If you include the username or a salt in the hash, it's a different story; hashing twice is entirely unnecessary, since the rainbow table would already be too large to be practical and contain the right hash.
Anyway, not a security expert here, but that's just what I've figured from my experience.
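The point about salts defeating precomputed tables can be seen in a few lines (a Python sketch; the password and salt sizes are just examples): the same password hashes to unrelated digests under different salts, so a single precomputed table cannot cover them all.

```python
import hashlib
import os

password = b"password123"
salt_a, salt_b = os.urandom(16), os.urandom(16)

# Same password, different salts: unrelated digests. A rainbow table
# would have to be rebuilt per salt value to be of any use.
digest_a = hashlib.sha256(salt_a + password).hexdigest()
digest_b = hashlib.sha256(salt_b + password).hexdigest()
assert digest_a != digest_b
```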
I'm going to go out on a limb and say it's more secure in certain circumstances... don't downvote me yet though!
From a mathematical / cryptographical point of view, it's less secure, for reasons that I'm sure someone else will give you a clearer explanation of than I could.
However, there exist large databases of MD5 hashes, which are more likely to contain the "password" text than the MD5 of it. So by double-hashing you're reducing the effectiveness of those databases.
Of course, if you use a salt then this advantage (disadvantage?) goes away.
Let us assume you use the hashing algorithm: compute rot13, take the first 10 characters. If you do that twice (or even 2000 times) it is possible to make a function that is faster, but which gives the same result (namely just take the first 10 chars).
Likewise it may be possible to make a faster function that gives the same output as a repeated hashing function. So your choice of hashing function is very important: as with the rot13 example it is not given that repeated hashing will improve security. If there is no research saying that the algorithm is designed for recursive use, then it is safer to assume that it will not give you added protection.
That said: For all but the simplest hashing functions it will most likely take cryptography experts to compute the faster functions, so if you are guarding against attackers that do not have access to cryptography experts it is probably safer in practice to use a repeated hashing function.
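The rot13 example above can be made concrete (a Python sketch; the "hash" is of course deliberately terrible): applying it twice collapses into a faster equivalent function, exactly as claimed.

```python
import codecs

def weak_hash(s: str) -> str:
    # Toy "hash" from the answer above: rot13, then keep the first 10 chars.
    return codecs.encode(s, "rot13")[:10]

def double_weak_hash(s: str) -> str:
    return weak_hash(weak_hash(s))

def shortcut(s: str) -> str:
    # Faster equivalent: rot13 applied twice cancels out,
    # so only the truncation remains.
    return s[:10]

assert double_weak_hash("correct horse battery staple") == shortcut("correct horse battery staple")
```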
Most answers are by people without a background in cryptography or security. And they are wrong. Use a salt, if possible unique per record. MD5/SHA/etc are too fast, the opposite of what you want. PBKDF2 and bcrypt are slower (which is good) but can be defeated with ASICs/FPGAs/GPUs (very affordable nowadays). So a memory-hard algorithm is needed: enter scrypt.
Here's a layman's explanation of salts and speed (but not of memory-hard algorithms).
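Memory-hard hashing is available in Python's `hashlib` via OpenSSL's scrypt (a sketch; the cost parameters n/r/p below are illustrative and should be tuned for your hardware, not taken as a recommendation):

```python
import hashlib
import hmac
import os

def scrypt_hash(password: str, salt: bytes) -> bytes:
    # n: CPU/memory cost, r: block size, p: parallelism. These example
    # values cost roughly 16 MiB of memory per hash.
    return hashlib.scrypt(password.encode(), salt=salt,
                          n=2**14, r=8, p=1, dklen=32)

salt = os.urandom(16)
stored = scrypt_hash("correct horse battery staple", salt)

# Verification: re-derive with the stored salt, compare in constant time.
assert hmac.compare_digest(scrypt_hash("correct horse battery staple", salt), stored)
assert scrypt_hash("password123", salt) != stored
```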
The concern about reducing the search space is mathematically correct, although the search space remains large enough for all practical purposes (assuming you use salts), at 2^128. However, since we are talking about passwords, the number of possible 16-character strings (alphanumeric, caps matter, a few symbols thrown in) is roughly 2^98, according to my back-of-the-envelope calculations. So the perceived decrease in the search space is not really relevant.
Aside from that, there really is no difference, cryptographically speaking.
There is, though, a crypto primitive called a "hash chain": a technique that allows you to do some cool tricks, like disclosing a signature key after it's been used, without sacrificing the integrity of the system. Given minimal time synchronization, it allows you to cleanly sidestep the problem of initial key distribution. Basically, you precompute a large set of hashes of hashes, h(h(h(h....(h(k))...))), and use the nth value to sign; after a set interval, you send out the key and sign it using key (n-1). The recipients can now verify that you sent all the previous messages, and no one can fake your signature since the time period for which it was valid has passed.
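A hash chain in miniature (a Python sketch; the seed value and chain length are arbitrary): publish the last link as an anchor, then reveal earlier links in reverse order, and anyone can verify each revealed value by hashing it forward to the anchor.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

secret = b"initial key k"  # hypothetical seed, kept private
chain = [secret]
for _ in range(1000):
    chain.append(h(chain[-1]))
anchor = chain[-1]  # published up front as the trust anchor

# Later, reveal the previous link: one hash brings it back to the anchor.
revealed = chain[-2]
assert h(revealed) == anchor

# The link before that needs two hashes, and so on back down the chain.
assert h(h(chain[-3])) == anchor
```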
Re-hashing hundreds of thousands of times like Bill suggests is just a waste of your CPU. Use a longer key if you are concerned about people breaking 128 bits.
Yes - it reduces the number of possible strings that match the hash.
As you have already mentioned, salted hashes are much better.
An article here: http://websecurity.ro/blog/2007/11/02/md5md5-vs-md5/ , attempts a proof of why it is equivalent, but I'm not sure about the logic. Partly they assume that there isn't software available to analyse md5(md5(text)), but obviously it's fairly trivial to produce the rainbow tables.
I'm still sticking with my answer that there is a smaller number of md5(md5(text))-type hashes than md5(text) hashes, increasing the chance of collision (even if still at an unlikely probability) and reducing the search space.
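One way to see why the inner hash restricts the outer one (a Python sketch): after the first MD5, the input to the second MD5 is always a 32-character lowercase hex string, so the second hash only ever sees 16^32 = 2^128 possible inputs rather than the full space of all strings.

```python
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

# The inner digest is always 32 lowercase hex characters, regardless of input.
inner = md5_hex("password")
assert len(inner) == 32
assert set(inner) <= set("0123456789abcdef")

# The outer hash therefore only ever operates on that restricted alphabet.
double_hashed = md5_hex(inner)  # md5(md5(text))
assert len(double_hashed) == 32
```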