Is there an algorithm for calculating entropy in a passphrase?
This is an important question that unfortunately does not have an easy answer. If a passphrase is selected from a universe of N possibilities, where each possibility is equally likely to be chosen, the entropy is log2(N). The symbol “log2” stands for the base-two logarithm. Most calculators don’t have a button for base-2 logarithms, but you can easily compute one using the formula log2(N) = log(N)/log(2).

If the passphrase is made up of M symbols, each chosen at random from a universe of N possibilities, each equally likely, the entropy is M*log2(N). For example, if you make a passphrase by choosing 10 letters at random, the entropy is 10*log2(26) = 47.0 bits. (A short calculation sketch appears at the end of this answer.)

If the passphrase is a phrase in a natural language, the problem is much more difficult. There is a famous estimate, due to Shannon, that the average entropy of written English is about 1.3 bits per letter; see Schneier’s Applied Cryptography, 2nd Ed., p. 234. However, applying this estimate to a passphrase is questionable. People are
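For readers who want to work these numbers out directly, here is a minimal Python sketch of the formula above. It is only an illustration of the arithmetic; the function name entropy_bits is made up for this example and is not part of any standard tool.

    # Entropy of a passphrase built from M symbols, each drawn uniformly
    # at random from an alphabet of N equally likely possibilities.
    import math

    def entropy_bits(num_possibilities, num_symbols=1):
        # Entropy in bits = M * log2(N); note log2(N) = log(N) / log(2).
        return num_symbols * math.log2(num_possibilities)

    # Example from the text: 10 letters chosen at random from 26 letters.
    print(entropy_bits(26, 10))   # prints roughly 47.0 bits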