
Is there an algorithm for calculating entropy in a passphrase?

This is an important question that unfortunately does not have an easy answer.

If a passphrase is selected from a universe of N possibilities, each equally likely to be chosen, the entropy is log2(N). The symbol "log2" stands for the base-two logarithm. Most calculators don't have a button for base-2 logarithms, but you can easily compute one with the formula log2(N) = log(N)/log(2).

If the passphrase is made of M symbols, each chosen at random from a universe of N equally likely possibilities, the entropy is M*log2(N). For example, if you make a passphrase by choosing 10 letters at random, the entropy is 10*log2(26) ≈ 47.0 bits.

If the passphrase is a phrase in a natural language, the problem is much more difficult. There is a famous estimate due to Shannon that the average entropy of written English is about 1.3 bits per letter; see Schneier's Applied Cryptography, 2nd Ed., p. 234. However, applying this estimate to a passphrase is questionable. People are
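The M*log2(N) formula above is easy to check numerically. Here is a small sketch in Python (the function name and the Diceware example are illustrative, not from the original answer):

```python
import math

def passphrase_entropy_bits(num_symbols: int, universe_size: int) -> float:
    """Entropy in bits of a passphrase built from num_symbols symbols,
    each drawn uniformly and independently from universe_size possibilities.
    This is M * log2(N) from the explanation above."""
    return num_symbols * math.log2(universe_size)

# 10 letters chosen at random from a 26-letter alphabet:
print(passphrase_entropy_bits(10, 26))   # about 47.0 bits

# Hypothetical extra example: a 6-word passphrase drawn from a
# 7776-word Diceware-style list, where the "symbols" are whole words.
print(passphrase_entropy_bits(6, 7776))  # about 77.5 bits
```

Note that the formula only applies when every symbol is chosen uniformly at random; it overestimates the entropy of passphrases people invent themselves, which is exactly why the natural-language case discussed next is so much harder.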


Experts123