
# Basics of Entropy for Password Security

This video will explain entropy, and how this concept relates to password security. Watch Zanidd unpack entropy as a measure of information.
11.4
Hello world. I’m Zanidd, and welcome back to the Hands-On Password Cracking and Security course. Today we’re going to take a look at the basics of entropy, hashing, and cryptography for password security. We’ll look at entropy: what is entropy, and how does it relate to passwords? We’ll look at hashing: what one-way hash functions are and what they aren’t. And we’ll look at cryptography: what are the goals of cryptography, and where do passwords fit into these goals? Entropy.
59
Entropy is basically a measure of information: the higher the entropy, the more possible combinations there are. Entropy is measured in bits and is based on probability. If all n outcomes are equally likely, we can use log2 of n to calculate the entropy. An experiment, in the context of entropy and probability, is choosing an outcome. For example, throwing a die and reading off the number is an experiment, because the die falls on each side with a certain probability. Entropy is additive if the experiments are independent. What does that mean?
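As a quick sketch (not from the video), the uniform-outcome formula log2(n) can be checked in Python with `math.log2`, using the fair six-sided die as the example:

```python
import math

# A fair die has 6 equally likely outcomes, so its entropy is log2(6).
die_entropy = math.log2(6)
print(round(die_entropy, 3))  # → 2.585 bits per throw
```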
117.1
If we go back to our die example: if we throw the die three times, the first outcome does not influence the second, and the second does not influence the third. So the experiments are independent, and the entropy is additive. Good, but what does this have to do with passwords? Basically, choosing a password is the result of a random experiment. Let’s take a look at how entropy and passwords relate. Say we have a password consisting of six alphabetic characters, A to Z, for example FGSHTP, and we’re only using uppercase letters for this example. So we have 26, or about 2 to the power of 4.7, possibilities per character, which means our entropy for one character is 4.7 bits.
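To illustrate (a sketch, not part of the course material), both the per-character figure and the additivity of independent throws can be computed directly:

```python
import math

# 26 uppercase letters, each equally likely: log2(26) ≈ 4.7 bits per character.
per_char = math.log2(26)
print(round(per_char, 1))  # → 4.7

# Three independent die throws: the entropies simply add up.
three_throws = 3 * math.log2(6)
print(round(three_throws, 2))  # → 7.75
```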
180.4
H is the symbol for entropy. Now, since choosing a character is an experiment, entropy is additive, and choosing the first character does not influence how we choose the second, we can use the additivity property to calculate the total entropy of this password: 6 times 4.7, which is 28.2 bits. Another example: suppose the password is a constant string, “hello world”, meaning every password used is this string. Since there is only one possible password, the entropy is log2 of 1, which is zero bits. There’s no new information for us because it’s always the same password. Another example is picking a password from a list of 1,024 words.
250.3
1,024 equals 2 to the power of 10, so there are 2 to the power of 10 possible words to choose from. Since all words are equally likely, we choose one from the list at random, and our entropy is log2 of 2 to the power of 10, which is 10 bits. The entropy of a password is related to the work factor required to crack that password. In short, the time to crack a password grows, more or less, with the entropy of the password. So we will aim for our passwords to have a higher entropy, so it will take longer to crack them.
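The three worked examples above can be reproduced in a few lines (a sketch using Python’s `math` module):

```python
import math

# Six uppercase letters: 6 * log2(26) ≈ 28.2 bits.
print(round(6 * math.log2(26), 1))  # → 28.2

# A single constant password: log2(1) = 0 bits of new information.
print(math.log2(1))  # → 0.0

# One word chosen uniformly from a list of 1,024: log2(1024) = 10 bits.
print(math.log2(1024))  # → 10.0
```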
299.9
One way to increase the entropy: instead of only using uppercase letters, as in the first example, we can also use lowercase letters, numbers, and special characters, which increases the entropy per character.
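As a rough comparison (a sketch; the count of 32 special characters is an assumption, roughly the printable ASCII punctuation):

```python
import math

def entropy_bits(charset_size: int, length: int) -> float:
    # Assumes each character is chosen independently and uniformly at random.
    return length * math.log2(charset_size)

# Uppercase only (26 symbols) vs. upper + lower + digits + 32 specials (94 symbols).
print(round(entropy_bits(26, 6), 1))  # → 28.2
print(round(entropy_bits(94, 6), 1))  # → 39.3
```

The larger character set raises the entropy of a six-character password by about 11 bits without making it any longer.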


As we will see, entropy is a measure of information, measured in bits. The higher the entropy, the higher the number of possible combinations (of letters, characters, etc.) in the password you are trying to crack.