Importance of Entropy in Cryptography
You can measure the quality of cryptographic functions by estimating the entropy of their outputs. For encryption and hash functions, we prefer algorithms whose outputs have high entropy.
Moreover, entropy is crucial for generating random inputs to cryptographic algorithms, such as secret keys, nonces, and initialization vectors. Since these values must be purely random, i.e., unpredictable and completely secret, we need high-entropy sources of data.
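For instance, here is a minimal Python sketch of how such values are generated in practice, assuming Python's standard secrets module, which draws from the operating system's entropy pool (the sizes of 32 and 12 bytes are illustrative choices):

```python
import secrets

# Draw a 256-bit key and a 96-bit nonce from the OS entropy pool
key = secrets.token_bytes(32)
nonce = secrets.token_bytes(12)

print("key:  ", key.hex())
print("nonce:", nonce.hex())
```

The point is that these values come from a high-entropy source rather than an ordinary pseudorandom generator.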
Calculating Entropy
Let's first understand entropy through an example.
Suppose you have a discrete random variable X that denotes the toss of a coin. We know that there are only two possible outcomes: heads and tails.
So, Probability[heads] = Probability[tails] = ½.
Now, what is the entropy of a coin toss?
The entropy, or the information content, of the coin toss is 1 bit. Why? Because we can encode the two outcomes, heads and tails, using 1 bit, say, 1 for heads and 0 for tails.
Similarly, if you have n coin tosses, then the entropy will be n bits, one bit for each coin toss. You can encode the n coin tosses by a binary string of length n, where the i’th character represents the result of the i’th toss.
Basically, the entropy of a random variable X following a specific probability distribution is nothing but the minimum number of bits needed, on average, to encode its outcomes.
More formally, we have -
An outcome occurring with probability p might be encoded by a bitstring of length ⌈log2(1/p)⌉. For example, an outcome with probability 1/8 can be encoded in ⌈log2 8⌉ = 3 bits.
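As a quick sanity check, here is a small Python sketch (assuming only the standard math module) that computes this code length for a few probabilities:

```python
import math

def code_length(p: float) -> int:
    # Length in bits of an optimal encoding for an outcome of probability p
    return math.ceil(math.log2(1 / p))

for p in (1/2, 1/4, 1/8, 1/3):
    print(f"p = {p:.3f} -> {code_length(p)} bit(s)")
```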
Moving towards a more formal definition of entropy, we have -
Suppose X is a discrete random variable that takes on values from a finite set X. Then, the entropy H(X) of the random variable X is defined to be the quantity

H(X) = − ∑x∈X Pr[x] log2 Pr[x]

where Pr[x] is the probability of x. (Terms with Pr[x] = 0 are omitted from the sum, since p log2 p → 0 as p → 0.)
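This definition translates directly into code. Below is a minimal Python sketch of it, assuming only the standard math module; the sample distributions are illustrative:

```python
import math

def entropy(probs):
    # Shannon entropy H(X) = -sum(p * log2(p)), skipping zero-probability terms
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin toss: 1.0 bit
print(entropy([0.25] * 4))   # uniform over 4 outcomes: log2(4) = 2.0 bits
print(entropy([1.0]))        # certain outcome: 0.0 bits
print(entropy([0.9, 0.1]))   # biased coin: ~0.469 bits
```

These sample values also illustrate the properties listed next.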
Some interesting points about H(X) are:
- In the particular case where |X| = n and Pr[x] = 1/n for all x ∈ X, H(X) is equal to log2 n.
- H(X) ≥ 0 for any random variable X.
- H(X) = 0 if and only if Pr[x0] = 1 for some x0 ∈ X and Pr[x] = 0 for all x ≠ x0.
How to Compute the Entropy of a Cryptosystem?
Suppose the random variables associated with the key, plaintext, and ciphertext are K, P, and C, respectively.
Then we can find the entropies for each of them individually as H(K), H(P), and H(C).
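As an illustration, here is a hedged Python sketch that computes these three entropies for a toy cryptosystem. The key and plaintext distributions and the encryption table below are assumptions chosen for illustration (in the style of a standard textbook example), not a real cipher:

```python
import math

def entropy(probs):
    # Shannon entropy of a discrete probability distribution
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical toy cipher: two plaintexts, three keys, ciphertexts 1-4
plain_dist = {"a": 0.25, "b": 0.75}             # Pr[P]
key_dist = {"K1": 0.5, "K2": 0.25, "K3": 0.25}  # Pr[K]
enc = {                                         # e_k(p) = c
    "K1": {"a": 1, "b": 2},
    "K2": {"a": 2, "b": 3},
    "K3": {"a": 3, "b": 4},
}

# Pr[C = c] = sum of Pr[K = k] * Pr[P = p] over all (k, p) with e_k(p) = c
cipher_dist = {}
for k, pk in key_dist.items():
    for p, pp in plain_dist.items():
        c = enc[k][p]
        cipher_dist[c] = cipher_dist.get(c, 0.0) + pk * pp

print("H(P) =", entropy(plain_dist.values()))   # ~0.81 bits
print("H(K) =", entropy(key_dist.values()))     # 1.5 bits
print("H(C) =", entropy(cipher_dist.values()))  # ~1.85 bits
```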
Properties of Entropy
In this section, we will see some of the fundamental results related to entropy, along with their proofs.
Before moving further, let's introduce a theorem, “Jensen’s inequality,” which is a prerequisite for the properties we discuss.
Jensen’s inequality
Suppose f is a continuous, strictly concave function on the interval I. Suppose further that

a1 + a2 + · · · + an = 1

and ai > 0, 1 ≤ i ≤ n. Then

f( ∑i ai xi ) ≥ ∑i ai f(xi),

where xi ∈ I, 1 ≤ i ≤ n. Additionally, equality occurs if and only if x1 = · · · = xn.
Theorem-1
Suppose X is a random variable having a probability distribution that takes on the values p1, p2, . . . , pn, where pi > 0, 1 ≤ i ≤ n. Then H(X) ≤ log2 n, with equality if and only if pi = 1/n, 1 ≤ i ≤ n.
Let's see the proof of this theorem below.
Applying Jensen's inequality with ai = pi and xi = 1/pi (log2 is continuous and strictly concave), we get:

H(X) = − ∑i pi log2 pi = ∑i pi log2(1/pi) ≤ log2( ∑i pi · (1/pi) ) = log2 n.

We can see that equality occurs if and only if all the values 1/pi are equal, i.e., pi = 1/n, 1 ≤ i ≤ n.
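A quick numerical sanity check of this bound, reusing the entropy sketch from earlier (the distributions below are arbitrary examples over n = 4 outcomes):

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 4
for probs in ([0.25, 0.25, 0.25, 0.25],   # uniform: equality holds
              [0.7, 0.1, 0.1, 0.1],
              [0.4, 0.3, 0.2, 0.1]):
    print(f"H(X) = {entropy(probs):.4f} <= log2({n}) = {math.log2(n):.4f}")
```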
Theorem-2
H(X, Y) ≤ H(X) + H(Y), with equality if and only if X and Y are independent random variables.
Let's see the proof of this theorem below.
We compute H(X) + H(Y) as follows:

H(X) + H(Y) = − ( ∑i pi log2 pi + ∑j qj log2 qj )
            = − ( ∑i ∑j rij log2 pi + ∑i ∑j rij log2 qj )
            = − ∑i ∑j rij log2(pi qj)
Here,
- pi = Pr[X = xi ], 1 ≤ i ≤ m
- qj = Pr[Y = yj ], 1 ≤ j ≤ n
- rij = Pr[X = xi, Y = yj], 1 ≤ i ≤ m, 1 ≤ j ≤ n, i.e., the joint probability distribution of X and Y
Also, ∑j rij = pi and ∑i rij = qj (the marginal distributions), which justifies the second step above.
Similarly, we compute H(X, Y) as

H(X, Y) = − ∑i ∑j rij log2 rij
Now, we combine the above two equations as

H(X, Y) − H(X) − H(Y) = ∑i ∑j rij log2( pi qj / rij )
                      ≤ log2( ∑i ∑j rij · (pi qj / rij) )    (by Jensen's inequality)
                      = log2( ∑i ∑j pi qj )
                      = log2 1 = 0,

so H(X, Y) ≤ H(X) + H(Y).
You can observe that equality, i.e., H(X, Y) = H(X) + H(Y), occurs if and only if there is a constant c such that pi qj / rij = c for all i, j (the equality condition in Jensen's inequality).
It follows that c = 1, since

1 = ∑i ∑j pi qj = ∑i ∑j c · rij = c · ∑i ∑j rij = c.
So, equality occurs if and only if rij = pi qj, i.e., Pr[X = xi, Y = yj] = Pr[X = xi] Pr[Y = yj].
This means that X and Y are independent. Hence, proved.
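To see the theorem in action, here is a short Python sketch with a hypothetical joint distribution (the numbers are made up, and X and Y are deliberately not independent):

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution: joint[i][j] = Pr[X = xi, Y = yj]
joint = [[0.30, 0.10],
         [0.20, 0.40]]

px = [sum(row) for row in joint]        # marginals pi = Pr[X = xi]
py = [sum(col) for col in zip(*joint)]  # marginals qj = Pr[Y = yj]

h_xy = entropy([p for row in joint for p in row])
print(f"H(X, Y)     = {h_xy:.4f}")                       # ~1.85 bits
print(f"H(X) + H(Y) = {entropy(px) + entropy(py):.4f}")  # ~1.97 bits >= H(X, Y)
```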
Conditional Entropy
The conditional entropy of two random variables X and Y, written H(X|Y), is defined as the average amount of information about X that Y does not reveal.
We calculate the weighted average of the entropies H(X|y) over all possible values y of Y to find the value of H(X|Y).
H(X|y), i.e., the entropy of the conditional probability distribution of X for a fixed value y of Y, is given as

H(X|y) = − ∑x Pr[x|y] log2 Pr[x|y]
So, we compute H(X|Y) as

H(X|Y) = ∑y Pr[y] H(X|y) = − ∑y ∑x Pr[y] Pr[x|y] log2 Pr[x|y]
The properties associated with conditional entropy are as follows:
- H(X, Y) = H(Y) + H(X|Y)
- H(X|Y) ≤ H(X), with equality if and only if X and Y are independent.
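A short Python sketch can verify the first property (the chain rule) on the same hypothetical joint distribution used above (the numbers are illustrative assumptions):

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution: joint[i][j] = Pr[X = xi, Y = yj]
joint = [[0.30, 0.10],
         [0.20, 0.40]]

py = [sum(col) for col in zip(*joint)]  # Pr[Y = yj]

# H(X|Y) = sum over j of Pr[y_j] * H(X | y_j)
h_x_given_y = 0.0
for j, qj in enumerate(py):
    cond = [joint[i][j] / qj for i in range(len(joint))]  # Pr[x_i | y_j]
    h_x_given_y += qj * entropy(cond)

h_xy = entropy([p for row in joint for p in row])
print(f"H(Y) + H(X|Y) = {entropy(py) + h_x_given_y:.4f}")
print(f"H(X, Y)       = {h_xy:.4f}")  # matches: H(X, Y) = H(Y) + H(X|Y)
```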
Sources of Entropy in Computing
Computers can generate random numbers, but did you know that those numbers are not truly random?
Instead, they are pseudorandom: produced by a deterministic algorithm from an initial seed.
So, to seed such generators with genuine unpredictability, some of the sources used to derive entropy are as follows:
✅ Audio and Video data
✅ The timing of interrupts from storage devices like hard drives
✅ Mouse movements
✅ Timings of keyboard inputs
✅ Timing of network packets
✅ Circuit Voltage
✅ Thermal Readings
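As a toy illustration of the general idea, here is a hedged Python sketch that mixes noisy timing samples into a pool using a cryptographic hash. This only sketches the concept; real operating systems collect and estimate entropy far more carefully:

```python
import hashlib
import time

# Toy entropy pool: mix noisy timing samples into a running hash state.
pool = hashlib.sha256()

for _ in range(1000):
    # The low-order bits of a high-resolution timer fluctuate with
    # scheduling, cache, and interrupt noise -- a weak jitter source.
    sample = time.perf_counter_ns()
    pool.update(sample.to_bytes(8, "little"))

seed = pool.digest()  # 32 bytes of seed material derived from the pool
print(seed.hex())
```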
Frequently Asked Questions
What is low entropy in cryptography?
Low-entropy data is partially predictable: it gives an attacker the ability to guess forthcoming generated values, such as keys or nonces, with better-than-chance success.
Does encryption change entropy?
Encryption, like compression, increases the measured entropy per bit of the data: the statistical patterns present in natural-language plaintext are removed, so the ciphertext appears close to uniformly random.
Why is Minimum entropy significant in cryptography?
The min-entropy is a widely used metric to quantify the randomness of generated random numbers in cryptographic applications; it measures the difficulty of guessing the most likely output.
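Concretely, for a distribution with probabilities p1, . . . , pn, the min-entropy is H∞(X) = −log2(max pi). A minimal Python sketch comparing it with Shannon entropy (the skewed distribution is an arbitrary example):

```python
import math

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    # Difficulty (in bits) of guessing the most likely outcome
    return -math.log2(max(probs))

skewed = [0.5, 0.25, 0.125, 0.125]
print(f"Shannon entropy: {shannon_entropy(skewed):.3f} bits")  # 1.750
print(f"Min-entropy:     {min_entropy(skewed):.3f} bits")      # 1.000
```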
What does entropy mean in security?
Entropy is a measure of the amount of uncertainty an attacker faces in determining the value of a secret.
Conclusion
In this article, we learned about entropy in cryptography, the importance of entropy, how to calculate entropy, and its properties. Finally, we also saw some of the sources of entropy used in the real world.
We hope this blog has helped you enhance your knowledge of Entropy in Cryptography.
Check out these fantastic blogs on Cryptanalysis -
🎯 What is Linear Cryptanalysis?
🎯 Substitution-Permutation Networks (SPN) in Cryptography
🎯 Difference Between Differential and Linear Cryptanalysis
Refer to our guided paths on Coding Ninjas Studio to learn more about DSA, Competitive Programming, JavaScript, System Design, etc. Enroll in our courses, refer to the mock tests and problems available, check out the interview puzzles, and take a look at the interview experiences and interview bundle for placement preparations.
Do upvote our blog to help other ninjas grow.
Happy Learning!!