Shannon entropy

“Suppose we have purchased a box (possibly from Eve!) which generates a string x = x₁, …, xₙ. If the string was uniformly random, then p_x = 1/2^n and H(X) = n.”
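Unpacking the quoted claim: with log taken base 2 and all 2^n strings equally likely,

H(X) = -\sum_{x \in \{0,1\}^n} p_x \log p_x = -\sum_{x \in \{0,1\}^n} 2^{-n} \log 2^{-n} = 2^n \cdot 2^{-n} \cdot n = n.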

import Random_variable

def Shannon_entropy (p : ι → ℝ) [rnd_var p] : ℝ :=
- ∑ i, (p i) * log (p i)
/-- non-negative for discrete random variables -/
theorem Shannon_entropy_nonneg (X : ι → ℝ) [rnd_var X] : 
  Shannon_entropy X ≥ 0 := 
begin
  intros,
  rw Shannon_entropy,
  -- move the minus inside
  simp only [← mul_neg_eq_neg_mul_symm, ← sum_neg_distrib],
  -- each term in the sum is nonnegative
  have H2 : ∀ i, - (X i) * log (X i) ≥ 0, {
    intros,
    have h : - (X i) * log (X i) = (X i) * (- log (X i)), by ring,
    rw h,
    apply mul_nonneg,
    {apply probs_nonneg},
    {
      rw neg_nonneg,
      apply log_nonpos,
      {apply probs_nonneg},
      {apply probs_le_one},
    },
  },
  -- so we have that the whole sum is nonneg
  apply sum_nonneg,
  simp only [neg_mul_eq_neg_mul_symm, ge_iff_le, mul_neg_eq_neg_mul_symm, neg_nonneg] at *,
  intros i hi,
  exact H2 i,
end

Now consider a random variable with Shannon_entropy ≈ n/2, where n is the number of bits, i.e. the log of the cardinality of the range of the random variable.

The distribution produces 11…1 with probability 1/2.
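One way to check the ≈ n/2 figure, assuming the remaining probability mass of 1/2 is spread uniformly over the other 2^n − 1 strings (each then having probability ≈ 2^{-(n+1)}):

H(X) \approx -\tfrac{1}{2}\log\tfrac{1}{2} - (2^n - 1)\, 2^{-(n+1)} \log 2^{-(n+1)} \approx \tfrac{1}{2} + \tfrac{n+1}{2} \approx \tfrac{n}{2}.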

“But is there a lot of uncertainty for Eve? Note that the probability that the box generates the string x = 11…1 is 1/2, independent of the length of the string! This means that whenever we use x as an encryption key, Eve will be able to guess the key, and thus decrypt the message with probability 1/2. Eve’s probability of guessing is extremely large, even when we send a very large message.”


/-- negative log of the largest outcome probability -/
def min_entropy (p : ι → ℝ) [rnd_var p] : ℝ :=
- log (⨆ i, p i)

For the example above, min_entropy X = - log (1/2) = 1 (log base 2).
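For contrast, the uniform distribution from the opening quote has

H_min(X) = -\log \max_x p_x = -\log 2^{-n} = n,

matching its Shannon entropy, whereas for the box above the min-entropy stays at 1 no matter how long the string is.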

“the min-entropy is constant!”

“min-entropy captures what it means for Eve to be uncertain about x: Eve could guess the string with probability 1/2. In general, we would always guess the most likely string, and the probability that we are correct is precisely P_guess(X) = max_x p_x. The min-entropy thus has a very neat operational interpretation.”
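In the same Lean style as the definitions above, the guessing probability could be sketched as follows (p_guess is a hypothetical name, not from the source; ⨆ denotes the supremum, here a maximum over the finite index type):

-- hypothetical helper: probability that the best single guess is correct
noncomputable def p_guess (p : ι → ℝ) [rnd_var p] : ℝ :=
⨆ i, p i

-- min-entropy is then - log of the guessing probability,
-- making the operational interpretation explicit:
-- min_entropy p = - log (p_guess p)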

References

  • edX Quantum Cryptography, Lecture 3 notes.