The Hamming distance between two vectors $a$ and $b$ of equal length is the number of positions at which the corresponding elements differ.
In Python it can be calculated as:

```python
def hamming_distance(a, b):
    if len(a) != len(b):
        raise ValueError("a and b must have the same length")
    return sum(x != y for x, y in zip(a, b))
```
Suppose $X$ is a discrete random variable. The probability mass function of $X$, written $p_X$, is defined by:

$$p_X(x) = P(X = x).$$
The mass function is the discrete analog of the density function of a continuous random variable.
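As a sketch of the idea, the mass function of a discrete random variable can be estimated from observed samples by counting relative frequencies (the name `empirical_pmf` is my own, not from the text):

```python
from collections import Counter

def empirical_pmf(samples):
    """Estimate the probability mass function p_X(x) = P(X = x)
    from observed samples of a discrete random variable."""
    counts = Counter(samples)
    n = len(samples)
    # Relative frequency of each observed value.
    return {x: c / n for x, c in counts.items()}
```

For example, `empirical_pmf([1, 1, 2, 3])` assigns probability `0.5` to `1` and `0.25` each to `2` and `3`, and the probabilities always sum to 1.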
Entropy in information theory (also called Shannon entropy) is a generalization of thermodynamic entropy (Boltzmann entropy).
The entropy of a discrete random variable $X$ with possible values $x_1, \ldots, x_n$ and probability mass function $p_X$ is defined by:

$$H(X) = -\sum_{i=1}^{n} p_X(x_i) \log_2 p_X(x_i).$$