In this post we will work through a basic example of calculating self-information, source entropy, and average information rate using Python.
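The script below uses the standard definitions: the self-information of a symbol with probability p(xi) is I(xi) = log2(1/p(xi)) bits, the source entropy is H(X) = sum of p(xi) * I(xi) over all symbols, in bits/symbol, and the average information rate is R = rs * H(X) bits/second, where rs is the symbol rate in symbols/second.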
import math

# Read the probabilities of the four source symbols
px1 = float(input("Enter the first Probability: "))
px2 = float(input("Enter the second Probability: "))
px3 = float(input("Enter the third Probability: "))
px4 = float(input("Enter the fourth Probability: "))

# Symbol rate in symbols per second
rs = int(input("Enter the Symbol Rate: "))

print("Probabilities are : \n", "px1 = ", px1, "\n",
      "px2 = ", px2, "\n", "px3 = ", px3, "\n", "px4 = ", px4, "\n")

# Self-information of each symbol: I(x) = log2(1/p(x)) bits
Ix1 = round(math.log2(1/px1), 3)
Ix2 = round(math.log2(1/px2), 3)
Ix3 = round(math.log2(1/px3), 3)
Ix4 = round(math.log2(1/px4), 3)

print("Informations are : \n", "Ix1 = ", Ix1, "\n", "Ix2 = ",
      Ix2, "\n", "Ix3 = ", Ix3, "\n", "Ix4 = ", Ix4)

# Source entropy: H(X) = sum of p(x) * I(x), in bits/symbol
Hx = (px1*Ix1) + (px2*Ix2) + (px3*Ix3) + (px4*Ix4)
print("\nEntropy is : ", round(Hx, 3))

# Average information rate: R = rs * H(X), in bits/second
R = Hx * rs
print("\nAverage information rate is : ", round(R, 3), "\n")

# ENJOY CODING
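As a quick check, suppose all four symbols are equiprobable (px1 = px2 = px3 = px4 = 0.25) and the symbol rate is 2 symbols/second (these inputs are just an example). Working the formulas by hand:

Ix1 = Ix2 = Ix3 = Ix4 = log2(1/0.25) = 2.0 bits
Hx = 4 * (0.25 * 2.0) = 2.0 bits/symbol
R  = 2.0 * 2 = 4.0 bits/second

Running the script with these inputs should print the same values.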
For any doubts or errors, feel free to ask. Your doubts will be addressed ASAP.