Thursday, December 22, 2011

Information as a measure of uncertainty | Information in communication systems

In my previous post, I gave an introduction to Information Theory. In this post I am going to discuss information, its unit, its uncertainty, and the mathematical relation between them.
First, consider the amount of information contained in a message. Have you ever wondered about this?
Let us take an example: suppose you are planning a tour of a city located in an area where rainfall is very rare.
To know the weather forecast, you call the weather bureau and may receive one of the following messages:
1) It would be hot and sunny.
2) There would be rain.
3) There would be a cyclone.
It may be observed that the amount of information received is clearly different for the three messages. The first message, for instance, contains very little information, because the weather in a desert city in summer is expected to be hot and sunny most of the time.
The second message, rain, contains more information, because rain is not an event that occurs often there.
The third message, a cyclonic storm, contains the maximum amount of information of the three, because that event almost never happens in the city.
Thus,
There is an inverse relationship between the probability of an event and the information associated with it.

 I(x) = f[1 / p(x)]
Where I(x) represents the information content of a message x, f means it is a function of, and 1/p(x) is the reciprocal of the probability of x. In practice this function is taken to be the logarithm; with base 2, information is measured in bits:

 I(x) = log2[1 / p(x)] = -log2 p(x)
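The weather example above can be sketched in a few lines of Python. The probabilities below are assumed values chosen only for illustration (the post does not give exact numbers); the rarer the event, the more bits of information its message carries.

```python
import math

# Hypothetical probabilities for the three forecasts (assumed values,
# not from the post, chosen only to illustrate the inverse relation).
events = {
    "hot and sunny": 0.90,
    "rain":          0.09,
    "cyclone":       0.01,
}

for event, p in events.items():
    info_bits = math.log2(1 / p)  # I(x) = log2(1/p(x)), in bits
    print(f"{event}: p(x) = {p}, I(x) = {info_bits:.2f} bits")
```

Running this shows the cyclone message carrying far more bits than the sunny one, exactly the inverse relationship described above.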

Properties of Information content I (x) :
1)   I(x) = 0 for p(x) = 1
2)   I(x) ≥ 0
3)   I(x_i) > I(x_j) if p(x_i) < p(x_j)
4)   I(x_i x_j) = I(x_i) + I(x_j) if x_i and x_j are independent events
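These four properties can be checked numerically. The sketch below uses the base-2 logarithm definition from above; the probability values are arbitrary examples, not taken from the post.

```python
import math

def info(p):
    """Self-information I(x) = log2(1/p(x)), measured in bits."""
    return math.log2(1 / p)

# 1) A certain event (p = 1) carries no information.
assert info(1.0) == 0.0

# 2) Information is never negative, since p(x) <= 1.
assert info(0.3) >= 0

# 3) The rarer event carries more information.
assert info(0.01) > info(0.5)

# 4) For independent events, information content adds,
#    because probabilities of independent events multiply.
p1, p2 = 0.2, 0.5
assert math.isclose(info(p1 * p2), info(p1) + info(p2))

print("all four properties hold")
```

Property 4 is what makes the logarithm the natural choice of function f: multiplying probabilities turns into adding information.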
NEXT POST : Entropy and its properties.


About The author

Himanshu Dureja is an engineering student and part time blogger.