Information Content, Surprise and Probability [DL 103]

Hello Sir,

I’m going through the course and I must say that this is one of the best that I have come across.

While going through the Information Content module, you introduce it by saying that IC is proportional to surprise (or inversely proportional to probability): the more the surprise, the lower the probability, and hence the higher the IC. Towards the end of the video, when you explain the formula using the log, you say that more surprise means less IC. This seems contradictory to the introduction you provided. Could you please provide some clarity on this?


You are right: the IC associated with an event is proportional to its surprise, and hence inversely proportional to its probability.
While constructing a mathematical formula for IC, the ‘log’ is introduced because it captures the idea that the IC of two independent events is simply the sum of the ICs of the individual events!
But the log representation doesn’t contradict the IC/surprise intuition. Can you point out which video clip, and where in that video, the confusion arises?
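To make this concrete, here is a minimal sketch of the usual definition, IC(p) = -log2(p) in bits (the function name and base-2 choice are my own illustration, not taken verbatim from the video). It shows both properties at once: lower probability (more surprise) always gives higher IC, and the log makes IC additive for independent events.

```python
import math

# Standard definition (notation assumed for illustration):
# IC(p) = -log2(p) = log2(1/p), measured in bits.
def information_content(p: float) -> float:
    """Information content of an event with probability p (0 < p <= 1)."""
    return -math.log2(p)

# More surprise (lower probability) => higher IC, never lower:
rare = information_content(0.01)    # surprising event, high IC
common = information_content(0.99)  # expected event, low IC
print(rare > common)  # True

# The log makes IC additive for independent events:
# IC(p * q) = IC(p) + IC(q)
p, q = 0.5, 0.25
print(math.isclose(information_content(p * q),
                   information_content(p) + information_content(q)))  # True
```

So IC never decreases with surprise; the log only reshapes 1/p so that independent surprises add up.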



This is in Information Theory and Cross Entropy: Information Content [DL#103 - Sigmoid Neuron]

OK, now I noticed some discrepancy in the use of the word ‘proportional’ too.
‘More surprise means more IC’ gives the correct intuition, and the mathematical definition of IC supports it too.

It’s safe to ignore the ‘more surprise, less IC’ statement.