The formula for the amount of information is I = log2(1/p), where p is the probability of the event and log2 is the logarithm to base two. The amount of information is the measure needed to select one event from n equally likely events, that is, the minimum number of "yes or no" questions required to identify a specific event among the n events.
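As a rough illustration of this formula, the short Python sketch below evaluates I = log2(1/p); the function name information_amount is chosen here for illustration and is not part of the original text.

```python
import math

def information_amount(p: float) -> float:
    """Amount of information (in bits) of an event with probability p: I = log2(1/p)."""
    return math.log2(1.0 / p)

# A fair coin flip (p = 1/2) carries 1 bit of information.
print(information_amount(0.5))    # 1.0

# Identifying one item among 8 equally likely items (p = 1/8)
# takes at least 3 "yes or no" questions: log2(8) = 3 bits.
print(information_amount(1 / 8))  # 3.0
```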
In information theory, the messages produced by a source are considered random: until a message is received, one cannot be sure which message the source has sent. The purpose of communication is to let the receiver remove as much doubt (uncertainty) about the source message as possible after receiving it, so the uncertainty removed is in fact the amount of information transmitted in communication.
In 1928, R.V.L. Hartley first put forward the idea of quantifying information, defining the amount of information as the logarithm of the number of possible messages. If a source has m possible messages, each produced with equal probability, the information content of this source can be expressed as I = log m. The in-depth and systematic study of information quantity is the pioneering work of C.E. Shannon, beginning in 1948.
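A minimal sketch of Hartley's measure, assuming the base-2 logarithm so the result is in bits; the function name hartley_information is illustrative only.

```python
import math

def hartley_information(m: int) -> float:
    """Hartley's measure: information of a source with m equally likely messages, I = log2(m)."""
    return math.log2(m)

# With m = 8 equally likely messages, each message carries log2(8) = 3 bits,
# consistent with I = log2(1/p) when p = 1/m = 1/8.
print(hartley_information(8))  # 3.0
```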
In daily life, rare events easily attract people's attention when they happen, while common ones do not; in other words, rare events carry a lot of information.
Described in statistical terms, events with a low probability of occurrence carry more information. Therefore, the smaller the probability of an event, the greater the amount of information; the amount of information decreases as the probability (frequency) of the event increases, as illustrated below.
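A brief Python illustration of this relationship, assuming the same formula I = log2(1/p) introduced above.

```python
import math

# Information amount I = log2(1/p) for events of decreasing probability:
for p in (0.9, 0.5, 0.1, 0.01):
    print(f"p = {p:<5} -> I = {math.log2(1 / p):.2f} bits")

# p = 0.9   -> I = 0.15 bits   (a common event carries little information)
# p = 0.01  -> I = 6.64 bits   (a rare event carries much information)
```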
Mathematical methods have the following three basic characteristics:
1. High abstraction and generality.
2. Precision, that is, logical rigor and certainty of conclusions.
3. Universality and operability in application.