Calculation of channel capacity
Shannon–Hartley formula for channel capacity: C = B * log2(1 + SNR), where B is the channel bandwidth in hertz and SNR is the signal-to-noise power ratio.

Channel capacity is the maximum information rate that a channel can transmit without error. For a single-user channel, with one source and one sink, it is a single number, in bits per second or bits per channel symbol: it gives the largest amount of information that can be carried per second or per channel symbol, and any information rate below this number can be transmitted over the channel without error.
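As a quick numeric check of the formula above, here is a minimal sketch in Python; the bandwidth and SNR values are assumed (roughly a telephone-grade line), not taken from this article:

```python
import math

# Worked example of C = B * log2(1 + SNR), assuming a telephone-grade
# channel: bandwidth B = 3000 Hz and SNR = 1000 (i.e. 30 dB).
B = 3000.0    # bandwidth in hertz (assumed value)
snr = 1000.0  # linear signal-to-noise power ratio (assumed value)

C = B * math.log2(1.0 + snr)
print(f"C = {C:.0f} bit/s")  # about 29,902 bit/s
```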

For a multi-user channel with two sources and two sinks, the capacity is instead a closed curve in the plane, shown in the figure as O C1 A B C2 O. The coordinates R1 and R2 are the information rates of the two sources; whenever the rate pair (R1, R2) falls within this closed curve, both sources can transmit without error. With m sources and m sinks, the channel capacity becomes the boundary "surface" of a convex region in m-dimensional space.
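As a purely hypothetical illustration of "falling within the closed curve", the sketch below tests a rate pair against a pentagon-shaped region of the kind drawn in the figure; the bounds c1, c2, and c_sum are invented example numbers, not derived from any particular channel:

```python
def in_capacity_region(r1, r2, c1=1.0, c2=1.0, c_sum=1.5):
    """Hypothetical pentagon region O-C1-A-B-C2-O: a per-source rate
    limit on each axis plus a sum-rate limit. All three bounds are
    made-up example values."""
    return 0.0 <= r1 <= c1 and 0.0 <= r2 <= c2 and r1 + r2 <= c_sum

print(in_capacity_region(0.7, 0.6))  # True: inside, error-free transmission possible
print(in_capacity_region(0.9, 0.9))  # False: the sum rate crosses the boundary
```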

Information theory does not study the physical process of signal transmission through a channel. It assumes that the transmission characteristics of the channel are known, so that the channel can be described by an abstract mathematical model.

In information theory, a channel is usually written as {X, P(Y|X), Y}: an input random variable X, an output random variable Y, and the conditional probability distribution P(Y|X) of the output given the input. According to whether the statistical characteristics of the channel change with time, channels fall into the two classes below (a code sketch of this channel model follows them):

Constant-parameter channel (stationary channel): the statistical characteristics of the channel do not change with time. The satellite communication channel, for instance, can be approximated as a constant-parameter channel.

Variable-parameter channel (non-stationary channel): the statistical characteristics of the channel change with time. The shortwave communication channel, for instance, is treated as a variable-parameter channel.
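As promised above, here is a minimal sketch of the abstract model {X, P(Y|X), Y} in code: the channel is reduced to its transition matrix, and transmission is sampling from P(Y|X). The binary symmetric channel and its crossover probability of 0.1 are assumed example values, not anything specified in this article:

```python
import numpy as np

rng = np.random.default_rng(0)

# A discrete memoryless channel {X, P(Y|X), Y} reduced to its transition
# matrix: W[x, y] = P(Y = y | X = x), each row summing to 1. Here a
# binary symmetric channel with crossover 0.1 (assumed example).
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])

def transmit(x, W, rng):
    """Pass one input symbol x through the channel by sampling Y ~ P(.|x)."""
    return rng.choice(len(W[x]), p=W[x])

x = np.array([0, 1, 1, 0, 1])
y = np.array([transmit(xi, W, rng) for xi in x])
print("sent:    ", x)
print("received:", y)
```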

Channel capacity is a parameter of the channel itself: it reflects the maximum amount of information the channel can carry, and its value does not depend on the source. Over all possible input probability distributions, the mutual information between input and output attains a maximum value, and we define this maximum as the capacity of the channel. Once the transition probability matrix is fixed, the channel capacity is completely determined.
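In symbols (standard information-theoretic notation, supplied here for completeness), with the maximization running over all input distributions P(X), the definition in the previous paragraph reads:

```latex
C \;=\; \max_{P(X)} I(X;Y) \qquad \text{(bits per channel symbol)}
```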

Although the definition of channel capacity involves the input probability distribution, its value does not depend on any particular input distribution. The candidate input distributions are called test sources; different test sources yield different mutual information, and some test source must maximize it. That maximum is the channel capacity.
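To make the search over test sources concrete, the following is a minimal sketch of the Blahut–Arimoto algorithm, one standard way of numerically finding the maximizing input distribution. The channel matrix W and its values are assumed examples, not anything specified in this article:

```python
import numpy as np

def channel_capacity(W, tol=1e-9, max_iter=1000):
    """Blahut-Arimoto sketch: capacity in bits/symbol of a discrete
    memoryless channel with transition matrix W, W[x, y] = P(Y=y | X=x)."""
    p = np.full(W.shape[0], 1.0 / W.shape[0])  # uniform test source to start
    for _ in range(max_iter):
        q = p @ W                              # output distribution q(y)
        # D[x] = sum_y W(y|x) * log2(W(y|x) / q(y)), with 0*log(0) := 0
        ratio = np.ones_like(W)
        np.divide(W, q, out=ratio, where=W > 0)
        D = np.sum(W * np.log2(ratio), axis=1)
        lower = np.log2(np.sum(p * 2.0 ** D))  # lower bound on capacity
        if np.max(D) - lower < tol:            # max(D) is an upper bound
            return lower, p
        p = p * 2.0 ** D                       # re-weight the test source
        p /= p.sum()
    return lower, p

# Example: binary symmetric channel with crossover 0.1.
# Closed form: C = 1 - H2(0.1) = 0.5310 bits/symbol.
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
C, p_opt = channel_capacity(W)
print(f"C = {C:.4f} bits/symbol, maximizing input distribution {p_opt}")
```

The iteration tightens a lower and an upper bound on the capacity until they meet, so the loop doubles as its own convergence test.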

Channel capacity is sometimes expressed as the number of bits that can be transmitted per unit time (called the data transmission rate, or bit rate, of the channel), in bits per second (b/s), abbreviated bps.

The purpose of communication is to obtain information, and the amount of information is measured with the concept of entropy. When a signal passes through the channel, two entropies are involved: the source entropy at the transmitter, i.e. the uncertainty about the originating source, and the conditional entropy at the receiver, i.e. the uncertainty that remains about the originating source once the signal has been received. The information actually delivered is the difference between the two.
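In standard notation, writing H(X) for the transmitter-side source entropy and H(X|Y) for the receiver-side conditional entropy (the equivocation), the information delivered per symbol is:

```latex
I(X;Y) \;=\; H(X) \;-\; H(X \mid Y)
```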