What is the abbreviation of BIT in computer field?
The information unit "bit" has two related senses:

1) As a computer term, the bit is a unit of information; the word comes from English "bit". Each digit of a binary number carries one bit of information; for example, the binary number 0101 is four bits.

2) A bit is one digit of a binary number, the smallest unit of information. In digital audio, the signal is represented by electric pulses: "1" stands for a pulse and "0" for the interval between pulses. If each sample point on the waveform is encoded with a four-digit code, it is called four-bit. The more bits used, the more accurately the analog signal is represented and the more faithfully the audio can be reconstructed.
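The idea that more bits give finer resolution can be sketched in a few lines. This is an illustrative example, not from the original text; the function name `quantize` and the value range are assumptions.

```python
# A minimal sketch: quantizing a sample in the range [0.0, 1.0)
# to an n-bit integer code. More bits -> more levels -> finer steps.
def quantize(sample, bits):
    """Map a value in [0.0, 1.0) to the nearest n-bit integer code."""
    levels = 2 ** bits               # an n-bit code distinguishes 2**n levels
    return min(int(sample * levels), levels - 1)

print(quantize(0.5, 4))    # 4 bits: 16 levels, 0.5 maps to code 8
print(quantize(0.5, 8))    # 8 bits: 256 levels, 0.5 maps to code 128
print(2 ** 16)             # a 16-bit audio sample distinguishes 65536 levels
```

With 4 bits the waveform is chopped into only 16 steps; CD-quality audio uses 16 bits per sample, i.e. 65536 steps.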

Bits in a computer

In the binary number system, each 0 or 1 is a bit, and the bit is the smallest unit of data storage. Eight bits make up one byte. The word size of a CPU is the maximum number of bits it can process at one time; for example, a 32-bit CPU can process at most 32 bits of data in a single operation.
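The relationships above (8 bits per byte, 2^n values in an n-bit unit) can be checked directly; this is a small illustrative snippet, not part of the original text.

```python
# 8 bits form one byte, and an n-bit unit can hold 2**n distinct values.
bits_per_byte = 8
print(2 ** bits_per_byte)     # one byte has 256 possible values
print(2 ** 32)                # a 32-bit word: 4294967296 possible values
print((255).bit_length())     # the largest one-byte value, 255, needs 8 bits
print(int('0101', 2))         # the 4-bit binary number 0101 equals decimal 5
```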

Bit, short for binary digit, is a term coined by the statistician John Wilder Tukey (perhaps in 1946, though some sources say 1943). Its first formal published use was on page 1 of Claude Shannon's famous paper A Mathematical Theory of Communication.

Suppose an event can turn out in one of two ways, A or B, with equal probability 0.5. Then a single bit suffices to record which of A or B occurred. For example, one bit can represent:

1) A simple yes/no (affirmative/negative) judgment,

2) a switch with two states (such as a light switch),

3) A transistor (triode) being on or off,

4) Whether or not there is voltage on a wire, or

5) The truth value of a logical proposition (yes/no), and so on.
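The two-state examples above can be sketched in code: a single bit records one on/off distinction, and several such flags can share one integer. The flag names below are purely illustrative assumptions.

```python
# One bit is enough for any two-way distinction: on/off, yes/no, pulse/interval.
light_on = 1      # 1 = pulse / on / yes
light_off = 0     # 0 = interval / off / no

# Several yes/no flags can be packed into one integer, one bit each:
flags = 0
flags |= 1 << 0               # set bit 0, e.g. "power on"
flags |= 1 << 2               # set bit 2, e.g. "muted"
print(bin(flags))             # the two set bits show up as 0b101
print(bool(flags & (1 << 0))) # bit 0 is set: True
print(bool(flags & (1 << 1))) # bit 1 is clear: False
```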

Because a number's length changes when it is converted to binary, one digit in another number system does not always correspond to exactly one binary digit; the relationship is logarithmic. For example, one octal digit is equivalent to three binary digits. Besides binary, the number systems commonly used with computers are octal, decimal, and hexadecimal.
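The logarithmic relationship mentioned above can be made concrete: one digit in base b carries log2(b) bits of information. A short sketch:

```python
import math

# One digit in base b carries log2(b) bits of information.
for base in (2, 8, 10, 16):
    print(base, math.log2(base))
# base 8  -> 3.0 bits: one octal digit equals three binary digits
# base 16 -> 4.0 bits: one hexadecimal digit equals four binary digits
# base 10 -> about 3.32 bits, so a decimal digit is not a whole number of bits
```

This is why octal and hexadecimal are convenient shorthands for binary (digits map to whole groups of bits) while decimal is not.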