What is "entropy"

Physical meaning: a measure of the degree of disorder of a substance's microscopic thermal motion.

Entropy is one of the parameters that characterize the state of matter in thermodynamics, usually denoted by the symbol S. In classical thermodynamics its increment can be defined as dS = (dQ/T)_reversible, where T is the thermodynamic temperature of the substance and dQ is the heat added to the substance in the process; the subscript "reversible" means that the process by which the heat is added is reversible. If the process is irreversible, then dS > dQ/T. The entropy per unit mass of a substance is called the specific entropy and is denoted by s. Entropy was originally introduced, on the basis of the second law of thermodynamics, as a state parameter of matter that reflects the irreversibility of spontaneous processes. The second law of thermodynamics is a law summarized from a large number of observations, and it has the following statements: ① Heat always passes from a hotter object to a colder one; it cannot pass in the opposite direction without causing other changes. ② Work can be completely converted into heat, but no heat engine can completely and continuously convert the heat it receives into work (that is, a perpetual motion machine of the second kind is impossible). ③ In an isolated system, any actual process always increases the entropy of the whole system; this is the principle of entropy increase. Friction irreversibly converts part of the mechanical energy into heat and thereby increases entropy. When heat dQ passes from a hot object at temperature T1 to a cold object at temperature T2, the entropy of the hot object decreases by dS1 = dQ/T1 and the entropy of the cold object increases by dS2 = dQ/T2. Treating the two objects together as one system, the change in entropy is dS = dS2 - dS1 = dQ(1/T2 - 1/T1), which is positive because T1 > T2.
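As a minimal numerical sketch of that last calculation (the heat amount and temperatures below are assumed illustrative values, not taken from the text):

```python
# Entropy change when heat dQ flows from a hot body (T1) to a cold body (T2).
# Illustrative values only; any T1 > T2 > 0 gives a positive total change.
dQ = 100.0   # heat transferred, in joules (assumed value)
T1 = 400.0   # temperature of the hot body, in kelvin (assumed value)
T2 = 300.0   # temperature of the cold body, in kelvin (assumed value)

dS_hot = -dQ / T1            # entropy lost by the hot body
dS_cold = dQ / T2            # entropy gained by the cold body
dS_total = dS_cold + dS_hot  # change for the combined, isolated system

print(f"dS_hot   = {dS_hot:.4f} J/K")
print(f"dS_cold  = {dS_cold:.4f} J/K")
print(f"dS_total = {dS_total:.4f} J/K (positive, as the principle of entropy increase requires)")
```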

◎ In physics, it refers to the quotient obtained by dividing heat by temperature; it indicates the degree to which heat can be converted into work.

◎ In science and technology, it generally refers to a measure of the state of certain material systems, or of the degree to which certain states of a material system may occur. The social sciences also borrow the term to describe the degree of certain states of human society.

In information theory, entropy is a measure of uncertainty.

1. Energy can be converted into work only when the energy density within the system being used is uneven. Energy then tends to flow from regions of higher density to regions of lower density until everything becomes uniform, and it is from this flow of energy that work can be obtained.

The water level at a river's source is relatively high, and the potential energy of the water there is greater than that of the water at the river's mouth. For this reason, water flows downstream into the ocean. If there were no rain, all the water on the continents would eventually flow into the ocean, and the sea level would rise slightly. The total potential energy would then be unchanged, but it would be distributed more uniformly.

It is only when water flows downward that it can turn a waterwheel and do work. Water lying on a single level surface cannot do work; even on a very high plateau, where its potential energy is extremely large, it can do no work. What is decisive here is the difference in energy density and the flow in the direction that evens that difference out, as the sketch below illustrates.
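A small worked example of this point (the mass, heights, and gravitational acceleration are assumed values; only the height difference matters for the work obtained):

```python
# Work obtainable from falling water: W = m * g * (h_top - h_bottom).
g = 9.81      # gravitational acceleration, m/s^2
m = 1000.0    # mass of water, kg (assumed value)

def extractable_work(h_top_m, h_bottom_m):
    """Ideal work (in joules) released when the water drops from h_top_m to h_bottom_m."""
    return m * g * (h_top_m - h_bottom_m)

# Water dropping 50 m near sea level vs. water resting on a 4000 m plateau:
print(extractable_work(50.0, 0.0))       # 490500 J -- a height difference exists
print(extractable_work(4000.0, 4000.0))  # 0 J -- huge potential energy, but no difference
```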

Entropy is a measure of chaos and disorder: the greater the entropy, the greater the disorder. Ours is a universe of increasing entropy, and the second law of thermodynamics expresses this. Life is highly ordered, and intelligence is highly ordered. Why, then, does life appear, and intelligence evolve, in a universe of increasing entropy? The second law also shows that local order (negative entropy) is possible, but only at the cost of greater disorder elsewhere. People need energy and food to survive, at the cost of the death of animals and plants (an increase in entropy). All living things depend on the sun: the order of animals and plants is bought at the expense of the gradual exhaustion of the sun's nuclear reactions (an increase in entropy), or of entropy increase elsewhere. If a person were sealed in a completely closed lead box, they could not maintain their own negative entropy at the expense of entropy increase elsewhere; in such a closed system, the law of entropy increase destroys the order of life. Entropy is the arrow of time, and in this universe it is irreversible; entropy is closely tied to time. If time stopped "flowing", there would be no increase in entropy. Anything we know how to "lock up" is nothing other than "time"; low temperature, too, is "time". Life is an ordered "structure" of matter; "structure" and the concrete matter that realizes it are not concepts on the same level, just as building materials and architectural style are not concepts on the same level. Biology has shown that by the time a person is old enough to surf the Internet, not a single atom in their body is one they were born with; yet the person who was born lives on. A dead body, by contrast, can keep its molecules from being metabolized for a long time. Consciousness is even more ordered than life and can be transmitted between lives. Having said all this, the hierarchical relationship between matter and consciousness should be clear; the word "materialism" is put in quotation marks here because it is not thorough. Why entropy reduction appears in this universe at all is a question that cannot be answered. (Excerpted from the People's Daily BBS forum)

The same is true of any form of energy. In a steam engine there is a hot reservoir that turns water into steam and a cold reservoir that condenses the steam back into water. It is this temperature difference that is decisive: at any single, undifferentiated temperature, no matter how high, no work can be obtained.
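A short sketch of the same point using the standard Carnot limit, eta = 1 - T_cold/T_hot (this formula and the reservoir temperatures are not from the text; they are brought in here only for illustration):

```python
# Carnot efficiency: the largest fraction of heat that any engine operating
# between a hot reservoir (T_hot) and a cold reservoir (T_cold) can turn into work.
def carnot_efficiency(t_hot_k, t_cold_k):
    """Ideal efficiency for reservoir temperatures given in kelvin."""
    return 1.0 - t_cold_k / t_hot_k

print(carnot_efficiency(500.0, 300.0))  # 0.4 -- a temperature difference allows work
print(carnot_efficiency(500.0, 500.0))  # 0.0 -- a single uniform temperature allows none
```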

Entropy is a term coined by the German physicist Clausius (1822–1888) in 1865. He used it to represent how uniformly any kind of energy is distributed in space: the more uniform the energy distribution, the greater the entropy. If the energy distribution of the system under consideration is completely uniform, the entropy of that system is at its maximum.

According to Clausius, in a system left to develop on its own, differences in energy always tend to be eliminated. Put a hot object in contact with a cold one, and heat flows as follows: the hot object cools and the cold object warms until the two reach the same temperature. If two reservoirs are connected and the water level of one is higher than that of the other, gravity will lower the higher level and raise the lower one until the two levels are equal and the potential energy is evened out.
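As an illustrative sketch only (the heat capacities, starting temperatures, and step size are assumed values), a discrete simulation of two bodies in contact shows the temperatures converging while the combined entropy only grows:

```python
# Two bodies in thermal contact, stepped toward equilibrium in small heat increments.
C = 1000.0                     # heat capacity of each body, J/K (assumed equal)
T_hot, T_cold = 400.0, 300.0   # starting temperatures, K (assumed values)
dQ = 100.0                     # heat moved from hot to cold per step, J
S_total = 0.0                  # accumulated entropy change of the pair, J/K

while T_hot - T_cold > 0.01:
    S_total += dQ / T_cold - dQ / T_hot   # the cold body gains more entropy than the hot body loses
    T_hot -= dQ / C
    T_cold += dQ / C

print(f"final temperatures: {T_hot:.2f} K and {T_cold:.2f} K")
print(f"total entropy change: {S_total:.3f} J/K (never negative)")
```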

Therefore, Clausius said, it is a universal law of nature that differences in energy density tend to be evened out; in other words, "entropy increases with time".

In the past, the flow of energy from places of higher density to places of lower density was studied mainly in the form of heat. For this reason, the science of energy flow and of the conversion between energy and work is called "thermodynamics", from Greek words meaning "heat" and "power".

People long ago concluded that energy can be neither created nor destroyed. This is one of the most basic laws, so it is called "the first law of thermodynamics".

Clausius's statement that entropy increases with time appears to be an equally basic universal law, so it is called "the second law of thermodynamics".

2. Entropy in information theory: a measure of information. Shannon, the founder of information theory, proposed a measure of information in "A Mathematical Theory of Communication", based on a probabilistic, statistical model. He defined information as "something used to eliminate uncertainty".

Shannon's formula: I(A) = -log P(A), where the logarithm is usually taken to base 2 so that information is measured in bits.

I(A) measures the information provided by the occurrence of event A and is called the self-information of event A; P(A) is the probability that event A occurs. If a random experiment has N possible outcomes, or a random message has N possible values, with probabilities p1, p2, …, pN, then the average of the self-information of these events is

H = -SUM(pi * log(pi)), i = 1, 2, …, N. H is called the entropy.
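A minimal Python sketch of these two formulas (the example distribution below is made up for illustration and is not from the text):

```python
import math

def self_information(p, base=2):
    """Self-information I(A) = -log(P(A)); base 2 gives bits."""
    return -math.log(p, base)

def entropy(probs, base=2):
    """Shannon entropy H = -sum(p_i * log(p_i)); terms with p_i = 0 contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Assumed example: a message with four possible values and these probabilities.
probs = [0.5, 0.25, 0.125, 0.125]
print(self_information(0.5))  # 1.0 bit for an event of probability 1/2
print(entropy(probs))         # 1.75 bits: the average self-information
```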