Urgent! Please talk about your understanding of "entropy". Thank you!
Interpretations of entropy: 1. Physically, it is the quotient obtained by dividing heat energy by temperature, indicating the degree to which heat can be converted into work. 2. In science and technology, it is a function used to describe and characterize the disorder of a system; the social sciences also use it to compare the degree of order of certain states of human society. 3. Entropy is also applied to biological order and to phenomena of mental behavior. Scientists devised a measure of the amount of disorder and called it entropy; entropy is thus the degree of chaos, the total amount of disordered internal structure.

Entropy refers to the degree of disorder of a system. It has important applications in cybernetics, probability theory, number theory, astrophysics, the life sciences and other fields; it has more specific definitions in different disciplines and is a very important parameter in each of them. Entropy was put forward by Rudolf Clausius and applied in thermodynamics. Later, Claude Elwood Shannon introduced the concept of entropy into information theory for the first time.

In 1850, the German physicist Rudolf Clausius first put forward the concept of entropy, using it to express how uniformly any kind of energy is distributed in space: the more uniform the energy distribution, the greater the entropy. When the energy of a system is completely evenly distributed, the entropy of the system reaches its maximum. According to Clausius, if a system is allowed to develop naturally, energy differences always tend to be eliminated. Let a hot object come into contact with a cold object: heat flows so that the hot object cools and the cold object warms, until the two objects reach the same temperature. Studying the Carnot heat engine, Clausius used Carnot's theorem to obtain a relation valid for any process: dS ≥ δQ/T. For an adiabatic process δQ = 0, so dS ≥ 0; that is, the entropy of the system remains unchanged in a reversible adiabatic process and increases monotonically in an irreversible adiabatic process. This is the principle of entropy increase. Since all changes in an isolated system involve no exchange with the outside world, they are adiabatic processes, so the principle of entropy increase can also be stated as: the entropy of an isolated system never decreases. It shows that the entropy of an isolated system increases monotonically as the system moves from a non-equilibrium state toward equilibrium, and reaches its maximum when the system reaches the equilibrium state. The change of entropy and its maximum thus determine the direction and the limit of processes in an isolated system; the principle of entropy increase is a statement of the second law of thermodynamics. In 1948, Shannon published "A Mathematical Theory of Communication" in the Bell System Technical Journal, which introduced the concept of entropy into information theory.
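The claim above, that entropy measures how uniformly energy is spread and is maximal when the distribution is completely even, can be illustrated numerically. The following is a minimal Python sketch added for illustration (not from the original text): it treats a fixed amount of energy divided over a few cells as a normalized distribution and evaluates the statistical (Gibbs/Shannon) form of the entropy, −Σ p_i ln p_i; the discretization into four cells is an arbitrary assumption.

```python
import math

def distribution_entropy(energies):
    """Entropy (in nats) of an energy distribution, treating the
    normalized energy fractions p_i as a probability distribution."""
    total = sum(energies)
    ps = [e / total for e in energies if e > 0]
    return -sum(p * math.log(p) for p in ps)

# Same total energy (100 units) spread over 4 cells in different ways.
concentrated = [100, 0, 0, 0]      # all energy in one cell
uneven       = [70, 20, 5, 5]      # partially spread out
uniform      = [25, 25, 25, 25]    # completely even distribution

for name, dist in [("concentrated", concentrated),
                   ("uneven", uneven),
                   ("uniform", uniform)]:
    print(f"{name:>12}: S = {distribution_entropy(dist):.3f} nats")

# The uniform distribution yields the maximum, ln(4) ≈ 1.386 nats,
# matching the statement that evenly spread energy has maximal entropy.
```

The concentrated case gives zero, the partially spread case an intermediate value, and the uniform case the maximum, which is the qualitative content of the paragraph above.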
The origin of the entropy function. The first law of thermodynamics is the law of conservation and transformation of energy, but it does not say whether a given energy transformation can proceed spontaneously, or to what extent. The second law of thermodynamics is the law that judges the direction and limit of spontaneous processes, and it has several equivalent statements: heat cannot spontaneously pass from a low-temperature object to a high-temperature object; heat cannot be transferred from a low-temperature object to a high-temperature object without causing other changes; it is impossible to take heat from a single heat source and convert it entirely into work without producing other changes; a perpetual motion machine of the second kind is impossible. The second law of thermodynamics is a summary of human experience; it cannot be deduced from other, more general laws, but so far no experimental fact has contradicted it. It is one of the basic laws of nature. Since the direction and limit of all thermodynamic changes (including phase changes and chemical changes) can be reduced to the mutual transformation of heat and work and its limit, a universal thermodynamic function can be sought to judge the direction and limit of spontaneous processes. One can expect this function to be a state function and a criterion, the sign of whose change quantitatively describes the tendency of a spontaneous process. This state function is the entropy function.

If any reversible cycle is divided into many small Carnot cycles, one obtains ∑(δQi/Ti)_R = 0 (1), that is, the sum of the heat-to-temperature quotients around any reversible cycle is zero. Here δQi is the heat exchanged between the system and the surroundings in an infinitesimal step of the reversible cycle, and Ti is the temperature of the system in that step. The formula can also be written as ∮(δQ_R/T) = 0 (2). Clausius summarized this regularity, called the corresponding state function "entropy", and denoted it by S, so that dS = δQ_R/T (3). For irreversible processes one obtains dS > δQ/T (4), or dS − δQ/T > 0 (5). For any process (reversible or irreversible) there is dS − δQ/T ≥ 0 (6), where the inequality applies to irreversible processes and the equality to reversible processes. Because irreversibility is the common feature of all spontaneous processes, while every step of a reversible process is infinitely close to equilibrium and is the limit that an irreversible process can approach, formula (6) can be used as a criterion for whether a process is spontaneous; it is called the "entropy criterion". For an adiabatic process δQ = 0, and substituting into the formula gives dS ≥ 0 (7). Therefore, in an adiabatic process the entropy of the system never decreases: for a reversible adiabatic process dS = 0, i.e. the entropy of the system remains unchanged; for an irreversible adiabatic process dS > 0, i.e. the entropy of the system increases. This is the principle of entropy increase, the mathematical expression of the second law of thermodynamics: under isolated or adiabatic conditions, spontaneous processes always proceed in the direction of increasing entropy until the entropy reaches its maximum, at which point the system is in equilibrium.

The statistical significance of the entropy function. Based on the study of the statistics of molecular motion, Boltzmann put forward the formula S = k ln Ω (8), where Ω is the number of microscopic states available to the system's molecules and k is the Boltzmann constant.
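To make formula (8) concrete, here is a small Python sketch added for illustration (not part of the original text): it counts the microstates Ω of a toy system of N independent two-state particles with a fixed number of excited particles (my own choice of example system) and evaluates S = k ln Ω.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_particles, n_excited):
    """S = k * ln(Omega) for a toy system of independent two-state
    particles, where Omega = C(N, n_excited) counts the microstates."""
    omega = math.comb(n_particles, n_excited)
    return K_B * math.log(omega)

N = 100
for n_excited in (0, 10, 50):
    S = boltzmann_entropy(N, n_excited)
    print(f"n_excited = {n_excited:3d}: Omega = {math.comb(N, n_excited):.3e}, "
          f"S = {S:.3e} J/K")

# Omega (and hence S) is largest at n_excited = N/2, the most
# "disordered" macrostate, consistent with the entropy-increase principle.
```

The fully ordered macrostate (no particles excited) has a single microstate and zero entropy, while the evenly split macrostate has by far the most microstates and therefore the largest entropy.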
Formula (8) embodies the statistical significance of the entropy function: it links the macroscopic quantity S with the microscopic quantity Ω of the system, and has become one of the important bridges between the macroscopic and the microscopic. From the relation between entropy and thermodynamic probability it follows that the entropy of a system directly reflects the uniformity of its state: the smaller the entropy, the more ordered and less uniform the state; the greater the entropy, the more disordered and uniform the state. A system always tends to pass spontaneously from a state of smaller entropy to a state of larger entropy (that is, from order to disorder); this is the microscopic physical meaning of the principle of entropy increase for an isolated system.

Basic properties. Entropy is always non-negative: H_S ≥ 0. If N is the total number of events in the system S, then H_S ≤ log2 N, with equality if and only if p1 = p2 = ... = pN, in which case the entropy is maximal. Joint entropy: H(X, Y) ≤ H(X) + H(Y), with equality if and only if X and Y are statistically independent. Conditional entropy: H(X|Y) = H(X, Y) − H(Y) ≤ H(X), with equality if and only if X and Y are statistically independent. Sociological significance: macroscopically, entropy expresses the degree of disorder in the evolution of the world and of society.

Entropy in thermodynamics. Entropy is one of the parameters that characterize the state of matter in thermodynamics, usually denoted by the symbol S. In classical thermodynamics the increment of entropy can be defined as dS = (δQ/T)_reversible, where T is the thermodynamic temperature of the substance and δQ is the heat added to it in the process; the subscript "reversible" means that the change brought about by the heating is reversible. If the process is irreversible, then dS > δQ/T (irreversible). Microscopically, entropy is a measure of the degree of disorder of the large number of microscopic particles that make up a system: the more disordered and chaotic the system, the greater the entropy. The microscopic essence and statistical meaning of the irreversibility of thermodynamic processes is that the system moves from order to disorder, from a less probable state to a more probable state. The entropy per unit mass of a substance is called the specific entropy, written s. Entropy was originally a state parameter introduced, on the basis of the second law of thermodynamics, to reflect the irreversibility of spontaneous processes. The second law of thermodynamics is a law summarized from a large number of observations, and has the following statements: ① heat always passes from a high-temperature object to a low-temperature object, and the opposite transfer cannot take place without causing other changes; ② work can be completely converted into heat, but no heat engine can completely and continuously convert the heat it receives into work (that is, a perpetual motion machine of the second kind is impossible); ③ in an isolated system, actual processes always increase the entropy of the whole system; this is the principle of entropy increase. Friction irreversibly converts a part of mechanical energy into heat, which increases entropy.
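As a numerical illustration of that last point, the following Python snippet (added; the numbers are made up for the example) computes the entropy produced when mechanical work is dissipated by friction into a body at constant temperature, using ΔS = Q/T.

```python
def entropy_from_friction(work_dissipated_j, temperature_k):
    """Entropy produced (J/K) when mechanical work is irreversibly
    dissipated as heat Q into a body held at constant temperature T:
    delta_S = Q / T."""
    return work_dissipated_j / temperature_k

# Example: 500 J of work lost to friction in a block held at 300 K.
Q = 500.0   # J, heat generated by friction
T = 300.0   # K, temperature of the block (assumed constant)
print(f"Entropy produced: {entropy_from_friction(Q, T):.3f} J/K")
# The result is positive, as the entropy-increase principle requires;
# no compensating decrease occurs elsewhere, so total entropy rises.
```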
When heat dQ is transferred from a high-temperature object at T1 to a low-temperature object at T2, the entropy of the high-temperature object changes by dS1 = −dQ/T1 (a decrease) and the entropy of the low-temperature object changes by dS2 = dQ/T2 (an increase). Taking the two objects together as one system, the total change of entropy is dS = dS2 + dS1 = dQ/T2 − dQ/T1, which is positive because T1 > T2.

The physicist Boltzmann defined entropy in terms of the probability of a particular state, that is, the number of ways in which the atoms can be arranged. It can be expressed precisely as S = k log W, where k is a proportionality constant now called the Boltzmann constant.

In the philosophy of science, and in science and technology generally, entropy refers to the degree of possibility of certain states of a material system; the social sciences also use it to compare the degree of certain states of human society. Entropy is a measure of the amount of energy that can no longer be converted into work. The name was given by the German physicist Rudolf Clausius (1822–1888), one of the founders of thermodynamics; it was first introduced in 1868. However, the young French officer, physicist and engineer Sadi Carnot (1796–1832) had already put forward the "Carnot cycle" theorem while studying the efficiency of heat engines, and had discovered the principle underlying entropy 41 years before Clausius. When studying the working principle of the steam engine, Carnot found that the steam engine can do work because one part of the system is very cold while another part is very hot. In other words, to convert energy into work there must be differences in energy concentration between different parts of a system (that is, a temperature difference). When energy passes from a higher concentration to a lower concentration (or from a higher temperature to a lower temperature), it does work. More importantly, every time energy passes from one level to another, the amount of energy available to do work the next time is reduced. For example, a river flows over a dam into a lake. As the water falls it can be used to generate electricity, drive a water wheel, or do other forms of work. But once the water has fallen to the bottom of the dam, it is in a state in which it can do no more work: water with no potential energy left, lying at the lowest level, cannot turn even the smallest wheel. These two states are called "effective" or "free" energy and "ineffective" or "bound" energy, respectively. An increase of entropy means a decrease of effective energy. Whenever anything happens in nature, a certain amount of energy is converted into ineffective energy that can no longer do work. Energy converted into the ineffective state constitutes what we call pollution. Many people think of pollution as a by-product of production, but in fact it is simply the sum of all the effective energy in the world that has been converted into ineffective energy. Wasted energy is pollution. Since, according to the first law of thermodynamics, energy can be neither created nor destroyed, and according to the second law energy can be transformed in only one direction, the direction of dissipation, pollution is in this sense a synonym for entropy: it is a measure of the ineffective energy present in a given system.
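Before turning to the information-theoretic definition, here is a small numerical check of the heat-transfer bookkeeping given near the start of this passage (an added Python sketch with made-up temperatures, not from the original): heat dQ leaving a hot body at T1 and entering a cold body at T2 lowers the first entropy by dQ/T1 and raises the second by dQ/T2, and the net change is positive.

```python
def entropy_change_heat_transfer(dq_j, t_hot_k, t_cold_k):
    """Entropy changes (J/K) when heat dQ flows from a hot body at
    t_hot to a cold body at t_cold, both assumed large enough that
    their temperatures stay approximately constant."""
    ds_hot = -dq_j / t_hot_k    # hot body loses heat: entropy decreases
    ds_cold = dq_j / t_cold_k   # cold body gains heat: entropy increases
    return ds_hot, ds_cold, ds_hot + ds_cold

dQ, T1, T2 = 1000.0, 500.0, 300.0   # J, K, K (illustrative values)
ds1, ds2, ds_total = entropy_change_heat_transfer(dQ, T1, T2)
print(f"dS_hot   = {ds1:+.3f} J/K")
print(f"dS_cold  = {ds2:+.3f} J/K")
print(f"dS_total = {ds_total:+.3f} J/K  (> 0 whenever T1 > T2)")
```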
Information theory. In information theory, entropy is a measure of uncertainty. Shannon, the founder of information theory, put forward a measure of information based on a probabilistic, statistical model in "A Mathematical Theory of Communication". He defined information as "something used to eliminate uncertainty". The definition of entropy in information theory is as follows: if a system S contains events S = {E1, ..., En} with probability distribution P = {p1, ..., pn}, then the information content (self-information) of each event is Ie = −log2 pi (with a base-2 logarithm the unit is the bit; with the natural logarithm the unit is the nat). English has 26 letters; if each letter appeared in text with equal frequency, the information content of each letter would be Ie = −log2(1/26) ≈ 4.7 bits. There are about 2500 commonly used Chinese characters; if each character appeared in text with equal frequency, the information content of each character would be Ie = −log2(1/2500) ≈ 11.3 bits. The average information content of the whole system is H_S = Σi pi Ie = −Σi pi log2 pi. This quantity is also called "entropy" because it has the same form as Boltzmann's formula for thermodynamic entropy.

If two systems carry the same total amount of information, for example the same article written in different languages, then, since the total is the sum over all the symbols of the message, the Chinese version needs fewer characters than the English version needs letters. Articles printed in Chinese characters are therefore shorter than those printed in an alphabet with a smaller symbol set; even though a Chinese character occupies the space of two letters, an article printed in Chinese characters uses less paper than one printed in English letters. In reality, the frequencies with which individual letters and Chinese characters appear are not equal, so the actual values differ from those above, but the calculation conveys the general idea: the more distinct symbols a writing system uses, the more information each symbol carries.

I(A) measures the information provided by the occurrence of event A and is called the self-information of event A; P(A) is the probability that A occurs. If a random experiment has N possible outcomes, or a random message can take N possible values, with probabilities p1, p2, ..., pN, then the probability-weighted sum of the self-information of these events is H = −Σ_{i=1..N} pi log2 pi, which is the entropy of the experiment (or of the message source).
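A short Python sketch added for illustration: it reproduces the uniform-frequency self-information figures quoted above and evaluates H = −Σ pi log2 pi for a small, arbitrary made-up distribution (not real letter-frequency data), checking that H stays below the log2 N upper bound stated earlier.

```python
import math

def self_information(p):
    """Self-information of an event with probability p, in bits."""
    return -math.log2(p)

def shannon_entropy(probs):
    """H = -sum(p_i * log2 p_i), the average self-information in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Figures from the text: equally likely letters / characters.
print(f"One of 26 letters:      {self_information(1/26):.1f} bits")    # ~4.7
print(f"One of 2500 characters: {self_information(1/2500):.1f} bits")  # ~11.3

# An arbitrary, made-up 4-symbol distribution to show H <= log2(N).
probs = [0.5, 0.25, 0.15, 0.10]
print(f"H = {shannon_entropy(probs):.3f} bits "
      f"(upper bound log2 4 = {math.log2(4):.3f} bits)")
```

The non-uniform distribution gives about 1.74 bits, below the 2-bit maximum reached only when all four symbols are equally likely, matching the equality condition stated in the properties above.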