## Table of Contents

- Entropy Definition
- Development of Entropy Concept
- Entropy Equation
- Applications of Entropy
- Other Uses of Entropy
- Lesson Summary

What is entropy? Learn the definition and formula of entropy. Also learn about entropy specific to gases, examples that use the entropy formula, and applications.
Updated: 08/28/2021


Entropy is a measure of how disordered and random a system is. It is related to the **Second Law of Thermodynamics**, which states that unless outside energy is provided, a system's entropy (disorder) stays the same or increases as time goes on. In other words, a system will never become more ordered without outside intervention.

The concept of entropy was developed in the early 1850s by Rudolf Clausius in response to the observation that some of the usable energy released by **combustion reactions** is lost to dissipation or friction and is therefore never converted into useful work. Clausius put forth the concept of the **thermodynamic** system and argued that, in an irreversible process, a small amount of heat energy is incrementally dissipated across the system boundary. He continued to develop the idea of lost energy and coined the term entropy.

In equations, entropy is usually denoted by the letter S and has units of joules per kelvin {eq}\left ( J \cdot K^{-1} \right ) {/eq} or {eq}kg \cdot m^{2} \cdot s^{-2} \cdot K^{-1} {/eq}. A highly ordered system has low entropy.

The entropy change of a process is defined as the amount of heat emitted or absorbed **isothermally** and reversibly, divided by the absolute temperature. If we add the same quantity of heat at a higher temperature and at a lower temperature, the increase in randomness is greater at the lower temperature. This shows that, for a given amount of heat, the entropy change is inversely proportional to the temperature.
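This definition can be sketched in a few lines of Python. The heat and temperature values below are illustrative, not from the lesson:

```python
# Entropy change for reversible, isothermal heat transfer: dS = Q_rev / T.

def entropy_change(q_joules, temp_kelvin):
    """Return the entropy change Q/T in J/K."""
    return q_joules / temp_kelvin

q = 1000.0  # heat added, in joules (arbitrary example value)

ds_cold = entropy_change(q, 250.0)  # same heat at a lower temperature
ds_hot = entropy_change(q, 500.0)   # same heat at a higher temperature

print(f"dS at 250 K: {ds_cold:.1f} J/K")  # 4.0 J/K
print(f"dS at 500 K: {ds_hot:.1f} J/K")   # 2.0 J/K
# The same heat produces a larger entropy change at the lower temperature,
# illustrating the inverse relationship between temperature and entropy change.
```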

A campfire is an example of entropy. The solid wood burns and becomes ash, smoke, and gases, all of which spread energy outward more easily than the solid fuel.

Here are two examples that use the entropy formula to find a solution.

Combustion Reaction: {eq}C_{3}H_{8}(g) + 5O_{2}(g) \rightarrow 3CO_{2}(g) + 4H_{2}O(g) {/eq}

{eq}\Delta H = -2045kJ {/eq}

The reaction is **exothermic**, so the entropy of the surroundings increases. An exothermic reaction produces a positive {eq}\Delta S_{surr} {/eq} value because heat is released to the surroundings, increasing their energy and disorder. Although the step-by-step calculation is not shown here, the change in entropy of the surroundings for this reaction is {eq}\Delta S_{surr} = 6.86\ kJ/K {/eq}, or 6860 J/K.

Endothermic Reaction: {eq}H_{2}O(l)\rightarrow H_{2}O(g) {/eq}

{eq}\Delta H = +44\ kJ {/eq}

This reaction needs energy from the surroundings to proceed and reduces the entropy of the surroundings. A negative {eq}\Delta S_{surr} {/eq} value indicates an **endothermic** reaction, which absorbs heat from the surroundings. Although the step-by-step calculation is not shown here, the change in entropy of the surroundings is {eq}\Delta S_{surr} = -0.15\ kJ/K {/eq}, or -150 J/K.
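Both results can be reproduced with the relation {eq}\Delta S_{surr} = -\Delta H / T {/eq}. The sketch below assumes a standard temperature of 298 K, which the lesson does not state explicitly:

```python
# Entropy change of the surroundings at constant temperature:
# dS_surr = -dH / T. T = 298 K is an assumed standard temperature.

def delta_s_surroundings(delta_h_kj, temp_kelvin=298.0):
    """Return dS_surr in kJ/K for a reaction enthalpy dH given in kJ."""
    return -delta_h_kj / temp_kelvin

# Exothermic propane combustion, dH = -2045 kJ: surroundings gain entropy.
print(round(delta_s_surroundings(-2045.0), 2))  # 6.86 (kJ/K)

# Endothermic vaporization of water, dH = +44 kJ: surroundings lose entropy.
print(round(delta_s_surroundings(44.0), 2))  # -0.15 (kJ/K)
```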

Almost every chemical and biological process is governed by changes in entropy (and **free energy**). Here is a real-world example that should help you understand this concept.

In **bioenergetics**, many of the reactions that take place in living organisms require a source of free energy to drive them. The immediate source of this energy in heterotrophic organisms, which include animals, fungi, and most bacteria, is the sugar **glucose**. **Oxidation** of glucose to carbon dioxide and water is accompanied by a large negative free energy change. It can be shown in the following equation:

{eq}C_{6}H_{12}O_{6} + 6O_{2}\rightarrow 6CO_{2}+ 6H_{2}O \qquad \Delta G^{0} = -2880\ kJ\ mol^{-1} {/eq}

Of course, it would not do to simply burn the glucose in the normal way; the energy change would be wasted as heat, and rather too quickly for the well-being of the organism! Effective utilization of this free energy requires a means of capturing it from the glucose and then releasing it in small amounts when and where it is needed. This is accomplished by breaking down the glucose in a series of a dozen or more steps in which the energy liberated in each stage is captured by an energy-carrier molecule, the most important of which is **adenosine triphosphate**, known as ATP. At each step in the breakdown of glucose, an adenosine diphosphate (ADP) molecule reacts with inorganic phosphate and is converted into ATP: {eq}ADP + P_{i} \rightarrow ATP {/eq}.

Where does the glucose come from? Animals obtain their glucose from their food, especially cellulose and starches that, like glucose, have the empirical formula {eq}CH_{2}O {/eq}. Animals obtain this food by eating plants or other animals. Ultimately, all food comes from plants, most of which are able to make their own glucose from {eq}CO_{2} {/eq} and {eq}H_{2}O {/eq} through the process of **photosynthesis**.

The **First Law of Thermodynamics** states that energy can never be created or destroyed but can only change from one form to another. The First Law relates to the principles of entropy because, within an isolated overall system, the entropy of one part can decrease only if the entropy of another part increases by at least as much; the energy involved is neither created nor destroyed, only redistributed into different forms.

The Boltzmann equation, {eq}S = k_{B}\ln W {/eq}, where {eq}k_{B} {/eq} is the Boltzmann constant (also written as k), equal to {eq}1.38065 \times 10^{-23} {/eq} J/K, and W is the number of microstates, shows the relationship between entropy and the number of ways the atoms or molecules of a thermodynamic system can be arranged. It links the entropy of a system (a macroscopic quantity) to the number of equilibrium microscopic states the system allows. The system is held at constant (N, V, E), where N is the number of particles, V the volume, and E the total energy, or at constant (N, V, T). In this way, the Boltzmann equation links the macroscopic and microscopic worlds.

The number of microstates corresponding to a given macrostate is called the multiplicity g(E,V). Multiplicity becomes a fundamentally important quantity since the macrostate with the highest multiplicity is the most probable macrostate.
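The connection between multiplicity and probability can be illustrated with a small Python sketch. The 10-particle two-state system below is an illustrative choice, not from the lesson:

```python
# Multiplicity of macrostates for N two-state particles (like coin flips):
# g(N, n_up) = C(N, n_up). The macrostate with the highest multiplicity
# is the most probable macrostate.
from math import comb, log

N = 10  # number of particles (illustrative)
multiplicities = {n_up: comb(N, n_up) for n_up in range(N + 1)}

most_probable = max(multiplicities, key=multiplicities.get)
print(most_probable, multiplicities[most_probable])  # 5 252
# Half up / half down allows the most microstates, so it is most likely.

# Boltzmann's equation S = k_B ln W for that macrostate:
k_B = 1.380649e-23  # Boltzmann constant, J/K
S = k_B * log(multiplicities[most_probable])
print(S)  # a tiny number of J/K, as expected for only 10 particles
```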

For gases, there are two possible ways to evaluate the change in entropy. We begin with the First Law of Thermodynamics, dE = dQ - dW, where E is the internal energy and W is the work done by the system. Substituting the definition of work for a gas gives dQ = dE + p dV, where p is the pressure and V is the volume of the gas. During a thermodynamic process, the temperature T of an object changes as heat Q is applied or extracted, so a more precise definition of the entropy S is the differential form that accounts for this variation: dS = dQ/T. The change in entropy is then the heat transfer divided by the temperature, integrated over the process. With regard to pressure, an increase in pressure makes a negative contribution to the change in entropy of an ideal gas, but depending on the accompanying change in temperature, the actual entropy change of the system may be positive or negative.
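For an ideal gas with constant specific heat, integrating this definition gives {eq}\Delta S = n\, c_{p} \ln(T_{2}/T_{1}) - nR \ln(p_{2}/p_{1}) {/eq}. A minimal sketch of this formula; the function name and numerical values are illustrative:

```python
# Entropy change of an ideal gas between states (T1, p1) and (T2, p2),
# assuming constant molar specific heat c_p:
#   dS = n * c_p * ln(T2/T1) - n * R * ln(p2/p1)
from math import log

R = 8.314  # universal gas constant, J/(mol K)

def delta_s_ideal_gas(n_mol, cp, t1, t2, p1, p2):
    """Return the entropy change in J/K; cp is in J/(mol K)."""
    return n_mol * cp * log(t2 / t1) - n_mol * R * log(p2 / p1)

# Isothermal compression of 1 mol: doubling the pressure at constant
# temperature makes a purely negative contribution to the entropy change.
print(round(delta_s_ideal_gas(1.0, 29.1, 300.0, 300.0, 1.0e5, 2.0e5), 2))
# about -5.76 J/K
```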

The change in entropy of individual gases can be calculated by considering that each individual gas exists alone and is expanded from the initial state to the temperature and volume of the mixture, the pressure at the final state being the partial pressure of the constituent in the mixture.

The Law of Entropy can affect other laws and concepts. Below are some examples of how it does just that.

- Entropy and time. Entropy is one of the few concepts that provide evidence for the existence of time. The Arrow of Time is the name given to the idea that time is asymmetrical and flows in only one direction: forward. It is the irreversible increase of entropy that gives time this direction.
- Entropy in business and economics. Unfortunately, most businesses fail within their first 18 months. How does this relate to entropy? Entropy is fundamentally a probabilistic idea: for every possible useful, ordered state of molecules, there are many more possible disordered states. Just as energy tends toward a less useful, more disordered state, so do businesses and organizations. Rearranging the molecules, or a business's systems and people, into an ordered state requires an injection of outside energy.
- Entropy and the Laws of Thermodynamics. As we have learned, entropy has a relationship with both the 1st and 2nd Laws of Thermodynamics. It also has a relationship with the **3rd Law of Thermodynamics**, which states that a system's entropy approaches a constant value as the temperature approaches absolute zero. As we have also learned, entropy is a measure of the disorder or randomness in a closed system. It is directly related to the number of microstates (distinct microscopic configurations the system can occupy) accessible to the system: the greater the number of microstates the closed system can occupy, the greater its entropy.

The **Law of Conservation of Energy** states that energy can neither be created nor destroyed; it may only be transformed from one form to another. If you take all forms of energy into account, the total energy of an isolated system, one that exchanges nothing with its surroundings, always remains constant.

Another way to look at this is in terms of entropy. An isolated system has no external environment to exchange with at all, whereas a closed system can exchange energy, but not matter, with its external environment.

**Work** is defined as a force causing the movement or displacement of an object. The **principle of maximum entropy** states that the probability distribution which best represents the current state of knowledge about a system is the one with the largest entropy. An example of work and entropy together is the thermodynamic conversion of thermal energy into work in an engine. In general, this cyclic process involves an exchange of heat with two reservoirs, taking heat in at a high temperature and rejecting heat at a low temperature, resulting in net positive work from the process.
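The two-reservoir cycle described above can be sketched numerically. This sketch assumes an ideal reversible (Carnot) cycle; the function name and all numerical values are illustrative:

```python
# A cyclic engine exchanging heat with two reservoirs: heat Q_h flows in
# at temperature T_h, heat Q_c flows out at T_c, and the net work is
# W = Q_h - Q_c. Assumes a reversible (Carnot) cycle.

def engine_cycle(q_h, t_h, t_c):
    """Return (net work, total reservoir entropy change) in (J, J/K)."""
    q_c = q_h * t_c / t_h              # reversible cycle: Q_c/T_c = Q_h/T_h
    work = q_h - q_c                   # net positive work from the process
    ds_total = -q_h / t_h + q_c / t_c  # entropy bookkeeping for the reservoirs
    return work, ds_total

work, ds = engine_cycle(q_h=1000.0, t_h=500.0, t_c=300.0)
print(work)  # 400.0 J of net work extracted
print(ds)    # 0.0 -- a reversible cycle leaves total entropy unchanged
```

Any real (irreversible) engine extracts less work than this, and the total entropy of the reservoirs then increases rather than staying constant.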

**Entropy** is an important concept in the world of science. Entropy is closely tied to the **2nd Law of Thermodynamics**, which says that when energy changes from one form to another, or matter moves freely, entropy (disorder) in a closed system increases. Entropy also has a relationship with the **1st and 3rd Laws of Thermodynamics**. Entropy is observed in combustion reactions and in endothermic and exothermic reactions. It was the observation of combustion reactions that led **Rudolf Clausius** and others to develop the concept of entropy.

The concept and formula associated with entropy can be seen and used in the world of science and in everyday life.


## Additional Activities

50 kg of water at 20 °C is mixed with 50 kg of water at 24 °C. Estimate the change in entropy without using calculus.

Since we start with equal amounts of water, the final temperature will be 22 °C. As the hot water cools from 24 °C to 22 °C, an amount of heat Q = mcΔT = (50 kg)(1 kcal/kg·K)(2 °C) = 100 kcal flows out of the hot water, and the same heat Q flows into the cool water as it warms from 20 °C to 22 °C. The total change in entropy is the sum of the change in entropy of the hot water and the change in entropy of the cold water. We estimate each change as ΔS = Q/T_av, where T_av is 23 °C (296 K) for the hot water and 21 °C (294 K) for the cold water. The change in entropy of the hot water is -(100/296) kcal/K = -0.338 kcal/K; the sign is negative because heat flows out of the hot water. The change in entropy of the cold water is (100/294) kcal/K = +0.340 kcal/K. The total change in entropy is (-0.338 + 0.340) kcal/K = +0.002 kcal/K. The total change in entropy is positive.
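The estimate above can be checked numerically. This sketch mirrors the arithmetic step by step:

```python
# Check of the mixing estimate: 50 kg of water at 24 C mixed with 50 kg
# at 20 C, using the approximation dS = Q / T_avg for each portion.

m, c = 50.0, 1.0               # mass in kg, specific heat in kcal/(kg K)
t_hot, t_cold = 297.0, 293.0   # 24 C and 20 C in kelvin
t_final = (t_hot + t_cold) / 2 # equal masses, so 295 K (22 C)

q = m * c * (t_hot - t_final)  # 100 kcal flows from hot to cold

ds_hot = -q / ((t_hot + t_final) / 2)   # average temperature 296 K
ds_cold = q / ((t_cold + t_final) / 2)  # average temperature 294 K

print(round(ds_hot, 3))            # -0.338 kcal/K
print(round(ds_cold, 3))           # 0.34 kcal/K
print(round(ds_hot + ds_cold, 3))  # 0.002 kcal/K, net positive
```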
