
# What is entropy?

Entropy is a thermodynamic measure of ‘disorder’, which must always increase — but how can we measure it?

## Reversibility

A reversible process is one that can occur in infinitesimal steps – that is, steps so small that, in the limit, the change to the system at each step is negligible.

If each change is negligible, it can easily be reversed and the system restored. Any energy exchanged in the forward process is perfectly restored by the backward process.

Examples in gases include most expansions and compressions. Examples in chemical reactions include anything happening at an equilibrium position. For example, a solid at its melting/freezing point is in equilibrium with its liquid.

At the freezing/melting point, solid and liquid forms can happily coexist, and moving between them is a reversible process (melting is only irreversible above the melting point, freezing is irreversible below it).

The heat change for a reversible process has an entropy associated with it:

\[\Delta S=\frac{q_\text{rev}}{T}\]

The entropy change is the heat exchanged in the reversible process divided by the temperature. This gives entropy units of kJ K⁻¹ (or, for molar quantities, kJ K⁻¹ mol⁻¹).
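As a worked example of this formula, consider melting ice at its melting point – a reversible process, where the reversible heat is the enthalpy of fusion (approximately 6.01 kJ mol⁻¹):

```python
# Entropy change for melting ice at its melting point (a reversible process).
# The heat exchanged reversibly is the enthalpy of fusion.
q_rev = 6.01   # kJ mol⁻¹, enthalpy of fusion of ice (approximate literature value)
T = 273.15     # K, melting point of ice

delta_S = q_rev / T  # kJ K⁻¹ mol⁻¹
print(f"ΔS ≈ {delta_S * 1000:.1f} J K⁻¹ mol⁻¹")  # ≈ 22.0 J K⁻¹ mol⁻¹
```

Melting absorbs heat, so the entropy of the substance increases – consistent with the liquid being more ‘disordered’ than the solid.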

## Absolute values of entropy

Unlike enthalpy and internal energy, absolute values of entropy can be known. The entropy of a perfect crystal is zero at absolute zero. This is the Third Law of Thermodynamics, and is a starting point for defining absolute values of entropy.

A second approach is to define entropy statistically. Here, entropy is a function of the number of ways of arranging the particles in the system. If there is only one state and one particle, there is only one arrangement.

If there are two states and two particles (one particle per state), there are two arrangements – the particles can swap places. These arrangements are known as microstates, and entropy is a function of the number of them.

Entropy is defined as the natural log of the number of microstates, multiplied by the Boltzmann constant.

\[S=k_B\ln\Omega\]

You should remember that the Boltzmann constant is a ‘per molecule’ version of the gas constant, which is ‘per mole’.
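A minimal numerical sketch of the Boltzmann definition (the Ω = 2^N count below assumes N independent, distinguishable two-state particles – an illustrative model, not part of the original text):

```python
import math

k_B = 1.380649e-23  # J K⁻¹, Boltzmann constant (exact SI value)

def entropy(omega):
    """Boltzmann entropy S = k_B ln(Ω) for Ω microstates."""
    return k_B * math.log(omega)

# A single microstate gives zero entropy, consistent with the Third Law:
print(entropy(1))  # 0.0

# N independent two-state particles have Ω = 2**N microstates,
# so S = N * k_B * ln(2): entropy grows linearly with system size.
N = 100
print(entropy(2**N))
```

Because entropy depends on ln Ω rather than Ω itself, combining two systems multiplies the microstate counts but adds the entropies.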

There is a lot more detail on this subject, but this page is here to remind you that entropy is more than just disorder.
