Entropy

Entropy is a measure of the number of microscopic configurations (microstates) consistent with a system's macroscopic state. When entropy increases, more information is needed to describe the exact state of the system, and work must be done to return the system to its original, more ordered configuration. Entropy is therefore also a key quantity in information theory, where it quantifies the uncertainty involved in predicting the value of a random variable.

The Second Law of Thermodynamics is the observation that, over time, differences in temperature, pressure, and chemical potential tend to even out in a physical system that is isolated from its surroundings; entropy is a measure of how far this process has progressed. The entropy of an isolated system that is not in equilibrium tends to increase over time, approaching a maximum value at equilibrium. The second law is thus one of the few physical laws, if not the only one, that distinguishes a direction of time.
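As a concrete illustration of the information-theoretic definition, the Shannon entropy of a discrete random variable X with outcome probabilities p(x) is H(X) = -Σ p(x) log2 p(x), measured in bits. The short Python sketch below computes this for a given distribution (the function name and the example distributions are illustrative choices, not something from the original text):

    import math

    def shannon_entropy(probs):
        # H(X) = -sum of p * log2(p); outcomes with p == 0 contribute nothing.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin has 1 bit of entropy; a biased coin has less,
    # because its outcome is easier to predict.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    print(shannon_entropy([0.9, 0.1]))   # ~0.469

The more uniform the distribution, the higher the entropy, matching the intuition above that higher entropy means more information is required to pin down the state.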

2016-10-14T15:23:24+00:00