Definition of entropy in US English:

  • 1 Physics
    A thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system.

    • ‘The enthalpy, entropy, and free energy changes in the opening reaction of each basepair are determined from the temperature dependence of the exchange rates.’
    • ‘Then it recovers its investment by letting the sodium back in, so increasing entropy, and converting that change in entropy to free energy used to turn the rotor.’
    • ‘Despite the large increase in enthalpy and entropy, the free energy difference between the closed and open conformations is relatively small.’
    • ‘In Chapter 3 we discussed how the thermodynamic arrow of entropy increase is a reflection of the relative probabilities of various states.’
    • ‘It will occur because according to the second law of thermodynamics, the amount of entropy in a system must always increase.’
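    The thermodynamic sense above can be made concrete with a small sketch. For heat absorbed reversibly at constant temperature, the entropy change is ΔS = Q/T; the function name and the textbook figures below (latent heat of fusion of ice, its melting point) are illustrative assumptions, not part of the dictionary entry.

    ```python
    def entropy_change(heat_joules: float, temperature_kelvin: float) -> float:
        """Entropy change (J/K) for heat absorbed reversibly at constant T: dS = Q / T."""
        return heat_joules / temperature_kelvin

    # Melting 1 g of ice at 273.15 K absorbs roughly 334 J (standard textbook value),
    # so the ice gains about 1.22 J/K of entropy -- the "unavailability" of that
    # thermal energy for mechanical work increases.
    delta_s = entropy_change(334.0, 273.15)
    print(round(delta_s, 3))
    ```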
  • 2 Lack of order or predictability; gradual decline into disorder.

    ‘a marketplace where entropy reigns supreme’
    • ‘Patrick's final resting place wasn't quite as romantic as she'd envisioned it, but after a day in this town where entropy seemed to reign unchecked, she was unsurprised.’
    • ‘Designers, even more than artists, are battlers against entropy.’
    • ‘Lyrically and musically, the album's tone of entropy does more to underscore the miasma of dread most people feel under the current political conditions than it does to rebel significantly against it.’
    • ‘People have a natural tendency to rebel against entropy to return order to their environments.’
    • ‘He was unable to arrest the gradual entropy that had set in.’
    disorder, disarray, disorganization, disorderliness, untidiness, chaos, mayhem, bedlam, pandemonium, madness, havoc, turmoil, tumult, commotion, disruption, upheaval, furore, frenzy, uproar, babel, hurly-burly, maelstrom, muddle, mess, shambles
  • 3 (in information theory) A logarithmic measure of the rate of transfer of information in a particular message or language.

    • ‘Information, entropy, and computation become metaphors for us at a much broader level.’
    • ‘Changes in entropy or information content based on new constraints are calculated with respect to the appropriate reference state which can, in principle, be related to other reference states.’
    • ‘These functions range from simple bookkeeping tasks to serious number-crunching algorithms such as deconvolution, maximum entropy, Fourier transforms and more.’
    • ‘One can measure entropy on a scale from zero to one - zero indicating a completely linear system that loses no work and behaves predictably.’
    • ‘But no serious scientist would expect that such a thing were possible, for the simple reason that it would be a violation of the fundamental principles of entropy / information theory.’
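    The information-theoretic sense is usually computed as Shannon entropy, H = −Σ p·log₂(p), giving the average information per symbol in bits. A minimal sketch (the function name and the toy messages are illustrative assumptions):

    ```python
    import math
    from collections import Counter

    def shannon_entropy(message: str) -> float:
        """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
        counts = Counter(message)
        n = len(message)
        # Sum each symbol's contribution; a fully predictable message yields 0 bits.
        return sum(-(c / n) * math.log2(c / n) for c in counts.values())

    print(shannon_entropy("aaaa"))  # a fully predictable message: 0 bits/symbol
    print(shannon_entropy("abab"))  # two equally likely symbols: 1 bit/symbol
    ```

    A message in which every symbol is equally likely maximizes entropy; redundancy (predictability) lowers it, which is why the same word covers both "information rate" here and "disorder" in the other senses.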


Mid 19th century: from en- ‘inside’ + Greek tropē ‘transformation’.