Definition of entropy in English:

entropy

noun

  • 1 Physics
    A thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system.

    • ‘It will occur because according to the second law of thermodynamics, the amount of entropy in a system must always increase.’
    • ‘Then it recovers its investment by letting the sodium back in, so increasing entropy, and converting that change in entropy to free energy used to turn the rotor.’
    • ‘Despite the large increase in enthalpy and entropy, the free energy difference between the closed and open conformations is relatively small.’
    • ‘In Chapter 3 we discussed how the thermodynamic arrow of entropy increase is a reflection of the relative probabilities of various states.’
    • ‘The enthalpy, entropy, and free energy changes in the opening reaction of each basepair are determined from the temperature dependence of the exchange rates.’
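As a supplementary note on sense 1: the thermodynamic quantity is conventionally written with standard symbols (S for entropy, Q for heat, T for absolute temperature); the relations below are a brief sketch, not part of the dictionary definition itself.

```latex
% Clausius definition: entropy change in a reversible process,
% where \delta Q_{\mathrm{rev}} is the heat transferred reversibly
dS = \frac{\delta Q_{\mathrm{rev}}}{T}

% Boltzmann's statistical interpretation, linking entropy to disorder:
% W is the number of microstates, k_B is Boltzmann's constant
S = k_B \ln W
```

The second relation underlies the "degree of disorder or randomness" reading given in the definition.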
  • 2 Lack of order or predictability; gradual decline into disorder.

    ‘a marketplace where entropy reigns supreme’
    • ‘Patrick's final resting place wasn't quite as romantic as she'd envisioned it, but after a day in this town where entropy seemed to reign unchecked, she was unsurprised.’
    • ‘Lyrically and musically, the album's tone of entropy does more to underscore the miasma of dread most people feel under the current political conditions than it does to rebel significantly against it.’
    • ‘People have a natural tendency to rebel against entropy to return order to their environments.’
    • ‘Designers, even more than artists, are battlers against entropy.’
    • ‘He was unable to arrest the gradual entropy that had set in.’
    disorder, disarray, disorganization, disorderliness, untidiness, chaos, mayhem, bedlam, pandemonium, madness, havoc, turmoil, tumult, commotion, disruption, upheaval, furore, frenzy, uproar, babel, hurly-burly, maelstrom, muddle, mess, shambles
  • 3 (in information theory) A logarithmic measure of the rate of transfer of information in a particular message or language.

    • ‘But no serious scientist would expect that such a thing were possible, for the simple reason that it would be a violation of the fundamental principles of entropy / information theory.’
    • ‘Changes in entropy or information content based on new constraints are calculated with respect to the appropriate reference state which can, in principle, be related to other reference states.’
    • ‘These functions range from simple bookkeeping tasks to serious number-crunching algorithms such as deconvolution, maximum entropy, Fourier transforms and more.’
    • ‘Information, entropy, and computation become metaphors for us at a much broader level.’
    • ‘One can measure entropy on a scale from zero to one - zero indicating a completely linear system that loses no work and behaves predictably.’
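As a supplementary note on sense 3: the "logarithmic measure" is usually Shannon entropy, H = −Σ pᵢ log₂ pᵢ, measured in bits. A minimal sketch in Python (the example distributions are illustrative, not from the entry):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally unpredictable: one bit per toss.
print(shannon_entropy([0.5, 0.5]))  # → 1.0

# A certain outcome carries no information (entropy of 0 bits).
print(shannon_entropy([1.0]))
```

The logarithm makes the measure additive: two independent fair coins carry 2 bits, matching the intuition that entropy quantifies unpredictability.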

Origin

Mid 19th century: from en- ‘inside’ + Greek tropē ‘transformation’.

Pronunciation:

entropy

/ˈentrəpē/