entropy

English

Etymology

First attested in 1867 as a translation of German Entropie, coined in 1865 by Rudolf Clausius by analogy with Energie (energy), replacing the root of Ancient Greek ἔργον (érgon, work) with Ancient Greek τροπή (tropḗ, transformation).

Pronunciation

  • IPA(key): /ˈɛntɹəpi/

Noun

entropy (countable and uncountable, plural entropies)

  1. A measure of the disorder present in a system.
    1. (Boltzmann definition) A measure of the disorder, directly proportional to the natural logarithm of the number of microstates that yield an equivalent thermodynamic macrostate (see the formulas after this list).
    2. (information theory) Shannon entropy
  2. (thermodynamics, countable) A measure of the amount of energy in a physical system that cannot be used to do work.
  3. The capacity factor for thermal energy that is hidden with respect to temperature.
  4. The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.
  5. (statistics, information theory, countable) A measure of the amount of information and noise present in a signal (see the sketch after this list).
  6. (uncountable) The tendency of a system that is left to itself to descend into chaos.
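
In standard textbook notation (a conventional formalization, not part of the original entry), the Boltzmann sense (1.1) and the thermodynamic sense (2) are usually written as

    S = k_B \ln W
    dS = \delta Q_{\mathrm{rev}} / T

where W is the number of microstates compatible with the observed macrostate, k_B is the Boltzmann constant, and \delta Q_{\mathrm{rev}} is the heat exchanged reversibly at absolute temperature T.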
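
For the information-theoretic senses (1.2 and 5), the following is a minimal Python sketch of Shannon entropy; the function name shannon_entropy and the example distributions are illustrative assumptions, not part of the entry:

    import math

    def shannon_entropy(probs):
        # H = -sum(p * log2(p)) in bits; zero-probability outcomes contribute nothing.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits

A uniform distribution maximizes this quantity, which agrees with the "disorder" reading of sense 1: the less predictable a signal, the higher its entropy.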

References

  1. ^ Clinton D. Stoner (2001 November 7) “Inquiries into the Nature of Free Energy and Entropy in Respect to Biochemical Thermodynamics”, in arXiv, →DOI, arXiv: physics/0004055
  2. ^ Frank Lambert (2006 February) “A Student’s Approach to the Second Law and Entropy”, in Entropy Site, archived from the original on 2006-07-02
