meaning of entropy

1. A certain property of a body, expressed as a measurable quantity, such that when there is no communication of heat the quantity remains constant, but when heat enters or leaves the body the quantity increases or diminishes. If a small amount, h, of heat enters the body when its temperature is t on the thermodynamic scale, the entropy of the body is increased by h / t. The entropy is regarded as measured from some standard temperature and pressure. Sometimes called the thermodynamic function.
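The relation above (entropy increases by h / t when heat h enters at temperature t) can be sketched as a small calculation; the numbers below are hypothetical values chosen only for illustration:

```python
def entropy_change(heat, temperature):
    """Entropy change dS = dQ / T for a small amount of heat dQ (joules)
    entering a body at absolute temperature T (kelvin)."""
    return heat / temperature

# Hypothetical example: 100 J of heat entering a body held at 300 K.
print(entropy_change(100.0, 300.0))  # about 0.333 J/K
```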
2.
entropy A measure of the disorder of a system. Systems tend to go from a state of order (low entropy) to a state of maximum disorder (high entropy). The entropy of a system is related to the amount of information it contains: a highly ordered system can be described using fewer bits of information than a disordered one. For example, a string containing one million "0"s can be described using run-length encoding as ("0", 1000000), whereas a string of random symbols (e.g. bits or characters) will be much harder, if not impossible, to compress in this way. Shannon's formula gives the entropy H(M) of a message M in bits: H(M) = -log2 p(M), where p(M) is the probability of message M.
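Both ideas in this definition can be illustrated directly; the function names below are chosen for this sketch and are not part of any standard library:

```python
import math

def information_content(p):
    """Self-information of a message with probability p, in bits:
    H(M) = -log2 p(M), per Shannon's formula."""
    return -math.log2(p)

def run_length_encode(s):
    """Compress a string into (symbol, count) pairs, as in the
    million-"0"s example above."""
    runs = []
    i = 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1
        runs.append((s[i], j - i))
        i = j
    return runs

# A message with probability 1/2 (e.g. a fair coin flip) carries 1 bit.
print(information_content(0.5))           # 1.0
# A highly ordered string collapses to a single (symbol, count) pair.
print(run_length_encode("0" * 1000000))   # [("0", 1000000)]
```

A random string, by contrast, produces roughly one run per symbol, so the encoded form is no shorter than the original.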
3.
thermodynamics a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work; "entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity"

