English
Noun
information entropy (uncountable)
- (information theory) A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable (usually in units such as bits); the amount of information (measured in, say, bits) contained per average instance of a character in a stream of characters.
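The sense above can be illustrated with a short sketch (a hypothetical Python function, not part of this entry) that computes the entropy of a character stream: the average number of bits of information carried per character.

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Average information content, in bits, per character of `text`."""
    counts = Counter(text)
    total = len(text)
    # Sum of -p * log2(p) over the observed character frequencies.
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Two equally likely symbols carry exactly one bit each:
print(shannon_entropy("HTHTHTHT"))  # 1.0
```

A stream drawn from more symbols, or from a more uneven distribution, yields correspondingly higher or lower entropy.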
Translations
- Chinese:
- Mandarin: 信息熵 (xìnxī shāng), 資訊熵/资讯熵 (zīxùn shāng)