joint entropy


English


Noun

joint entropy (countable and uncountable, plural joint entropies)

  1. (information theory) The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.
    If random variables X and Y are mutually independent, then their joint entropy H(X,Y) is just the sum of their component entropies, H(X) + H(Y). If they are not mutually independent, then their joint entropy will be H(X) + H(Y) - I(X;Y), where I(X;Y) is the mutual information of X and Y (see the sketch below).
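
A minimal Python sketch, not part of the dictionary entry and using an arbitrary made-up joint probability table, that numerically checks the identity H(X,Y) = H(X) + H(Y) - I(X;Y) from the usage note above:

import math

# Hypothetical joint probability table p(x, y) for two binary random variables.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y).
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# Joint entropy computed directly from the joint distribution.
H_xy = entropy(p_xy)

# Mutual information computed from its own definition:
# I(X;Y) = sum over x, y of p(x, y) * log2(p(x, y) / (p(x) * p(y))).
I_xy = sum(p * math.log2(p / (p_x[x] * p_y[y]))
           for (x, y), p in p_xy.items() if p > 0)

# The two sides of the identity agree up to floating-point rounding;
# for independent X and Y, I(X;Y) = 0 and the joint entropy is just H(X) + H(Y).
print(f"H(X,Y)               = {H_xy:.6f} bits")
print(f"H(X) + H(Y) - I(X;Y) = {entropy(p_x) + entropy(p_y) - I_xy:.6f} bits")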