Perplexity
In information theory, perplexity is a measure of uncertainty in the value of a sample from a discrete probability distribution. The perplexity of a fair coin toss is 2, and that of a fair six-sided die roll is 6; more generally, for a probability distribution with exactly N outcomes, each having probability exactly 1/N, the perplexity is simply N. Perplexity can also be applied to unfair dice and to other non-uniform probability distributions. It is defined as the exponentiation of the information entropy (using the same base for the exponentiation as for the logarithm in the entropy). The larger the perplexity, the less likely it is that an observer can guess the value which will be drawn from the distribution.
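The definition above can be illustrated with a short sketch: compute the Shannon entropy of a distribution in bits, then exponentiate with base 2. The function name `perplexity` is chosen here for illustration.

```python
import math

def perplexity(probs):
    # Perplexity = 2 ** H(p), where H is the Shannon entropy in bits.
    # Outcomes with zero probability contribute nothing to the entropy.
    entropy = -sum(p * math.log2(p) for p in probs if p > 0)
    return 2 ** entropy

# Fair coin: two equally likely outcomes -> perplexity 2
print(perplexity([0.5, 0.5]))   # 2.0
# Fair six-sided die: six equally likely outcomes -> perplexity 6
print(perplexity([1/6] * 6))    # 6.0 (up to floating-point rounding)
# A biased coin has lower perplexity: the outcome is easier to guess
print(perplexity([0.9, 0.1]))
```

For the uniform cases this reproduces the values stated above (2 for the coin, 6 for the die), while the biased coin yields a perplexity below 2, reflecting its reduced uncertainty.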
Perplexity was originally introduced in 1977 in the context of speech recognition by Frederick Jelinek, Robert Leroy Mercer, Lalit R. Bahl, and James K. Baker.