Entropy Calculator
In information theory, entropy is a measure of the uncertainty associated
with a random variable. The term usually refers to the Shannon entropy,
which quantifies the expected value of the information contained in a
message, usually in units such as bits. Here, a 'message' means a specific
realization of the random variable.
Shannon denoted the entropy H of a discrete random variable X with possible
values {x1, ..., xn} as

H(X) = E(I(X)),

where E is the expected value operator and I is the information content of X.
Example usage
-------------
in = [.25 .25 .25 .25];
b = 'bit';
Entropy = info_entropy(in, b)
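For reference, the same computation can be sketched outside MATLAB. The snippet below is a hypothetical Python equivalent of info_entropy, not the submission's actual code: it evaluates H(X) = -sum(p * log(p)) over the probability vector, skips zero-probability entries (as the 1.1.0.0 release notes say the MATLAB function does), and the set of supported unit names beyond 'bit' is an assumption.

```python
import math

def shannon_entropy(probs, unit='bit'):
    """Shannon entropy of a discrete distribution given as a list of
    probabilities. Unit names other than 'bit' are assumed here."""
    base = {'bit': 2, 'nat': math.e, 'dit': 10}[unit]
    # Zero-probability entries contribute 0 to the sum, so skip them
    # to avoid evaluating log(0).
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Matches the example above: a uniform distribution over four outcomes
# has entropy log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25], 'bit'))  # 2.0
```

A quick sanity check: a degenerate distribution (one outcome with probability 1) has zero entropy, since the message carries no surprise.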
Cite As
Vallabha Hampiholi (2024). Entropy Calculator (https://www.mathworks.com/matlabcentral/fileexchange/35611-entropy-calculator), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Platform Compatibility: Windows, macOS, Linux
Categories
- Wireless Communications > Communications Toolbox > PHY Components > Error Detection and Correction
Version | Published | Release Notes
---|---|---
1.1.0.0 | | It now handles 0 as an entry in the probability array.
1.0.0.0 | |