Entropy_vs_Extension
Simulation of entropy versus the theoretical and simulated average code length.

This theorem provides another justification for the definition of entropy rate: it is the expected number of bits per symbol required to describe the process.
Finally, we ask what happens to the expected description length if the
code is designed for the wrong distribution. For example, the wrong distribution
may be the best estimate that we can make of the unknown true
distribution. We consider the Shannon code assignment l(x) = ⌈log(1/q(x))⌉ designed for the probability mass function q(x). Suppose that the true probability mass function is p(x). Thus, we will not achieve expected length L ≈ H(p) = −Σ p(x) log p(x). We now show that the increase
in expected description length due to the incorrect distribution is the relative
entropy D(p||q). Thus, D(p||q) has a concrete interpretation as the
increase in descriptive complexity due to incorrect information.
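MATLAB is the natural language for this File Exchange entry, but as a stand-alone numerical check the sketch below uses Python. The distributions p and q are arbitrary illustrative choices, not taken from the submission; the sketch verifies the bound H(p) + D(p||q) ≤ E_p[l] < H(p) + D(p||q) + 1 for a Shannon code designed for the wrong distribution q.

```python
import math

# Hypothetical 4-symbol alphabet (illustrative values only).
p = [0.5, 0.25, 0.125, 0.125]   # true distribution p(x)
q = [0.25, 0.25, 0.25, 0.25]    # assumed (wrong) distribution q(x)

# Entropy H(p) = -sum p(x) log2 p(x)
H_p = -sum(px * math.log2(px) for px in p)

# Relative entropy D(p||q) = sum p(x) log2(p(x)/q(x))
D_pq = sum(px * math.log2(px / qx) for px, qx in zip(p, q))

# Shannon code lengths designed for q: l(x) = ceil(log2(1/q(x)))
l = [math.ceil(math.log2(1 / qx)) for qx in q]

# Expected description length under the true distribution p
E_len = sum(px * lx for px, lx in zip(p, l))

print(f"H(p)            = {H_p:.4f} bits")
print(f"D(p||q)         = {D_pq:.4f} bits")
print(f"E_p[l]          = {E_len:.4f} bits")
# The penalty for designing the code for q instead of p is D(p||q):
# H(p) + D(p||q) <= E_p[l] < H(p) + D(p||q) + 1
```

With these values the expected length is exactly H(p) + D(p||q) = 2 bits, a quarter-bit worse per symbol than the H(p) = 1.75 bits achievable by a code designed for the true distribution p.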
Cite As
Abdelrahman Marconi (2024). Entropy_vs_Extension (https://www.mathworks.com/matlabcentral/fileexchange/45986-entropy_vs_extension), MATLAB Central File Exchange.
MATLAB Release Compatibility
Platform Compatibility: Windows, macOS, Linux
Version | Published | Release Notes
---|---|---
1.0.0.0 | |