Entropy_vs_Extension

Simulation of Entropy versus the theoretical and simulated average code length
202 Downloads
Updated 24 Mar 2014


This theorem provides another justification for the definition of entropy
rate: it is the expected number of bits per symbol required to describe
the process.
Finally, we ask what happens to the expected description length if the
code is designed for the wrong distribution. For example, the wrong
distribution may be the best estimate that we can make of the unknown
true distribution. We consider the Shannon code assignment
l(x) = ⌈log(1/q(x))⌉ designed for the probability mass function q(x).
Suppose that the true probability mass function is p(x). Then we will
not achieve expected length L ≈ H(p) = −Σₓ p(x) log p(x). We now show
that the increase in expected description length due to the incorrect
distribution is the relative entropy D(p||q). Thus, D(p||q) has a
concrete interpretation as the increase in descriptive complexity due to
incorrect information.
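The claim above can be checked numerically: under the true distribution p, the Shannon code designed for q has expected length between H(p) + D(p||q) and H(p) + D(p||q) + 1. A minimal sketch (in Python rather than MATLAB, purely for illustration; the two distributions below are made-up example values, not part of the submission):

```python
import math

# Hypothetical distributions over a 4-symbol alphabet (illustrative values)
p = [0.5, 0.25, 0.125, 0.125]   # true distribution p(x)
q = [0.25, 0.25, 0.25, 0.25]    # assumed (wrong) distribution q(x)

# Shannon code lengths designed for q: l(x) = ceil(log2(1/q(x)))
lengths = [math.ceil(math.log2(1.0 / qx)) for qx in q]

# Entropy H(p) = -sum p(x) log2 p(x)
H_p = -sum(px * math.log2(px) for px in p)

# Relative entropy D(p||q) = sum p(x) log2(p(x)/q(x))
D_pq = sum(px * math.log2(px / qx) for px, qx in zip(p, q))

# Expected description length under the true distribution p
E_len = sum(px * l for px, l in zip(p, lengths))

# Wrong-code theorem: H(p) + D(p||q) <= E[l(X)] < H(p) + D(p||q) + 1
assert H_p + D_pq <= E_len < H_p + D_pq + 1
print(H_p, D_pq, E_len)
```

With these values the penalty D(p||q) = 0.25 bits is exactly the extra cost of coding for the uniform q instead of the true p.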

Cite As

Abdelrahman Marconi (2024). Entropy_vs_Extension (https://www.mathworks.com/matlabcentral/fileexchange/45986-entropy_vs_extension), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2013b
Compatible with any release
Platform Compatibility
Windows macOS Linux
Categories
Particle & Nuclear Physics

Version History
1.0.0.0