
EntropyFunctions


The entropy functions provide the low-level tools for calculating Shannon entropy. They rely on TimeSeries.

The entropy tools exist in the ENT:: namespace. To call them, use:

ENT::Entropy(X);
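
For example, a minimal sketch of computing the entropy of a single time series (the header names and the brace-initialization of TS::intTimeSeries below are assumptions; see the TimeSeries page for the actual interface):

```cpp
#include "TimeSeries.h" // assumed header providing TS::intTimeSeries
#include "Entropy.h"    // assumed header providing the ENT:: functions
#include <iostream>

int main() {
    // One variable observed over eight time steps (illustrative values).
    TS::intTimeSeries X = { {0}, {1}, {0}, {1}, {0}, {1}, {0}, {1} };
    std::cout << "H(X) = " << ENT::Entropy(X) << std::endl;
    return 0;
}
```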



Entropy Functions

double Entropy(const TS::intTimeSeries& X);

Returns the entropy of X.

double MutualEntropy(const TS::intTimeSeries& X, const TS::intTimeSeries& Y);

Returns the mutual entropy of X and Y. Note that Mutual Entropy, Shared Entropy, and Information are all names for the same quantity.
Entropy(X) + Entropy(Y) - Entropy(TS::Join(X, Y))
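
As a sanity check, the identity above can be verified directly. A sketch, assuming the TS::intTimeSeries construction shown in the introduction (the literal values are illustrative):

```cpp
// Verify MutualEntropy against its expansion in Entropy and TS::Join.
TS::intTimeSeries X = { {0}, {0}, {1}, {1} };
TS::intTimeSeries Y = { {0}, {1}, {0}, {1} };

double mi       = ENT::MutualEntropy(X, Y);
double expanded = ENT::Entropy(X) + ENT::Entropy(Y)
                  - ENT::Entropy(TS::Join(X, Y));
// mi and expanded should agree up to floating-point rounding.
```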

double ConditionalEntropy(const TS::intTimeSeries& X, const TS::intTimeSeries& Y);

Returns the conditional entropy of X given Y; that is, the entropy of X after the entropy explained by Y is removed.
Entropy(X) - MutualEntropy(X, Y)

double ConditionalMutualEntropy(const TS::intTimeSeries& X, const TS::intTimeSeries& Y, const TS::intTimeSeries& Z);

Returns the conditional mutual entropy; that is, the information shared by X and Y after the entropy explained by Z is removed.
Entropy(TS::Join(X, Z)) + Entropy(TS::Join(Y, Z)) - (Entropy(Z) + Entropy(TS::Join({ X, Y, Z })))
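
The last two functions follow the same pattern. A sketch of both, again assuming the TS::intTimeSeries construction shown above (the literal values are illustrative):

```cpp
// ConditionalEntropy and ConditionalMutualEntropy expressed through
// Entropy, MutualEntropy, and TS::Join, mirroring the formulas above.
TS::intTimeSeries X = { {0}, {0}, {1}, {1} };
TS::intTimeSeries Y = { {0}, {1}, {0}, {1} };
TS::intTimeSeries Z = { {0}, {1}, {1}, {0} };

double hXgivenY = ENT::ConditionalEntropy(X, Y);
// equals ENT::Entropy(X) - ENT::MutualEntropy(X, Y)

double cmi = ENT::ConditionalMutualEntropy(X, Y, Z);
// equals ENT::Entropy(TS::Join(X, Z)) + ENT::Entropy(TS::Join(Y, Z))
//        - (ENT::Entropy(Z) + ENT::Entropy(TS::Join({ X, Y, Z })))
```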
