The entropy \(H\) of a source is equal to the expected (i.e. average) information content of its messages, where the \(i\)-th message occurs with probability \(p_{i}\) and carries \(k_{i}=\log_{2}(1/p_{i})\) bits of information:
\(\displaystyle H=\sum_{i=1}^{N}p_{i}k_{i}=\sum_{i=1}^{N}p_{i}\log_{2}(1/p_{i})\)
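As a minimal sketch of this formula (the function name `entropy` and the example distributions are my own, not from the source), the sum can be computed directly over a list of probabilities:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits of a source with the given message probabilities."""
    # Each message with probability p contributes p * log2(1/p) bits;
    # a message with p = 0 contributes nothing, since p * log2(1/p) -> 0 as p -> 0.
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

# A fair coin (two messages, p = 1/2 each) yields exactly 1 bit per message:
print(entropy([0.5, 0.5]))  # 1.0
# A biased coin carries less information per message on average:
print(entropy([0.9, 0.1]))  # ~0.469
```

The fair coin maximizes the average: any skew toward one message makes the source more predictable and lowers \(H\).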