Abhijeet Mulgund's Personal Webpage

Differential Entropy

Last updated Nov 1, 2022

# Definition

Let $X$ be a Continuous Random Variable with Probability Density Function $f_{X} : \mathbb{R} \to \mathbb{R}_{\geq 0}$. Then the Differential Entropy, $\mathbb{H}(X)$, is

$$\mathbb{H}(X) = \mathbb{E}(-\log f_{X}) = -\int\limits_{\mathbb{R}} \log (f_{X}(x))\, f_{X}(x)\, dx $$
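As a sanity check on the definition, the integral can be approximated numerically. The sketch below (a minimal illustration, not part of the original note) applies the trapezoidal rule to a standard normal density, whose differential entropy is known in closed form to be $\frac{1}{2}\log(2\pi e) \approx 1.4189$ nats:

```python
import numpy as np

def differential_entropy(pdf, lo, hi, n=100_000):
    """Approximate H(X) = -integral of f(x) log f(x) dx on [lo, hi]
    via the trapezoidal rule, using the convention 0 log 0 = 0."""
    x = np.linspace(lo, hi, n)
    f = pdf(x)
    integrand = np.zeros_like(f)
    mask = f > 0  # enforce 0 log 0 = 0 by skipping zero-density points
    integrand[mask] = -f[mask] * np.log(f[mask])
    # Trapezoidal rule, written out explicitly for portability
    return float(np.sum((integrand[:-1] + integrand[1:]) / 2 * np.diff(x)))

# Standard normal density; H = 0.5 * log(2*pi*e) ≈ 1.4189 nats
phi = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
print(differential_entropy(phi, -10.0, 10.0))
```

Truncating the domain to $[-10, 10]$ is harmless here because the Gaussian tails contribute negligibly to the integral.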

# Remarks

  1. We apply the convention that $0 \log 0 = 0$, justified by the limit $\lim_{t \to 0^{+}} t \log t = 0$.

# Examples

  1. Differential Entropy of Continuous Uniform Random Variable
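As a sketch of the linked example, for $X \sim \mathrm{Uniform}(a, b)$ the density is constant, $f_{X}(x) = \frac{1}{b - a}$ on $[a, b]$, and the defining integral collapses:

$$\mathbb{H}(X) = -\int\limits_{a}^{b} \frac{1}{b-a} \log\left(\frac{1}{b-a}\right) dx = \log(b - a)$$

Note that this is negative whenever $b - a < 1$, one way differential entropy differs from the discrete Shannon Entropy.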

# Other Outlinks