Differential Entropy
# Definition
Let $X$ be a continuous random variable with probability density function $f_{X} : \mathbb{R} \to \mathbb{R}_{\geq 0}$. The differential entropy of $X$, written $\mathbb{H}(X)$, is
$$\mathbb{H}(X) = \mathbb{E}\left[- \log f_{X}(X)\right] = - \int_{\mathbb{R}} f_{X}(x) \log \left(f_{X}(x)\right) \, dx$$
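As a sketch of how this definition can be checked numerically, the snippet below approximates the integral for a density supported (effectively) on a finite interval and compares it against the known closed form for a standard normal, $\mathbb{H}(X) = \tfrac{1}{2}\log(2\pi e)$. The function name `differential_entropy` and the truncation to $[-10, 10]$ are illustrative choices, not part of the definition.

```python
import numpy as np
from scipy.integrate import quad

def differential_entropy(pdf, lo, hi):
    """Approximate H(X) = -∫ f(x) log f(x) dx over [lo, hi]."""
    def integrand(x):
        fx = pdf(x)
        # Convention: 0 log 0 = 0, so zero-density points contribute nothing.
        return -fx * np.log(fx) if fx > 0 else 0.0
    value, _ = quad(integrand, lo, hi)
    return value

# Standard normal density; its differential entropy is (1/2) log(2πe) ≈ 1.4189.
gauss = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
h = differential_entropy(gauss, -10.0, 10.0)
```

Truncating the integration range is harmless here because the Gaussian tails beyond $\pm 10$ are negligible; for heavier-tailed densities the interval would need to be wider.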
# Remarks
- We adopt the convention $0 \log 0 = 0$, so points where $f_{X}$ vanishes contribute nothing to the integral.