I am studying information theory. When you start learning information theory, you encounter entropy early on.
The familiar definition is for discrete random variables, but there is also a continuous version, called differential entropy.
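As a quick sketch of the discrete case, here is a minimal Python function for Shannon entropy (the function name and examples are my own, for illustration):

```python
import math

def shannon_entropy(probs):
    # Discrete entropy H(X) = -sum_i p_i * log2(p_i), measured in bits.
    # Terms with p_i = 0 are skipped, since p * log p -> 0 as p -> 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))   # -> 1.0
# A uniform choice among 4 outcomes carries 2 bits:
print(shannon_entropy([0.25] * 4))   # -> 2.0
```

Differential entropy replaces the sum with an integral over a density, h(X) = -∫ f(x) ln f(x) dx, and unlike the discrete version it can be negative.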
If the mean and variance of a distribution are fixed, the distribution that maximizes differential entropy is the Gaussian.
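We can spot-check this claim with closed-form entropies (in nats): a Gaussian with variance σ² has h = ½ ln(2πeσ²), while a uniform distribution matched to the same variance has width √(12σ²) and entropy ln of that width. A small sketch comparing the two (function names are mine):

```python
import math

def gaussian_diff_entropy(sigma2):
    # Differential entropy of N(mu, sigma^2): 0.5 * ln(2 * pi * e * sigma^2), in nats.
    return 0.5 * math.log(2 * math.pi * math.e * sigma2)

def uniform_diff_entropy(sigma2):
    # Uniform on [a, b] has variance (b - a)^2 / 12 and entropy ln(b - a),
    # so matching variance sigma^2 gives width sqrt(12 * sigma^2).
    return math.log(math.sqrt(12 * sigma2))

for sigma2 in (0.5, 1.0, 4.0):
    g = gaussian_diff_entropy(sigma2)
    u = uniform_diff_entropy(sigma2)
    print(f"sigma^2={sigma2}: gaussian={g:.4f} > uniform={u:.4f}")
```

For every variance, the Gaussian comes out larger (for σ² = 1 the values are about 1.419 vs 1.242 nats), consistent with the maximum-entropy property.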
Entropy is also widely used in data analysis; see, for example:
https://arxiv.org/pdf/1503.05638
Comment: Entropy looks interesting to me. It is a useful technique for analyzing various kinds of data.