Mutual information is introduced as an entropy-based measure of dependence between random variables. It is defined as the sum of the two variables' individual entropies minus their joint entropy, I(X;Y) = H(X) + H(Y) - H(X,Y). Mutual information has various applications, including serving as an association measure between genomic features and outcomes, providing a distance measure for clustering and for detecting epistatic interactions, and constructing outcome-guided mutual information networks for prediction. Challenges with mutual information include handling noise in continuous data and assessing statistical significance while accounting for multiple testing.
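As a minimal sketch of the definition above, the following Python snippet estimates I(X;Y) for two discrete sequences by building a joint frequency table and applying I(X;Y) = H(X) + H(Y) - H(X,Y). The example data (a genotype-like feature and a binary outcome) and the function name are illustrative assumptions, not part of the original text.

```python
import numpy as np

def mutual_information(x, y):
    """Estimate I(X;Y) = H(X) + H(Y) - H(X,Y) for two discrete sequences."""
    x, y = np.asarray(x), np.asarray(y)
    x_levels, y_levels = np.unique(x), np.unique(y)
    x_idx = {v: i for i, v in enumerate(x_levels)}
    y_idx = {v: i for i, v in enumerate(y_levels)}

    # Joint distribution from co-occurrence counts
    joint = np.zeros((len(x_levels), len(y_levels)))
    for xi, yi in zip(x, y):
        joint[x_idx[xi], y_idx[yi]] += 1
    joint /= joint.sum()

    def entropy(p):
        p = p[p > 0]                  # ignore zero-probability cells
        return -np.sum(p * np.log2(p))

    h_x = entropy(joint.sum(axis=1))  # marginal entropy of X
    h_y = entropy(joint.sum(axis=0))  # marginal entropy of Y
    h_xy = entropy(joint.ravel())     # joint entropy of (X, Y)
    return h_x + h_y - h_xy

# Hypothetical example: genotype-like feature vs. binary outcome
genotype = ["AA", "AG", "GG", "AA", "AG", "AA", "GG", "AG"]
outcome  = [0, 1, 1, 0, 1, 0, 1, 1]
print(mutual_information(genotype, outcome))
```

For continuous data, one would first discretize (or use a kernel or nearest-neighbor estimator), and significance of an observed mutual information value is often assessed by permuting one variable and recomputing the statistic, with a multiple-testing correction applied across many feature-outcome pairs.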