The “information” entropy, or Shannon entropy, is used in our research to quantify the dispersion of a probability distribution. In other words, entropy allows us to quantify the level of cell-to-cell heterogeneity in gene expression and the phenotypic heterogeneity of cell populations. Entropy is preferable to other statistical proxies, such as the coefficient of variation (CV) and the Fano factor, because it is independent of the mean of the distribution and is applicable to multimodal distributions. According to information theory, the comparative analysis of the entropies of several distributions also enables us to compute the amount of information transmitted from the environment to the cell, or from one gene to another within the same cell.
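The two quantities described above can be sketched numerically. The snippet below is an illustrative example, not the analysis pipeline used in the research: `shannon_entropy` computes the Shannon entropy (in bits) of a discrete expression distribution estimated from counts, and `mutual_information` uses the standard identity I(X;Y) = H(X) + H(Y) − H(X,Y) to quantify the information shared between two variables (e.g. an environmental signal and a gene's expression level). The example distributions are hypothetical.

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon entropy (bits) of a discrete distribution given
    observed counts or probabilities; zero-probability bins are ignored."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()          # normalize to a probability distribution
    p = p[p > 0]             # 0 * log(0) is taken as 0
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """Mutual information (bits) from a joint distribution given as a
    2-D array of counts or probabilities: I = H(X) + H(Y) - H(X, Y)."""
    j = np.asarray(joint, dtype=float)
    j = j / j.sum()
    return (shannon_entropy(j.sum(axis=1))    # H(X), marginal over rows
            + shannon_entropy(j.sum(axis=0))  # H(Y), marginal over columns
            - shannon_entropy(j.ravel()))     # H(X, Y)

# Hypothetical single-cell expression histograms with the same mean:
# entropy is well defined for both the unimodal and the bimodal case,
# whereas mean-based proxies such as the CV can be misleading for the latter.
unimodal = np.array([1, 4, 10, 4, 1])   # one homogeneous population
bimodal  = np.array([8, 2, 0, 2, 8])    # two subpopulations

print(shannon_entropy(unimodal))
print(shannon_entropy(bimodal))

# Hypothetical joint distribution of (environmental state, expression level):
# a diagonal joint means expression tracks the environment perfectly.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))
```

For a uniform distribution over n states the entropy is log2(n) bits, and for a deterministic outcome it is 0 bits, which gives a quick sanity check on the implementation.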