From economic inequality and species diversity to power laws and the analysis of multiple trends and trajectories, diversity within systems is a major issue for science. Part of the challenge is measuring it. Shannon entropy H has been used to rethink diversity within probability distributions, based on the notion of information. However, Shannon's approach has two major limitations. First, it cannot be used to compare diversity distributions that have different levels of scale. Second, it cannot be used to compare parts of diversity distributions to the whole. To address these limitations, we introduce a renormalization of probability distributions based on the notion of case-based entropy C_c as a function of the cumulative probability c. Given a probability density p(x), C_c measures the diversity of the distribution up to a cumulative probability of c, by computing the length or support of an equivalent uniform distribution that has the same Shannon information as the conditional distribution p̂_c(x) up to cumulative probability c. We illustrate the utility of our approach by renormalizing and comparing three well-known energy distributions in physics, namely the Maxwell-Boltzmann, Bose-Einstein, and Fermi-Dirac distributions for the energy of subatomic particles. The comparison shows that C_c is a vast improvement over H, as it provides a scale-free comparison of these diversity distributions and also allows for comparison between parts of these diversity distributions.
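The core computation described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's exact formulation: it assumes C_c is obtained by truncating a discrete distribution at cumulative probability c, renormalizing to get the conditional distribution, computing its Shannon entropy, and exponentiating to recover the support of the Shannon-equivalent uniform distribution. The function name `case_based_diversity` and the 8-state test distribution are illustrative choices.

```python
import numpy as np

def case_based_diversity(p, c):
    """Support of the uniform distribution with the same Shannon
    information as the conditional distribution of p truncated at
    cumulative probability c (a sketch of the idea in the abstract)."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                      # normalize to a probability distribution
    cum = np.cumsum(p)
    k = np.searchsorted(cum, c) + 1      # smallest prefix reaching cumulative prob c
    q = p[:k] / p[:k].sum()              # conditional (renormalized) distribution
    H_c = -np.sum(q * np.log(q))         # Shannon entropy in nats
    return np.exp(H_c)                   # length of the equivalent uniform support

# For a uniform distribution over 8 states, the full-distribution
# diversity recovers the number of states, and truncating at c = 0.5
# recovers half of them.
uniform = np.ones(8) / 8
print(case_based_diversity(uniform, 1.0))   # ≈ 8
print(case_based_diversity(uniform, 0.5))   # ≈ 4
```

Because the diversity is expressed as a count of equiprobable states rather than in bits or nats, values of C_c computed for distributions with different scales, or for different parts of the same distribution, can be compared directly.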