# How To Find the Range?

In computer programming, the range of an array is the span between its smallest and largest values. In statistics, the range is defined as the difference between the maximum and minimum values of a dataset. It shows how spread out the values are.

To Calculate the Range:
First, identify the smallest and largest values in the set. List out all the numbers in the dataset so the extremes are easy to spot. For example, if the dataset contains the numbers 25, 46, 19, 34, 45 and 14, write them in ascending order; this makes it easy to identify the lowest and highest values (and also helps when finding the mean, median and mode of the dataset):
14, 19, 25, 34, 45 and 46
In this example, 14 is the lowest number and 46 is the highest.
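The sorting step above can be sketched in a few lines of Python (the variable names are just illustrative):

```python
data = [25, 46, 19, 34, 45, 14]

# Sort a copy of the data into ascending order.
ordered = sorted(data)
print(ordered)  # [14, 19, 25, 34, 45, 46]

# After sorting, the extremes sit at the two ends of the list.
lowest, highest = ordered[0], ordered[-1]
print(lowest, highest)  # 14 46
```

In practice Python's built-in `min()` and `max()` find the extremes directly, but sorting first matches the by-hand method described above and is also useful for the median.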

Once you have identified them, find the difference between the highest and the lowest number in the dataset.
Subtracting the lowest number from the highest in the example above:
46 - 14 = 32, which is the range of the set.
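The whole calculation reduces to one subtraction, so a small helper function is enough (`data_range` is just an illustrative name, chosen to avoid shadowing Python's built-in `range`):

```python
def data_range(values):
    """Range of a dataset: largest value minus smallest value."""
    return max(values) - min(values)

print(data_range([25, 46, 19, 34, 45, 14]))  # 32
```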

Need for a Range in Statistics:
In statistics, it is essential to measure the spread, or variability, of data. The range and the standard deviation are two common measures of variability, and the range is the simplest to calculate. The main reason for measuring spread is to relate it to the central tendency: it indicates how well the mean represents the data. If the spread is large, the mean does not summarise the data as well, because the values differ widely from one another.

We can also use the range rule of thumb rather than the full standard deviation formula: it estimates the standard deviation as roughly one quarter of the range.
The range also appears in a box and whisker plot. When the whiskers are drawn at the minimum and maximum values, the distance from one whisker end to the other is the range.
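The whisker endpoints and box edges can be computed directly, again with the standard `statistics` module (note that this assumes whiskers drawn at the minimum and maximum; some box plots instead cap whiskers at 1.5 times the interquartile range):

```python
import statistics

data = [14, 19, 25, 34, 45, 46]

# Whisker endpoints for a min/max box plot.
low, high = min(data), max(data)

# Box edges and centre line: first quartile, median, third quartile.
q1, median, q3 = statistics.quantiles(data, n=4)

# Whisker-to-whisker length equals the range.
print(high - low)  # 32
print(median)      # 29.5
```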