When you are looking at a sequence of related numbers (for example, the price of a single stock over time, the height of all students in a classroom, or how many breakfasts you will get out of a box of a particular cereal), the “variance” is a measure of how spread out the numbers are around their average. The higher the variance, the farther most of the numbers sit from the group’s average.
The technical definition is as follows:
Consider the random variable X. The variance of X is defined as the expected value of the squared difference between each value of X and the mean.
Var(X) = E[(X – μ)^2]
where μ is the average of all the numbers you’re looking at.
To calculate this, imagine you have a set of 10 numbers:
First, find the average of these 10. In this case, the average is 9.8.
Next, subtract 9.8 from each number in your set:
This is how far each of these numbers is from the mean. Now square each number:
The average of this set of squared numbers is your variance:
Var(X) = 12.76
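The steps above can be sketched in Python. The article’s original set of 10 numbers isn’t shown here, so the values below are hypothetical stand-ins (they happen to average 9.8 like the article’s set, but their variance differs from 12.76):

```python
# A hypothetical set of 10 numbers (stand-ins for the article's data).
data = [5, 7, 8, 9, 10, 10, 11, 12, 12, 14]

# Step 1: the mean (average) of the set.
mu = sum(data) / len(data)  # 9.8 for this set

# Step 2: how far each number is from the mean.
deviations = [x - mu for x in data]

# Step 3: square each deviation so every term is non-negative.
squared = [d ** 2 for d in deviations]

# Step 4: the variance is the average of the squared deviations.
variance = sum(squared) / len(squared)

print(mu)                    # 9.8
print(round(variance, 2))    # 6.36
```

This is the population variance (dividing by the full count n); when estimating from a sample, statisticians often divide by n − 1 instead.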
Why Do I Take A Square, Then Average?
We already know that the average of this sample is 9.8, and we subtracted 9.8 from every number. The deviations above the mean exactly cancel the deviations below it, so if we don’t square, the average of X – μ always comes out to 0, no matter how large the sample grows (using the classroom example, imagine looking at bigger and bigger class sizes).
By squaring, we make every one of these “errors” non-negative so they cannot cancel each other out, and we can compare the variance of many different kinds of samples.
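A quick sketch makes the cancellation concrete, again using a hypothetical set of numbers:

```python
data = [5, 7, 8, 9, 10, 10, 11, 12, 12, 14]  # hypothetical sample
mu = sum(data) / len(data)

# Without squaring, deviations above and below the mean cancel out:
raw = sum(x - mu for x in data)  # essentially 0 (up to float rounding)

# After squaring, every term is non-negative, so nothing cancels:
squared = sum((x - mu) ** 2 for x in data)

print(abs(raw) < 1e-9)  # True
print(squared > 0)      # True
```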
What Is Variance Used For?
Comparing variances is central to technical analysis; it is what stochastic oscillators are built on, and it underpins nearly every field of statistics. If you want to analyze data that contains any randomness, looking at variance is one of the most important ways to reach valid conclusions.
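As a small illustration of comparing variances, here are two hypothetical daily price series with the same average (10) but very different spreads around it. Python’s standard library provides `statistics.pvariance` for the population variance used in this article:

```python
import statistics

# Two hypothetical price series, both averaging 10.
steady = [9.5, 10.0, 10.5, 10.0, 9.8, 10.2, 10.0, 10.0, 9.9, 10.1]
swingy = [4.0, 16.0, 7.0, 13.0, 10.0, 2.0, 18.0, 11.0, 9.0, 10.0]

# Same mean, very different variance: the second series is far riskier.
print(statistics.pvariance(steady))  # small (0.06)
print(statistics.pvariance(swingy))  # much larger (22.0)
```

Even though both series have the same average, their variances tell you which one swings more wildly, which is exactly the kind of comparison technical analysis relies on.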