Definition of Variance (Statistics and Mathematics)
In probability theory, the branch of mathematics that studies random phenomena (those whose outcome cannot be predicted), the variance of a random variable is a measure of variability or dispersion. It indicates how spread out a distribution is, showing through a single value how far the scores of a variable lie from the average. The higher the value, the greater the variability; the lower the value, the more homogeneous the data.
The concept is used especially in statistics, where it denotes the average of the squared deviations of a random variable from its mean value.
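This definition, the mean of the squared deviations from the mean, can be sketched directly in code. The following is a minimal illustration (the function name and sample data are chosen for this example, not taken from the article):

```python
# Variance as the average of the squared deviations from the mean
# (population variance: divide by n, the number of observations)
def variance(data):
    mean = sum(data) / len(data)
    return sum((x - mean) ** 2 for x in data) / len(data)

# Example: the mean of this sample is 5, so the squared deviations
# are 9, 1, 1, 1, 0, 0, 4, 16, which average to 4.0
print(variance([2, 4, 4, 4, 5, 5, 7, 9]))  # 4.0
```

Note that this is the population variance (dividing by n); the sample variance divides by n − 1 instead.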
The English scientist Sir Ronald Aylmer Fisher coined the concept in the first decades of the last century. It was at an agricultural research station that he began his studies of the analysis of variance: while examining long-term records of cultivated crops, the concept arose.
Fisher is considered the father of modern statistics, and for his contributions to this and many other fields he is regarded as a genius. He received many distinctions, including a knighthood and with it the title of Sir.
Analysis of variance refers to a collection of statistical models, together with their associated procedures, in which the observed variance is partitioned into different components.
Closely associated with the concept of variance is the standard deviation, which represents the magnitude of the dispersion of interval- and ratio-scale variables. It is also widely used in descriptive statistics; to calculate it, one starts from the variance and takes its square root.
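The relationship described above, the standard deviation as the square root of the variance, can be sketched as follows (function names and sample data are illustrative, not from the article):

```python
import math

def variance(data):
    # population variance: mean of squared deviations from the mean
    mean = sum(data) / len(data)
    return sum((x - mean) ** 2 for x in data) / len(data)

def std_dev(data):
    # the standard deviation is the square root of the variance
    return math.sqrt(variance(data))

# For this sample the variance is 4.0, so the standard deviation is 2.0
print(std_dev([2, 4, 4, 4, 5, 5, 7, 9]))  # 2.0
```

Because the variance is expressed in squared units, taking the square root returns the dispersion measure to the original units of the variable, which is why the standard deviation is often preferred for description.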