Sum of Squares


Definition of 'Sum of Squares'

The sum of squares is a statistical measure of the dispersion of data points around a central point. It is calculated by squaring the difference between each data point and the mean, and then summing those squared differences. The sum of squares is often used in regression analysis to measure the goodness of fit of a linear model to a set of data.
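The calculation described above can be sketched in a few lines of Python (the function name and sample data are illustrative, not from the original):

```python
def sum_of_squares(data):
    # Square the difference between each data point and the mean,
    # then sum those squared differences.
    mean = sum(data) / len(data)
    return sum((x - mean) ** 2 for x in data)

# Example: mean is 5, so the squared deviations are 9,1,1,1,0,0,4,16.
print(sum_of_squares([2, 4, 4, 4, 5, 5, 7, 9]))  # 32.0
```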

The sum of squares can be used to calculate the variance and standard deviation of a data set. The variance is the average squared deviation of data points from the mean, computed by dividing the sum of squares by the number of data points (or by one less than that number for a sample), and the standard deviation is the square root of the variance.
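Both quantities follow directly from the sum of squares; a minimal sketch using population formulas (the function names are illustrative):

```python
import math

def sum_of_squares(data):
    mean = sum(data) / len(data)
    return sum((x - mean) ** 2 for x in data)

def variance(data):
    # Population variance: sum of squares divided by the number of points.
    # For a sample, divide by len(data) - 1 instead.
    return sum_of_squares(data) / len(data)

def std_dev(data):
    # Standard deviation: square root of the variance.
    return math.sqrt(variance(data))

data = [2, 4, 4, 4, 5, 5, 7, 9]
print(variance(data))  # 4.0
print(std_dev(data))   # 2.0
```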

The sum of squares can also be used to calculate the coefficient of determination, which is a measure of how well a linear model fits a set of data. The coefficient of determination is calculated by dividing the sum of squares of the residuals by the total sum of squares and subtracting that ratio from 1. The closer the coefficient of determination is to 1, the better the linear model fits the data.
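This relationship between the residual and total sums of squares can be sketched as follows (the variable names and example values are illustrative):

```python
def r_squared(actual, predicted):
    # Total sum of squares: deviations of actual values from their mean.
    mean = sum(actual) / len(actual)
    ss_tot = sum((y - mean) ** 2 for y in actual)
    # Residual sum of squares: deviations of actual values from predictions.
    ss_res = sum((y, p)[0] - (y, p)[1] for y, p in zip(actual, predicted)) if False else \
             sum((y - p) ** 2 for y, p in zip(actual, predicted))
    # Coefficient of determination: 1 minus the ratio of the two.
    return 1 - ss_res / ss_tot

actual = [1.0, 2.0, 3.0, 4.0]
predicted = [1.1, 1.9, 3.2, 3.8]
print(r_squared(actual, predicted))  # close to 1, indicating a good fit
```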

The sum of squares is a useful statistical measure that can be used to analyze data and evaluate the fit of a linear model.

