Degrees of Freedom

In statistics, the degrees of freedom (df) of a statistic are the number of values in its final calculation that are free to vary; in other words, the number of independent pieces of information used to calculate it. Degrees of freedom matter in statistical inference because they determine the sampling distribution of the test statistic and, therefore, the probability of making a Type I error.

How the degrees of freedom are calculated depends on the statistic. For example, the degrees of freedom for the t-statistic equal the number of observations minus the number of parameters estimated in the model (n - 1 for a one-sample t-test, since only the mean is estimated). The degrees of freedom for the chi-square statistic equal the number of cells in the contingency table minus the number of independent constraints, which works out to (r - 1)(c - 1) for a table with r rows and c columns.
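The two rules above can be sketched in a few lines of Python (the sample values and the contingency table here are made up purely for illustration):

```python
# t-statistic: observations minus estimated parameters.
sample = [5.1, 4.9, 5.3, 5.0, 4.8, 5.2]
df_t = len(sample) - 1          # one parameter (the mean) is estimated

# Chi-square test of independence on an r x c contingency table:
# once the row and column totals are fixed, only (r - 1)(c - 1)
# cells can vary freely.
table = [[10, 20, 15],
         [30, 40, 25]]
rows, cols = len(table), len(table[0])
df_chi2 = (rows - 1) * (cols - 1)

print(df_t)     # 5
print(df_chi2)  # 2
```
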

The degrees of freedom are important because they determine the shape of the sampling distribution of the test statistic. For example, the t-distribution is a symmetric, bell-shaped curve centered at 0, but it is more spread out, with heavier tails, than the standard normal distribution. The fewer the degrees of freedom, the heavier the tails; as the degrees of freedom increase, the t-distribution approaches the standard normal.

The degrees of freedom also determine the probability of making a Type I error, which occurs when a true null hypothesis is rejected. The probability of making a Type I error is called the significance level. It is often set at 0.05, meaning there is a 5% chance of rejecting the null hypothesis when it is in fact true.
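The 5% figure can be verified with a small Monte Carlo sketch: draw many samples from a standard normal (so the null hypothesis "mean = 0" is true), test each one, and count how often the test wrongly rejects. A z-test with known variance is used here to keep the code stdlib-only; the sample size and trial count are arbitrary:

```python
import random

random.seed(0)

alpha = 0.05
z_crit = 1.96               # two-sided critical value for alpha = 0.05
n, trials = 30, 5000
rejections = 0

for _ in range(trials):
    sample = [random.gauss(0, 1) for _ in range(n)]
    mean = sum(sample) / n
    z = mean / (1 / n ** 0.5)    # known sigma = 1, so this is a z-test
    if abs(z) > z_crit:
        rejections += 1          # a Type I error: the null is true here

rate = rejections / trials       # empirical Type I error rate, near 0.05
```
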