Find the difference between the actual and estimated data points in a sample. For example, if you have developed an algorithm for predicting stock prices, the difference between the actual stock price and the predicted price would be the error. If your algorithm predicts $12, $15, $20, $22 and $24 as prices for five stocks on a particular day, and the actual prices are $13, $17, $18, $20 and $24, respectively, then the errors are $1 ($13 - $12), $2 ($17 - $15), -$2 ($18 - $20), -$2 ($20 - $22) and zero ($24 - $24), respectively.
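As a minimal sketch of this step in Python, using the example prices above (the list names are illustrative, not from the original):

```python
predicted = [12, 15, 20, 22, 24]  # predicted stock prices in dollars
actual = [13, 17, 18, 20, 24]     # actual stock prices in dollars

# Error for each stock: actual price minus predicted price
errors = [a - p for a, p in zip(actual, predicted)]
print(errors)  # [1, 2, -2, -2, 0]
```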
Compute the sum of the squares of the errors. First, square each error, and then add the squares together. Squaring ensures that negative and positive errors do not cancel each other out. Continuing with the example, the sum of the squares of the errors is 13 (1 + 4 + 4 + 4 + 0).
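Continuing the sketch, this step squares each error before summing:

```python
predicted = [12, 15, 20, 22, 24]
actual = [13, 17, 18, 20, 24]

# Square each error, then sum: 1 + 4 + 4 + 4 + 0 = 13
sum_squared_errors = sum((a - p) ** 2 for a, p in zip(actual, predicted))
print(sum_squared_errors)  # 13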
Divide the sum of the squares of the errors by the number of data points to calculate the mean square error. To conclude the example, the mean square error is equal to 2.6 (13 / 5).
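Putting all three steps together, a short self-contained sketch (again with illustrative names) computes the mean square error directly:

```python
predicted = [12, 15, 20, 22, 24]
actual = [13, 17, 18, 20, 24]

# Mean square error: sum of squared errors divided by the number of data points
mse = sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
print(mse)  # 2.6
```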