Monday, April 1, 2013

The Cost of an Error

Not all errors are created equal. What happens when a shortstop boots a ball but the runner doesn’t subsequently score? What happens when a third baseman juuuuuuuuuuuuuust misses reaching that ball? Combing through play-by-play data can tell us whether individual errors result in runs, but there’s a simpler, if less-than-perfect, method. We’ll begin by defining how many runs translate into a win, which has been well established by Bill James and others: simply divide the runs scored in a given year by the wins. For 2012, this means 21,017 runs divided by 2,430 wins, giving a value of approximately 8.65 runs per win. This chart shows the historical value from 1901-2012:

[Chart: runs per win, 1901-2012]
The historical value has generally been considered to be near 10 runs per win; the average over this span is 8.85. The higher the value, the greater the level of offense; the lower the value, the more that pitching dominates. This works well when viewing the data in terms of these eras (all years approximate):
1901-1919: The Dead Ball Era
1920-1945: The Lively Ball Era (the dip at the end is due to World War II)
1946-1960: The No Label Era
1961-1993: The Expansion Era (with a Mini Dead Ball Era from 1962-1968)
1994-2007: The “Enhanced Offense” Era
2008-present: The “Back To Normal” Era
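
For anyone who wants to run the numbers, here is a minimal Python sketch of the runs-per-win calculation described above. The 2012 totals are the ones cited in this post; the season_totals dictionary and runs_per_win function are just names chosen for this example, and other seasons would need their league-wide totals filled in from a source such as the Lahman database.

```python
# Minimal sketch of the runs-per-win calculation described above.
# Only the 2012 totals (21,017 runs, 2,430 wins) come from this post;
# other seasons would be filled in from a source such as the Lahman database.

season_totals = {
    # year: (league-wide runs scored, league-wide wins)
    2012: (21_017, 2_430),
}

def runs_per_win(runs, wins):
    """League-wide runs scored divided by league-wide wins."""
    return runs / wins

for year, (runs, wins) in sorted(season_totals.items()):
    print(f"{year}: {runs_per_win(runs, wins):.2f} runs per win")
# prints "2012: 8.65 runs per win"
```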

2012 crossed underneath that 112-year average line, and when that occurs, other facets of the game, like base running and defense, come to the forefront. With regard to defense, if we can establish how many runs equal a win, we can also see how many runs an error costs in a similar fashion, which is simply unearned runs divided by errors. Before I show that, this next chart shows the percent of runs that were unearned from 1901-2012:

[Chart: percent of runs that were unearned, 1901-2012]

This is one piece of evidence that defense DOES continue to improve, whether through better technique, better positioning, or, more likely, a combination of those and other factors. Even setting aside the rapid improvement from 1901-1920, the share of runs that are unearned has still fallen by roughly a factor of 2 since 1920, which is tremendous. This next chart shows the ratio of unearned runs to errors:

[Chart: ratio of unearned runs to errors, 1901-2012]

Most of the fluctuation is due to the scale; over this 100+ year span, the value of an error lies between .55 and .65 runs, or in other words, roughly every two errors lead to a run. Of course, this isn’t game-specific, but it helps show what errors can cost a team.
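
The two ratios behind these charts are just as easy to compute. Below is a minimal sketch in the same spirit as the earlier one; the function names are mine, the 21,017 total runs is the 2012 figure cited above, and the unearned-run and error counts are made-up placeholders rather than actual 2012 totals.

```python
# Minimal sketch of the two defensive ratios discussed above.

def unearned_run_share(unearned_runs, total_runs):
    """Fraction of all runs scored that were unearned."""
    return unearned_runs / total_runs

def runs_per_error(unearned_runs, errors):
    """Unearned runs divided by errors: a rough run cost per error."""
    return unearned_runs / errors

# 21,017 total runs is the 2012 figure cited above; the unearned-run and
# error counts below are placeholders for illustration only.
unearned, total_runs, errors = 1_700, 21_017, 2_800

print(f"Unearned-run share: {unearned_run_share(unearned, total_runs):.1%}")
print(f"Runs per error:     {runs_per_error(unearned, errors):.2f}")
```

With actual league-wide totals plugged in, the runs-per-error number should land in the .55-.65 range described above.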
