Wednesday, September 4, 2013


One of the interesting side benefits of following MLB Network's Brian Kenny's #KillTheWin movement is the discussion of other stats people believe should be eliminated. Baseball is rife with numbers that no longer have meaning but have become so entrenched that removing them would require an act of Congress (don't try the White House petition route--it won't work). One often mentioned is the error.

This chart shows the number of errors per game from 1871-2013 (all data through Monday, September 2nd):

The introduction of the glove can almost be identified with precision, right around 1880. Leagues came and went in the late 19th century (the Union Association in 1884, the American Association from 1882-1891 and the Players League in 1890), and by the time the National League contracted from 12 to 8 teams after the 1899 season and the American League formed in 1901, only the best players were left and the number of errors per game had fallen to around 2. The error rate continued to decline, reaching around 1 per game by 1940. This next chart truncates the range and shows errors from 1950-2013:
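The arithmetic behind these charts is simple aggregation and division. A minimal sketch, assuming season totals shaped like the Lahman database's Teams table; the function name and tuple layout are my own illustration, not any library's API:

```python
from collections import defaultdict

def errors_per_game_by_year(team_seasons):
    """Aggregate league-wide errors per team-game by season.

    team_seasons: iterable of (year, errors, games) tuples, one per
    team-season -- e.g. the yearID, E and G columns of a Teams table.
    Returns {year: errors per team-game}.
    """
    errors = defaultdict(int)
    games = defaultdict(int)
    for year, e, g in team_seasons:
        errors[year] += e
        games[year] += g
    return {year: errors[year] / games[year] for year in sorted(errors)}
```

Plot the resulting dictionary by year and the features described here--the 1880s glove era, the post-1901 contraction, the mid-1970s drop--all show up.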

The error rate held relatively steady until the mid-1970s, when it dropped dramatically--about a 25% decline. Better conditioning, bigger gloves and improved field conditions certainly played a role, and the introduction and subsequent removal of artificial turf may have had an impact as well. Whatever the reasons, there's no disputing the current error rate is the lowest of all time.

The problem with the error is the subjective nature--after all, what IS an error?
1. A ball is hit to short and the shortstop goes 6 feet to his right, makes the grab but snaps off an errant throw--error or unsuccessful effort?
2. A ball is hit to first and the first baseman who will not be named can't be troubled to move more than one foot to his right and watches the ball roll into right field--error or lazy play?
I suspect 99 of 100 fans would call the first situation not an error and would think the second should be one, even though the first baseman never put a glove on the ball. That's what drives people crazy: the capricious manner in which errors are charged--in certain cases, effort is penalized and lack thereof overlooked.

I see it every time I update my play-by-play database. I enter data daily to measure things like how often a batter actually bats in his batting order slot (quick answer--not as often as you think), how well players do with runners in scoring position and the like. I have numerous double-checks to make sure the data is correct, and I check it against Baseball-Reference data once a week or so to correct my mistakes. Inevitably there are changes in the official scoring in which hits are turned into errors and vice versa, and these come after the fact, sometimes as long as a week after the play was made. This is a valid measure?
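That weekly double-check amounts to a diff between two scoring sources. A minimal sketch, assuming each source is a mapping from some play identifier to a scored outcome; the function name and the 'hit'/'error' labels are illustrative, not from any real feed:

```python
def scoring_changes(my_plays, reference_plays):
    """Return plays scored differently by two sources.

    Both arguments map a play id to an outcome string such as
    'hit' or 'error'. Each mismatch comes back as a
    (play_id, my_outcome, reference_outcome) tuple, ready to be
    reviewed and corrected by hand.
    """
    return [(pid, my_plays[pid], reference_plays[pid])
            for pid in sorted(my_plays)
            if pid in reference_plays
            and my_plays[pid] != reference_plays[pid]]
```

Run it a week apart against the same games and the after-the-fact scoring changes fall out as mismatches.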

Change IS coming. Just as PITCHf/x upended the way people view pitching and spawned entire sites like Brooks, FIELDf/x is already out there; the only question is how long before average Joes like me get access to it. I'm no John Dewan and don't ever plan on becoming one, but once advanced fielding data becomes mainstream and the TRUE range of fielders is understood, it will be revolutionary. In the end, which play is more dramatic?
1. A left fielder runs 25 feet to his left to make a diving grab on a screaming line drive
2. A left fielder stands and moves not one millimeter to make a routine fly out
It's a subjective question, but as baseball moves forward the left fielder will be shifted around depending on hitter, pitcher AND pitch being thrown. It already happens today, but what happens when Big Data comes through and illustrates tendencies? Hitting spray charts are already out there, and a shift of 3 feet in one direction or another, especially in the outfield, can be the difference between not just a hit and an out, but an extra-base hit and an out.

When that point comes, new metrics will be developed (they probably already are) along the lines of effective range--examples:
1. Just how far to his left, right, forward and back can a player really go?
2. Effective range percent (what percent of balls hit 10 feet or more from the fielder's starting position are turned into outs)
3. Positioning percent (how often was the player perfectly placed to make an out?)
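Metric 2 is straightforward to compute once batted-ball coordinates exist. A minimal sketch, assuming each fielding chance is recorded as the distance the fielder had to cover and whether he converted the out; the 10-foot threshold comes from the definition above, and everything else is my own illustration:

```python
def effective_range_pct(chances, threshold_ft=10.0):
    """Percent of balls requiring >= threshold_ft of movement
    that were turned into outs.

    chances: iterable of (distance_moved_ft, made_out) pairs,
    one per ball hit toward the fielder.
    Returns None when no chance met the threshold.
    """
    qualifying = [made_out for dist, made_out in chances
                  if dist >= threshold_ft]
    if not qualifying:
        return None
    return 100.0 * sum(qualifying) / len(qualifying)
```

Unlike the error, nothing here depends on an official scorer's judgment--only on where the ball went and where the fielder started.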
There are others I can't even imagine. It happens in every other industry in the world--old measures are used until new and better ones are developed. Much as the win is an anachronism from a bygone era, the error will soon join it--it's simply a matter of how long and how hard the baseball "purists" will fight the inevitable.
