
At feeds, signings, and other gatherings I happen to attend, it is clear that Voros McCracken's observations on pitching and defense still generate intense disbelief from many, if not most, baseball fans.

First of all, let us be clear on what Voros actually said. He initially claimed that pitchers have no control over whether balls in play turn into hits or outs; after more work, he refined his claim to say that the differences between major league pitchers are small, much smaller than commonly believed, and small enough to carry little useful information. It is convenient shorthand (an exaggeration, if you will) to continue with the original idea that there are no differences, and that the differences we do see can be attributed to luck. Here at BP, we'll describe a pitcher as being "hit-lucky," for instance; admittedly, from the data, you would have a hard time showing that it isn't luck.

Quite a few people have challenged Voros' original article and his subsequent follow-ups; the best of those challenges is probably Tom Tippett's. Even Tom's article, however, still shows that most pitchers have almost no effect on whether balls in play become hits. This runs so contrary to most people's expectations that some more explanation is in order.

First, I have to say that most people interpret the proposition in light of their own direct playing experience, which, for the vast majority of us, does not extend beyond Little League or perhaps high school baseball. They know from their own experience that pitchers have some ability to be harder to hit, and no statistics can convince them otherwise. The trick here is to understand that major league baseball is a different game from the one you played in high school. Some of the things we talk about on BP, such as the idea that a strikeout is no worse than any other out, are not universal baseball truths; they are only true when the skill levels involved are at or near major league levels. A strikeout is worse than other outs because it doesn't advance a runner and doesn't give the other team as much of a chance to make an error; it is better than other outs because it doesn't give the other team a chance to turn a double play. You have to get somewhere in the high minors before the ability of the defense, both to turn the double play and to avoid errors, is high enough to tip the scales from strikeouts being a really bad event to being no worse than other outs.

Hits per ball in play may be another one of those truths. Suppose that there is a clear ability to make batters hit the ball weakly, and that teams can recognize it; clearly, this would be a valuable ability for a pitcher to have. Other things being equal, it would give a pitcher an advantage, like height in the NBA. Assuming that teams can recognize it and select for it, you would produce a major league whose selected population is better than the pool it was drawn from: just as NBA teams are taller, on average, than NCAA teams (their principal recruiting pool), major league pitchers should be better than minor league pitchers, and you should be able to demonstrate a weeding out of the less able. Reaching the major leagues is a sensational example of Darwinian survival.

So: do they? The first part is easy to answer: do major league pitchers give up hits (per ball in play) at a lower rate than minor leaguers? And, ideally, does the rate go down in a steady progression with minor league level? At first glance, the answer is, "sort of."

All data from 1996-2004. BABIP = (H - HR) / (2.75*IP + H - HR - SO)
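For concreteness, here is a minimal sketch of that estimate in Python; the function name and the sample pitching line are my own illustration, and the 2.75 factor is taken directly from the formula above rather than derived.

def babip(ip, h, hr, so):
    """Estimate batting average on balls in play from a standard pitching line,
    approximating balls in play as 2.75*IP + H - HR - SO (the formula above)."""
    hits_in_play = h - hr
    balls_in_play = 2.75 * ip + h - hr - so
    return hits_in_play / balls_in_play

# Hypothetical pitcher: 200 IP, 210 H, 20 HR, 150 SO
print(round(babip(200, 210, 20, 150), 3))  # 0.322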


Major leagues  .309
Triple-A       .322 (International, PCL, American Association)
Double-A       .318 (Eastern, Southern, Texas)
High-A         .323 (California, Carolina, Florida)
Mid-A          .321 (Midwest, SAL)
Low-A          .327 (New York-Penn, Northwest)
Rookie         .348 (Pioneer, Appalachian)

The majors are lowest, the rookie leagues are highest, but the rest is muddled. It turns out, though, that the muddling is likely the result of another effect on balls in play, namely altitude. As we know from the Rockies, batting averages go up in thin air. If we cut the list of leagues down so that we're only talking about leagues that play at essentially sea level (losing the PCL, Texas, California, Northwest, and Pioneer leagues), we get the following modified list of BABIPs:


Majors   .309
IL+AA    .314
Eas+Sou  .316
Caro+FSL .316
SAL+MWL  .321
NY-P     .323
App      .340

That, folks, is a smooth progression. Major league pitchers do allow fewer hits per ball in play than minor league pitchers. However, that doesn't prove the pitchers are responsible; it could simply be that the fielders improve as you move up, not to mention the fields themselves (smoother fields make fielding easier). We need to see whether the pitchers themselves are being selected for this property.

To do that, I looked at every minor league from 1996 to 2000, and I simply divided the pitchers into two groups: those who played in the major leagues at some point through 2004, and those who didn't. One would expect the major league group to be better than the non-major group. Not 100% of the time; there's always a Ryan Anderson type who, despite being one of the best pitchers in his league, blows out his arm before pitching in the majors, and there are pitchers like Jorge Julio who were lousy minor league starters and only made it to the majors after switching to the bullpen.

The only massaging I'm doing to the numbers is to normalize them to a league with 9 hits, 1 home run, 3 walks, 6 strikeouts, and 4.5 runs per nine innings, so that I can add the various leagues together; just think of 6.6 strikeouts not as 6.6 literal strikeouts but as 10% above average, and you'll be fine. I set the reference BABIP figure to a nice round .300. The normalization is sketched below.
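A minimal sketch of that normalization, assuming simple proportional scaling of per-nine rates (the function and the sample numbers are my own illustration, not the article's):

# Scale a pitcher's per-nine rates so that his league's averages map onto
# a reference league of 9 H, 1 HR, 3 BB, 6 SO, 4.5 R per nine, and .300 BABIP.
REFERENCE = {"H": 9.0, "HR": 1.0, "BB": 3.0, "SO": 6.0, "R": 4.5, "BABIP": 0.300}

def normalize(pitcher_rates, league_rates):
    """Express a pitcher's rates relative to his league, on the reference scale."""
    return {stat: pitcher_rates[stat] * REFERENCE[stat] / league_rates[stat]
            for stat in REFERENCE}

# A pitcher 10% above his league's strikeout rate comes out at 6.6,
# whatever league he pitched in.
example = normalize(
    {"H": 8.5, "HR": 0.9, "BB": 2.8, "SO": 7.7, "R": 4.0, "BABIP": 0.295},
    {"H": 9.3, "HR": 1.1, "BB": 3.2, "SO": 7.0, "R": 4.8, "BABIP": 0.310},
)
print(round(example["SO"], 1))  # 6.6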

Triple-A

              IP     H     R    HR    BB    SO   BABIP
Majors     150067  8.88  4.36  0.97  2.92  6.10  .299
Non-majors  30387  9.60  5.21  1.13  3.39  5.52  .307

83% of the innings pitched in Triple-A were thrown by pitchers who either had pitched in the majors before or would pitch in the majors later. Because the major league group dominates the league, their numbers are close to the league average, while the non-major leaguers show up as substantially worse pitchers overall. As expected, they give up more runs, more homers, and more walks, strike out fewer batters, and give up more hits per ball in play.

Double-A

              IP     H     R    HR    BB    SO   BABIP
Majors      81760  8.68  4.20  0.94  2.81  6.30  .296
Non-majors  93872  9.28  4.76  1.05  3.16  5.74  .303

The major-league percentage of innings drops to 47% at Double-A, and now the two groups roughly straddle the league averages. Again, the major league pitchers are superior in every phase of performance.

High-A

              IP     H     R    HR    BB    SO   BABIP
Majors      57441  8.55  4.05  0.91  2.69  6.49  .296
Non-majors 144201  9.19  4.69  1.04  3.13  5.79  .302

The major league percentage is now only 28%, but the patterns are the same as before.

Mid-A

              IP     H     R    HR    BB    SO   BABIP
Majors      42045  8.64  4.15  0.94  2.74  6.34  .297
Non-majors 130670  9.12  4.62  1.02  3.09  5.88  .301

The major league percentage is down to 24%. The BABIP advantage is still present, but weaker.

Low-A

              IP     H     R    HR    BB    SO   BABIP
Majors      11355  8.38  3.97  0.90  2.73  6.44  .292
Non-majors  63029  9.11  4.60  1.02  3.05  5.92  .301

Rookie

              IP     H     R    HR    BB    SO   BABIP
Majors       6466  8.24  3.82  0.84  2.58  6.53  .291
Non-majors  47849  9.10  4.59  1.02  3.06  5.93  .301

The major-league percentages for the short-season leagues are down to 15% and 12% of the total; those will probably go up as more players from 2000 graduate to the majors in the years to come (overall, these four leagues had a graduation rate of 16% in 1996 and 1997, 15% in 1998 and 1999, but only 7.5% in 2000). That means the difference will probably narrow a little, since presumably the very best pitchers were the ones who rocketed through the system. Even so, the BABIP difference between the two groups is larger here than in any other league.

Overall, the results are clear. The pitchers who made the major leagues are, not surprisingly, better than their counterparts who did not, by every measure of pitching you care to name, including giving up fewer hits per ball in play. Looking at the data for all 72 league-seasons, there were six in which the non-major pitchers had a better BABIP than the major league pitchers, just as there were six in which the non-majors allowed fewer home runs. Strikeouts broke "wrong" once; walks never did. The margins were not as large (the major league pitchers were typically 10-15% better in home runs, walks, and strikeouts, but only about 3% better in BABIP), but they were present and consistent. Just as Tom Tippett concluded from looking at pitchers by the length of their major league careers, one has to say that BABIP looks like just as much of a skill as home run, walk, or strikeout rates.

One other thing. Looking at the league figures, there was a correlation of .64 between BABIP and runs allowed, a much stronger correlation than runs allowed has with either walks or strikeouts. There was, not surprisingly, a moderate correlation between runs allowed and home runs allowed, at .42; the biggest surprise may be that it isn't stronger. Pitchers who allow a higher BABIP have a strong tendency to give up more runs, something that any talent evaluator is going to notice pretty quickly, which makes it more likely to be a factor in their promotion or release.
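The correlation itself is straightforward to compute; here is a minimal sketch, with placeholder league figures rather than the article's actual league-by-league data:

# Correlate league-level BABIP with runs allowed per nine innings.
# The values below are placeholders for illustration only; the article
# reports r = .64 on the real league figures.
import numpy as np

league_babip = np.array([0.296, 0.301, 0.309, 0.314, 0.321, 0.298, 0.305])
league_runs_per_9 = np.array([4.1, 4.5, 4.7, 4.9, 5.2, 4.3, 4.6])

r = np.corrcoef(league_babip, league_runs_per_9)[0, 1]
print(f"correlation between league BABIP and R/9: {r:.2f}")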
