If you’ve been around Baseball Prospectus for more than a few minutes, you’ve inevitably heard someone talk about strikeouts and strikeout rate. Strikeouts are an odd bird; there’s loads of evidence that indicates that, if you had to pick only one metric, strikeout rate–or something encompassing it, like K/BB–is it for pitchers. Nothing’s more important. With hitters they seem to be almost completely meaningless, to the point where there’s virtually no correlation between team strikeout rate and run scoring. In fact, there’s a positive correlation between strikeouts and power, so if strikeouts are great for pitchers, they’re not that bad for hitters either. Everybody wins, so maybe things would be great if hitters just struck out every time they got out.

When you talk to people about strikeouts and their place in batting performance, the retort to the idea that they’re just another out is that putting the ball in play puts pressure on the defense and increases the chance that runners can advance. If you take two ballplayers with exactly the same statistics except that one strikes out every time and the other gets balls in play every time, the idea is that those two batters produce essentially the exact same offense. Now, jokes about Productive Outs aside, that’s not entirely true. It’s just that there are no two players who are that extreme–there certainly aren’t any teams that extreme–and the effect of putting the ball in play can be lost in the noise.

No matter how long you argue, the fact of the matter is that getting out on a ball in play is different from striking out. Getting back to our two hypothetical players–let’s call them Sam Slappy and Winston Whiff (hey, it’s late and I’m tired)–we can estimate the costs and benefits of their different approaches at the plate.

The first difference is that Slappy is going to get on base a few more times each season by virtue of reaching on an error, an event that actually counts against on-base percentage. Last year, batters reached base on an error 1,800 times. There were 94,118 fieldable balls last year–I’m removing hits since our two batters have the exact same stats–so approximately 98.1% of all “outs” are true outs, while in the other 1.9% of cases the batter reached on an error. There’s a slight issue with double plays, but we’ll come back to that in a minute.
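For anyone following along at home, here’s that error-rate arithmetic as a quick Python sketch (the counts are the league-wide totals quoted above; the variable names are mine):

```python
# Reach-on-error rate from the 2004 league-wide totals cited above.
roe = 1800          # times batters reached base on an error
fieldable = 94118   # fieldable balls, with hits removed
error_rate = roe / fieldable
print(f"{error_rate:.1%}")  # -> 1.9% of would-be outs were actually errors
```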

Assuming that both Slappy and Whiff notch 600 plate appearances and they put up nice league average stats (.260/.330/.440), that leaves right around 400 PA in which Slappy put the ball in play and Whiff did not. Looking at those numbers above, Slappy would reach base nearly eight more times over the course of the season, not an insignificant amount. Counting those ROEs as “hits,” Slappy now puts up a .273/.343/.453 line for an MLVr of .0511 instead of .0000, adding up to about seven runs over the course of the season or about 0.0175 runs per strikeout.
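The same back-of-the-envelope math gets us Slappy’s season-long bonus; note that the seven-run figure is taken straight from the MLVr difference above rather than re-derived here:

```python
# Slappy's reach-on-error bonus over a full season, using the figures above.
outs_in_play = 400            # PA where Slappy made an "out" on a ball in play
error_rate = 1800 / 94118     # league-wide ROE rate from before
extra_on_base = outs_in_play * error_rate
print(round(extra_on_base, 1))   # -> 7.6, i.e. "nearly eight" extra times on base

runs_added = 7.0                 # the MLVr difference (.0511 vs .0000), taken as given
print(runs_added / outs_in_play) # -> 0.0175 runs per strikeout
```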

Adding baserunners makes things a little more complicated. The most obvious situation where a strikeout takes away from the offense is a man on third with fewer than two outs, but players also have a chance to advance from other bases on balls in play when the batter fails to reach base.

In 2004, there were just over 25,000 plate appearances in which the batter came to the plate with fewer than two outs and men on and did not strike out and did not reach base. In those instances, baserunners accumulated 7,564 extra bases. Breaking those bases down by the baserunners’ locations, it’s possible to estimate how many extra runs those bases are worth.

Using the Run Expectation Table for 2004, a runner on first with one out resulted in an average of .5496 runs scored. If that runner advanced to second, the expectation goes up to .7104. The extra base was worth .1608 runs. Adding up all situations where a runner can advance from first to second and weighting them by the frequency of their occurrence, we can get a rough estimate of how much each extra base is worth. While some bases are clearly worth more than others depending on the situation, situational hitting is nearly impossible to divine from the data. In 2004, going from first to second was worth right around .250 runs, second to third worth just over a third of a run (.344), and third to home worth almost exactly a third of a run (.333). While it may seem that going from third to home should always be worth exactly one run, if the runner remained at third, there’s a good chance he would have scored later in the inning.
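To make the mechanics concrete, here’s the single delta from the paragraph above; the weighted-average per-base values are quoted from the text rather than re-derived, since that requires the full play-by-play data:

```python
# One extra-base value from the 2004 run expectancy figures cited above.
re_first_one_out = 0.5496    # expected runs: man on 1st, one out
re_second_one_out = 0.7104   # expected runs: man on 2nd, one out
print(round(re_second_one_out - re_first_one_out, 4))  # -> 0.1608 runs

# Weighting every such delta by how often it occurs yields the rough
# per-base values used here: 1st->2nd ~.250, 2nd->3rd ~.344, 3rd->home ~.333.
```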

Given those approximate values for bases, we can then go back to the number of times runners advanced on outs in play. Of those 7,564 extra bases gained, 1,178 were from first, 4,158 were from second, and 2,228 were from third. Multiplying these bases by the value of the extra base, approximately 2,466 runs were added over those 25,000+ plate appearances. That comes out to 0.0978 runs per PA. Baserunners occasionally advance on a strikeout, adding 0.0033 runs per PA, so that will be subtracted from the 0.0978, yielding 0.0945. Remember that those are only plate appearances where there are runners on and there are fewer than two outs. These PAs make up only about 25% of all outs in the field, so instead of nearly a tenth of a run, the value of putting the ball in play is reduced to 0.0257 per PA when looking at advancing baserunners.
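Checking that multiplication, with the base counts and values from above:

```python
# Extra bases gained on outs in play (2004), times the value of each base.
advances = {
    "first_to_second": (1178, 0.250),
    "second_to_third": (4158, 0.344),
    "third_to_home":   (2228, 0.333),
}
total_runs = sum(count * value for count, value in advances.values())
print(round(total_runs))  # -> roughly the 2,466 runs quoted above

net_per_pa = 0.0978 - 0.0033  # knock off the runs gained by advancing on strikeouts
print(round(net_per_pa, 4))   # -> 0.0945
```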

Thus far we’ve added 0.0175 runs per PA for outs in play because of errors in the field to 0.0257 runs per PA for outs in play due to advancing runners. Initially, it would appear that each strikeout costs the team 0.0432 runs over an out in the field. However, there’s nothing like a nice rally-killing GIDP to make fans wish their favorite player would have just struck out instead. Looking at the Double Play Rate for Batters, 12.6% of double play opportunities are converted into a twin killing. Using a similar process as with the baserunners above, the cost of the extra out and the erased baserunner works out to about 0.0136 runs per PA.

Breaking out the old TI-85: 0.0175 + 0.0257 - 0.0136 = 0.0296. On a very rough scale, a strikeout costs a team about three one-hundredths of a run. Looking at team totals from 2004, Reds batters led the league in strikeouts with 1,335 while the Giants trailed with 874, a difference of 461 whiffs. All those failures at the plate cost the Reds an estimated 13.6 runs over the course of the season, or just over one win. Among individual batters with at least 600 plate appearances, **Adam Dunn** led the league with a well-publicized 195 strikeouts while **Juan Pierre** trailed with a mere 35. That 160-strikeout gap–the most extreme case in the majors–adds up to a difference of just 4.7 runs.
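And for the calculator-averse, the TI-85 work one more time, end to end:

```python
# Final per-strikeout cost and the most extreme 2004 gaps.
per_strikeout = 0.0175 + 0.0257 - 0.0136  # errors + advancement - GIDP
print(round(per_strikeout, 4))            # -> 0.0296 runs per strikeout

reds_vs_giants = (1335 - 874) * per_strikeout
print(round(reds_vs_giants, 1))           # -> 13.6 runs, just over one win

dunn_vs_pierre = (195 - 35) * per_strikeout
print(round(dunn_vs_pierre, 1))           # -> 4.7 runs
```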

Strikeouts do have a marginal cost when it comes to offense. The problem with evaluating them is that the price is so marginal that even between the most extreme teams and players, the difference is negligible. While the Reds would certainly like to have those 13.6 runs they “lost” by striking out so many times, the strikeouts come part and parcel with the kind of players the Reds have, the kind of players that are case studies for the positive relationship between isolated power and strikeout rate. Everyone can agree that strikeouts are not ideal, but like *“Honey,”* they may make you cringe, but if you just focus on the other positives, the marginal costs become just that: marginal.