
As long as I’m writing about baseball, there are eight articles you can assume I’ll post each year:

We’re all excited to see baseball, real baseball, return after the five-month, post-World Series hiatus. It’s fun to watch how players and teams have changed in the offseason. Baseball on television or radio assumes its role as the soothing white noise that portends the long, warm, sunny days of summer.

And then a bunch of sportswriters and broadcasters go and ruin it, with breathless “What Have We Learned So Far?” features that’ll start, oh, around Friday or Saturday.

As I wrote last year, April is the least predictive month of the season. July, which is shortened by the All-Star break? Players’ performances in July are better correlated to their full-year results than their records in April. September, when rosters expand and some players are bearing down for the stretch drive while others are clearly playing for next year? Better correlated than April.

This isn’t to say April baseball is irrelevant. It isn’t. Each team plays a couple dozen or so games in April, and those games will count. It’s just that they’re less representative of how things will go for the full season than games in other months. Plus, there are predictable differences between April baseball and May-September baseball.

But that won’t stop the What-Have-We-Learned-So-Far crowd. So, in order to inoculate you against April hype, I’ve compiled a list of April narratives that are probably false.

I looked at team and league figures for both April and the full season for every year from 1996 to 2016. I chose those years because they encompass the entire interleague play (beginning in 1997) and 30-team (beginning in 1998) eras, and they avoid the 1994 and 1995 strike-shortened seasons. Every team played between 19 (1999 Rockies, 2009 A’s, 2015 White Sox) and 29 (2003 Marlins, 2013 Mariners, eight teams in 2008, and four teams in 2014) April games during those seasons.

Then I calculated how various metrics (e.g., batting average) compared in April to the rest of the season, for both leagues and for clubs. Here are the most durable April trends—examples of how play in April is consistently at variance with the season as a whole—that are most likely to be misread as season-long storylines.
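The comparison described above amounts to computing each rate stat twice per team season (April only, then full year), then tallying how often April comes out higher and how big the average gap is. A minimal sketch of that bookkeeping, using walk rate (BB/PA) and entirely invented team names and numbers for illustration:

```python
# Sketch of the April-vs-full-season comparison described in the article.
# The teams and counts below are made up for illustration; the metric shown
# is walk rate (BB / PA), but any rate stat works the same way.

def rate(bb, pa):
    """Walks per plate appearance."""
    return bb / pa

# (april_bb, april_pa, full_bb, full_pa) for each hypothetical team season
team_seasons = {
    "Team A": (95, 1050, 550, 6200),
    "Team B": (80, 1000, 560, 6150),
    "Team C": (88, 980, 530, 6100),
}

higher_in_april = 0
diffs = []
for team, (abb, apa, fbb, fpa) in team_seasons.items():
    april, full = rate(abb, apa), rate(fbb, fpa)
    diffs.append(april - full)
    if april > full:
        higher_in_april += 1

share = higher_in_april / len(team_seasons)
avg_diff = 100 * sum(diffs) / len(diffs)  # in percentage points
print(f"{higher_in_april}/{len(team_seasons)} teams walked more often in April")
print(f"average April-minus-full-season gap: {avg_diff:+.2f} percentage points")
```

Repeating that loop once per metric, and once more with league-wide totals in place of team totals, reproduces the season counts and team-season percentages quoted below.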

Pitchers can’t find the strike zone! (Alternative: The strike zone is changing!) Over the past 21 years, the walk rate (BB/PA) has been, on average, 0.5 percentage points higher in April than during the year as a whole, and it’s been higher in April in every single season since 1996. Over two-thirds of teams (422 of the 626 team seasons since 1996) posted a higher walk rate in April than they did over the full season. Call it pitcher rustiness, call it umpire rustiness, just don’t call it a trend.

Defense is getting worse! You hear complaints that players get rushed to the majors before they learn how to field their positions. Yeah right, that’s why 2016 was the 16th straight season with fewer than 0.7 errors per game, a level that had never been breached in baseball history. But errors become more common in April. Last year, for example, 0.94 batters reached on error per 100 plate appearances in April, compared to 0.89 for the full year. There have been only two seasons of the past 21 in which fewer batters reached on error in April than over the full course of the season, and nearly three-fifths of teams allowed more baserunners via error in April.

There’s less offense! Let me run you through the numbers here. There are a few.

  • April batting averages are, on average, three points lower than full-year batting averages. The April figure has trailed the full-year figure in 18 of 21 seasons and 60 percent of team seasons.
  • April slugging percentages are lower too: Four points lower than full-year, lower in 15 of 21 seasons and 56 percent of teams.
  • Home runs are scarcer. There’s been a home run about every 33 at-bats over the past 21 seasons, but the April rate has been almost a full at-bat higher. Home runs have been less frequent in 15 of the past 21 Aprils, and for 57 percent of teams.

There’s more offense! Even though there are fewer hits and less power (ISO is lower, too) in April, scoring is actually up. No, I wasn’t expecting that, either. The difference is small—about 0.03 runs per game—but still, there were more runs scored per game in April than in the full season in 15 of the past 21 years and for 53 percent of teams. Those extra walks and errors are apparently more damaging than one would have thought.

Batters take control of the plate! As you probably know, there has been a new record for strikeouts per team per game in 11 straight seasons. In April, we may hear about a respite. Don’t buy it. In 14 of the past 21 years, the strikeout rate in April has been lower (if just by a little, an average of 0.14 percentage points of plate appearances) than for the full season, as has been the case for 54 percent of teams.

Starting pitchers are even wimpier! The difference isn’t large—an average of 0.05 innings per start—but starting pitchers departed games earlier in April than they did over the full year, and this was true in all but five of the past 21 seasons.

Baseball is still dying! Every April, somebody bad at math is going to point out that average attendance during the month is below the prior year’s average, proving that baseball is too old, too slow, too boring, and too unappealing to millennials with their heads buried in their smartphones. Whatever. I don’t have numbers handy, but here are two salient facts about April baseball: It’s cold and kids are still in school. End of story.

So enjoy April baseball, starting with a whole bunch of games today. Be careful about drawing any conclusions from it, though. Especially the ones listed here.

dprestonsr
4/03
Great article. Respect the research and the "obvious" conclusions from it; but may I remind you of Simpson's Paradox. Let me focus on one seeming inconsistency: runs per game. If home runs are down, wouldn't fewer RPG naturally follow? Perversely, if RPG is up, wouldn't this necessitate fewer innings pitched by SPs? I don't think these conclusions are circular, but maybe a tinge? (I know you were showing the data, as is, without a consideration for any common relatedness or correlation.) Thanks.
mainsr
4/03
Thanks. Yes, the HR/G and R/G difference really had me puzzled. A few observations: 1. The difference in scoring is pretty minuscule. 2. So is the difference in IP/GS, plus I'm not convinced that difference is due to ineffectiveness as much as it is early-season caution by managers. 3. The difference in BB/PA is pretty large, and I think that accounts for the scoring difference. Simpson's Paradox. New to me. Thanks for pointing that out.
MartnAR
4/05
About the starting pitchers being wimpier in April, I've always wanted to check this. I think the reason SPs throw fewer innings or are more prone to being pulled from the game is that they're still stretching out and getting a feel for their pitches. If so, then Spring Training is not enough for them to get acclimated. I need to check this but it could be an interesting idea.
mainsr
4/13
Sorry, Martin, I thought I'd replied to this...Yes, I agree that short outings are because pitchers and their coaches and managers are still trying to get comfortable.