June 6, 2017
Fire Up the Time Machine
The Effectively Wild Facebook group can be a fun place. Last week, member Adam Dyck posed a fun question: “How far back in time would you have to send a team of modern-day replacement-level players before they would be the greatest baseball team on the planet?” For those who aren’t regular listeners to the show, it’s a quintessential example of the sort of question that gets bandied about. It’s one part baseball, one part science-fiction script idea. It’s unanswerable, but it hints at a larger question that’s worth discussing: How has baseball changed over its history?
In the field of psychometrics, there’s a concept known as “the Flynn Effect,” named after researcher James Flynn, who noted that over several decades, scores on standardized intelligence tests have crept slowly upward. It’s not entirely clear why this is. It’s possible that humans are getting smarter (if you believe that intelligence tests actually measure smarts). It’s also possible that humans have gotten better about designing intelligence tests that flatter themselves more thoroughly. (Comedian Emo Philips is noted for remarking: “I used to think that the brain was the most wonderful organ in my body. Then I realized who was telling me this.”)
It’s not unreasonable to think that humans are better off in some ways than they once were. Life expectancy has increased. Technology has improved life at least in some ways. (You are reading this on a screen and it was magically delivered to you via a set of wires.) Perhaps the pitchers and hitters of today, when compared to each other, aren’t any better or worse than pitchers and hitters of the past, but what would happen if we introduced a time machine into the equation? If a team of hitters from the present day were sent to 1927, would they keep Babe Ruth company at the top of the home run leaderboard? Would they blow past him?
I somehow doubt that a team of players who are replacement level today—a team that, depending on who you talk to, would win somewhere around 40 or 50 games out of 162—could raise that projection by the 50 or so games it would take to be the undisputed “best team in baseball,” but I would frame the question in a slightly different way. How far back might we need to send a team of average players before they became the team that everyone feared?
Warning! Gory Mathematical Details Ahead!
This question has a bit of a hitch in it, and it has nothing to do with temporal paradoxes. It’s widely known that the past few years have seen a spike in home run rate. Is this spike the result of batters getting better? Is it that pitchers are getting worse? Is there some unseen other cause? The root cause is going to be important for this question. For example, if the cause of the spike was that all teams moved into stadiums that had fences 250 feet away, then transporting our players back into an era where they had real fences wouldn’t make them better home run hitters in that era.
But more than that, baseball’s double-entry accounting system makes this sort of cross-era comparison difficult. A home run hit by the batter is a home run given up by the pitcher. We could say that our time-traveling hitters would hit more home runs in the past, but would the pitching staff arriving from the bullpen in a DeLorean be more likely to give up a bomb? We’re going to have to tread very carefully.
So let’s set up some parameters for answering the question. We assume that we’re taking an average team back to the past. They will set down in whatever year we program the time machine for and quietly infiltrate and replace one of the already existing teams in the league. No one will notice or think that it’s weird. In other words, this is the baseball video game model where for some reason the 2017 Yankees have replaced the 1927 Yankees. Or vice versa.
The key question we really need to answer is this: what would they be allowed to take with them? This question is often asked in reverse: “how would Babe Ruth perform if magically transported from 1927 into 2017?” versus “how would Babe Ruth perform if he had been born in 1982, benefited from all of modern medicine and science as he was growing up, and then played in 2017 as a 32-year-old (as he was in 1927)?” It’s a way of asking how much modern science and medicine really matter. It’s an unanswerable question, which makes it interesting to discuss.
They Can Take Their Physical Bodies with Them
What if our time-traveling team could only take their physical bodies with them? If we set them down in 1977, they would believe that they had always lived in the world of disco and that Jimmy Carter had always been president and that wearing polyester uniforms was a good idea. One thing our time travelers would notice is that they were much bigger than most of the other players in the league.
This is a graph of the median (50th percentile), 70th percentile, and 90th percentile Body-Mass Index (BMI) of all players who appeared in the majors from 1900-2016, based on their “listed” height and weight in the Lahman database. This assumes that players kept the same weight through their careers and that teams were telling the truth, which are both bogus assumptions, but the aggregate should wash away a lot of those sins.
We see that in the mid-90s, something (*cough*something*cough*) happened that caused an inflection point in MLB. After most of a century of the same body types, players started getting bigger. Mostly, they got heavier, although players today are also taller than they had been. The median player in MLB right now would be larger (in terms of BMI) than 90 percent of players who played in any year before the 1990s. To put that another way, while there were certainly big guys playing in the 1960s and 1970s, half of our temporally transported team’s roster would be as big as or bigger than the biggest guys on any of the other teams.
That size came with a cost. Here’s a graph over time from 1950-2016 of triples / (doubles + triples), again league wide.
Here’s another graph of the rate at which baserunners took an “extra” base on a hit, such as going from first to third on a single, or first to home on a double.
As time has gone along, baseball players have gotten slower. (Or is it that they’ve gotten less aggressive? Or the parks have gotten smaller? Or outfield arms are just more dangerous now?) There’s also evidence that players have become poorer defenders over the years, potentially because teams have prioritized size and power over mobility. Here’s a graph of league-wide BABIP since 1950:
There’s that mid-90s inflection point again. Prior to 1992, BABIP league-wide had been consistently in the high .280s. In 1992, it was .285. In 1993, it jumped to .294, and then again in 1994 to .300, and it has been stuck around .300 ever since. (No, it is not a Coors Field effect. I checked.) So, it seems that the mid-90s, which some have referred to by another name (*cough*something*cough*), ushered in a radically different era in baseball. Players are less mobile than they used to be. But, my oh my, can they hit the ball further now.
Here’s a graph, again since 1950, of home runs per fly ball*. I put an asterisk next to fly balls, because it’s not really fly balls. We don’t have batted ball-type data back into the 50s, so I improvised. That’s actually HR / (HR + 2B + 3B + air outs caught by outfielders).
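For anyone who wants to replicate it, the improvised proxy is simple to compute. Here’s a minimal sketch; the function name and the sample totals below are mine and purely illustrative, not real league data:

```python
def hr_per_air_ball(hr, doubles, triples, of_air_outs):
    """Improvised HR-per-'fly ball' proxy: HR / (HR + 2B + 3B + OF air outs).

    Batted-ball classifications don't exist back to the 1950s, so the
    denominator stands in for 'balls hit in the air.'
    """
    denominator = hr + doubles + triples + of_air_outs
    return hr / denominator if denominator else 0.0

# Purely illustrative league-season totals (not real data):
rate = hr_per_air_ball(hr=5000, doubles=8000, triples=900, of_air_outs=36000)
print(round(rate, 3))  # → 0.1
```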
We can see that there was yet another inflection point in the 1990s. I ran a similar set of analyses for doubles and triples [ (2B + 3B) / (HR + 2B + 3B + OF air outs) ] and got a very similar graph (not shown). The bigger hitters that invaded the game in the 1990s ushered in an era of extra-base hits.
We know from modern research that while batters and pitchers combine to create a fly ball, once the ball is in the air, how hard it is hit and how far it goes and whether it goes over the fence is largely a function of the batter’s talent. So, we can credit the increase in bigger fly balls mostly to the batters and likely to their increased size.
In 2016, the average MLB team had 304 doubles and/or triples, had 462 opportunities for extra advancement on hits, and faced 4,107 balls in play that they needed to field. We’ve established that in these cases, the team of today is worse than teams of old. However, they do hit more home runs and extra-base hits on their fly balls. How many runs does that actually buy (or lose) them over what players used to be able to do?
(Note: I’m using some very quick-and-dirty eyeball estimates for relative (dis)advantage and some “good enough” thumbnail run values for these events.)
If a team of (smaller) players from yesteryear showed up in today’s environment, they would be better defenders, and they’d get some value from their better baserunning abilities, but they’d lose a lot of value from their relative lack of ability to hit for as much power as the current players, leading to a two- or three-win (again, I’m using some very slapdash math) advantage for our modern-day monsters.
But hang on a second. In the table above, we’re talking about the differences between a team of old-school players (smaller, but more mobile) versus today’s bigger players as if they were playing the same style of baseball played in 2016. Today, there simply aren’t as many balls hit into play. A modern team going backward in time would face off against a team that was more likely to put the ball into play, and therefore, our less mobile modern-day players and their somewhat questionable defensive abilities would be forced to make more plays in the field, and probably have more balls skip past them.
Let me re-populate that table, but use the rates from 1976.
They Can Take Modern Roster Construction with Them (and Velocity)
We know that prior to the late 1980s, the single-inning reliever wasn’t “a thing.” Up until 1992, the plurality of relief appearances lasted more than one inning. Once teams realized that the one-inning model worked and could cover most of their bullpen needs, there was a shift toward velocity in the game, especially in the bullpen. If a pitcher only needed to get three outs, he could air it out during that inning. More to the point, if there was a pitcher kicking around who had electric stuff that would short out after an inning or so, he finally had someone offering him a job.
Velocity data is only available back to 2002, but pitchers now throw about four mph faster than they did 15 years ago. There were guys who threw 95 back in the 1980s, and they were a rare find. Now, each team has a few. It’s interesting to wonder whether the guys who could hit 95 were always there, but since there was no job description for a guy who could only throw one inning at a time, they were simply passed over.
Our time-traveling team would be going back to a time in which they wouldn’t face quite as much velocity as they do now. Their opponents would constantly be facing what they had previously known as “the hardest-throwing guy in the league.” It’s hard to model exactly what would happen if hitters from the 1960s or 1970s were constantly facing off against modern-day velocity. We can only guess that their performance would suffer, but what would happen to our present-day hitters as they entered a world where a 92 mph fastball was something worth noting?
I took all fastballs from 2016 and binned them into one-mph segments. Here’s how often players (in 2016) made contact when they swung.
Dropping four miles per hour, at least on fastballs, means that hitters would have a four or five percentage point greater chance of making contact with the ball. It’s worth noting that over the past 15 years, league-wide contact rate and league-wide strikeout rate have correlated at -.71 (more contact leads to fewer strikeouts for obvious reasons), and every additional percentage point of contact rate lowers the strikeout rate by 1.25 percentage points. If you multiply all of that together, it’s a five percentage point drop in the strikeout rate.
In 2016, hitters struck out 21.1 percent of the time, and a drop to 16.1 percent would bring things back to strikeout levels last seen in the mid-90s. Go back further and our time travelers would still strike out more than the average team, though not by as much. (This also assumes that velocity is the only thing driving contact and that contact is the only thing driving the strikeout rate.)
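The back-of-the-envelope chain above can be written out explicitly. This is only a sketch built from the article’s own figures; the one-point-of-contact-per-mph rate is my rough reading of the graph:

```python
# Rough chain: slower fastballs -> more contact -> fewer strikeouts.
CONTACT_POINTS_PER_MPH = 1.0       # pct points of contact gained per mph lost (rough estimate)
K_POINTS_PER_CONTACT_POINT = 1.25  # from the contact/strikeout relationship noted above

velocity_drop_mph = 4.0
contact_gain = velocity_drop_mph * CONTACT_POINTS_PER_MPH  # ~4 points of contact rate
k_rate_drop = contact_gain * K_POINTS_PER_CONTACT_POINT    # ~5 points of strikeout rate

k_rate_2016 = 21.1
print(round(k_rate_2016 - k_rate_drop, 1))  # → 16.1, mid-90s strikeout territory
```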
Hitters might make better contact off of lesser velocity, but does that mean getting more out of the balls that they do hit? Here’s slugging percentage on contact, again, paneled by velocity bins for 2016 hitters, fastballs only.
Well then ...
A drop of four mph in average fastball speed seems to be worth something like 40 points of slugging! Again, there may be other factors at work here, but as a point of comparison, the 2016 league slugging percentage was .417. A jump of even half of that (20 points) would put SLG level with the highest league slugging percentage ever recorded (.437 in 2000), when the average team scored in excess of five runs per game.
We are setting up a world where our modern-day players aren’t going to strike out as much as they do now, and are probably going to be able to hit the ball more effectively, and we haven’t yet accounted for what the effects of all that big-time velocity would be against the hitters of the 1960s and 1970s. Back then, competing against their own contemporaries, teams put up a mere 4.2 or 4.3 runs per game. Even assuming that ye olde offenses would see no ill effects of the bigger, harder-throwing modern-day pitchers, a team that was capable of putting up 5.0 runs per game playing against a bunch of teams that averaged 4.3 runs per game would have a Pythagorean record of 93-69.
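That 93-69 figure falls straight out of the Pythagorean expectation. A quick sketch using the classic exponent of 2:

```python
# Pythagorean expectation: a 5.0 R/G offense against a league allowing 4.3 R/G.
runs_scored_pg, runs_allowed_pg, games = 5.0, 4.3, 162

win_pct = runs_scored_pg**2 / (runs_scored_pg**2 + runs_allowed_pg**2)
wins = round(win_pct * games)
print(wins, games - wins)  # → 93 69
```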
Maybe the velocity of the modern game wouldn’t play out exactly that way after the time machine landed, but clearly it’s going to make a huge difference. This alone could turn an 81-win team into a surefire playoff team.
They Can Take Modern Medicine with Them?
This one is harder to do since we don’t have reliable historical injury or disabled list data. It seems like this one would be a (pardon the mixed metaphor) slam dunk for the present-day players, but proving it is harder. Life expectancy in general has risen in the United States over the past century, and I guess that’s as sure a sign of medical advancement as any. But does that filter down into baseball?
In the absence of disabled list data, I tried a couple of workarounds, including a trick I’ve used previously where I look for pitchers who go more than 21 days between major-league appearances. This is a very rough proxy for injury, because the reason for the time off might have been that a pitcher lost effectiveness and was sent to the minors for a few weeks. It might also just be strategic roster churn, but here’s a graph from 1950-2016 on how often those “mysterious disappearances” happened per team. (The spike in the middle is 1981 when there was a work stoppage in the middle of the season and everyone took a couple of months off.)
This graph is notable for the fact that it’s not pointing downward. That’s not necessarily evidence that there are now more injuries (it could just be more roster moves), and even if it shows more “injuries” it might be that modern medical technology just allows teams to see injuries that previously would have gone un-diagnosed.
I also looked to see whether modern medicine allowed players to cheat “death” in baseball. I looked for pitchers who were on the wrong side of 30, and had posted a two-win season. I then looked to see how many more years they had left until they had their final two-win season. Perhaps modern medicine might keep older players going longer? (I stopped the sample in 2006, because we don’t know what’s going to become of the “still pretty good” players over 30 from last year.)
We see that there’s plenty of noise in this one, but that the trend line is generally upward. If you shoot a line through that graph, the regression says that the line is significantly different from zero, so older players are able to stick around longer, but the effect size is about an extra tenth of a year every decade. Score a victory for modern medicine. A very small victory. (The graph for batters has the same form.)
There are probably some places where modern medicine is helping around the edges in baseball, but a torn rotator cuff is a torn rotator cuff, no matter what year it is. I think even if our time travelers are able to take all of their magical medical devices with them, it’s not the huge advantage that we might think it is.
They Can Take ... the World Baseball Classic with Them?
And now what I think is the most interesting graph in the bunch.
This is the percentage of players appearing in each season who were born outside of the United States. We see that it hovers below five percent until the 1950s, around the time when teams began to dismantle the (horrifically evil) color line that had kept African-Americans out of the game for so long. It grew until it plateaued in the late 80s/early 90s at about 15 percent. Then, in the early 1990s, something happened. Teams started making heavy investments in the international market and began exploring new markets (Cuba, Japan, Brazil). Now, nearly 30 percent of players were born somewhere other than the United States. The existence of the World Baseball Classic is a tribute to that global expansion.
Once again, we see an inflection point in the early 90s. How would this affect our time travelers? Well, let me frame the question in a different way. Suppose that today, in 2017, Major League Baseball were to pass a rule saying that no more than 15 percent of a team’s roster could be composed of players who were born outside the United States. The average team would have to cut roughly 15 percent of its roster, which is 3-4 players.
As insane a proposal as this would be, teams would do the only logical thing that they could, which would be to replace the four players they had to cast off with lesser guys from Triple-A or the waiver wire. In other words, (American) players who were functioning at or below replacement level. (If the players were better than replacement level, they’d probably all be in the majors.) Some teams would be more affected than others, but all of the teams would have less talent than they started with.
Here’s the thing: Going back in time, our average team from today would be entering a world where this rule has effectively been put into place. It’s not that in the early 90s there was some sort of agreement to keep players from outside the United States out of MLB, it’s that those investments hadn’t fully come to fruition yet. It’s possible (and likely) that had MLB invested earlier, we would have seen more players come from other countries earlier.
We don’t really have a way of knowing whether the marginal players that MLB attracted were stars or scrubs. MLB has long had at least some presence in a country like the Dominican Republic, and perhaps the guys whom people thought were future All-Stars made it, while the ones who were left behind might have been roster filler anyway. It’s possible that the investments that teams made drew in players who had talent, but who previously had no way to get in front of MLB scouts.
On average, one roster spot produces 1.3 wins. If we assume that the players who have to be let go were selected at random by the league office, each team would lose something like five wins worth of value (again, on average). If we assume that teams could make their own cuts, they would pick the least talented players and the damage might only be a win or two of value. Because teams of today get to take advantage of those global investments of yesteryear, a team going back in time would find an inferior product on the field, simply because the league of that era hadn’t yet scoured (and scouted) the earth to find the very best players in the world.
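As a sanity check on those numbers, here’s the arithmetic; the 25-man roster size is my assumption, and the 1.3 wins per roster spot is the article’s average value:

```python
AVG_WINS_PER_ROSTER_SPOT = 1.3  # article's average value of one roster spot
roster_size = 25                # assumed 25-man active roster

cuts = round(roster_size * 0.15)                   # ~15 percent of the roster
random_cut_loss = cuts * AVG_WINS_PER_ROSTER_SPOT  # if the league picked the cuts at random

print(cuts, round(random_cut_loss, 1))  # → 4 5.2
```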
It’s possible that some of the effects that we saw above can also be tied to this global expansion. Perhaps velocity is up specifically because teams now have access to further reaches of the globe; with a bigger pool of people to draw from, they are simply able to find more humans who are capable of throwing a ball 95 mph, or who are big enough to hit home runs yet athletic enough to outhit whatever mistakes they make in the field.
How Far Back?
If there’s a theme that kept coming up over and over again, it’s that baseball changed (for better or worse) starting around the early 1990s. And it changed a lot, at a pace that dwarfed most of the changes of the century before it. The game now is a different one than it was.
There was the obvious (*cough*something*cough*) issue in the 1990s, and that gets all the headlines, but even if we assume that the entirety of the growth in player size was attributable to the growth in the use of steroids, the effects weren’t as large as we might think. Players got bigger and hit more home runs, but they also got slower. It looks like it was a net positive, but there was good and bad to it. What was interesting to me is that the numbers suggest that the global expansion of baseball has done more to affect the game than the expanding biceps of the players.
The revolution that few talk about is the velocity surge. It seems that baseball has been stuck in a quiet battle in which pitchers throw ever faster and the hitters that come into the league are probably being selected for the ability to hit that velocity or for an approach that works against velocity. It seems that once teams got past the idea that one pitcher must endure all nine innings, or at least must go multiple innings, it freed them up to pursue hard stuff.
Pitchers and batters have largely kept pace with each other (with some fits and starts here and there), but a look backward shows how far both sides have come. While they may have played each other to a stalemate, imagining these new hitters being unleashed on a major-league environment that was not ready for it is terrifying.
What’s pretty clear though is that a present-day team of average players wouldn’t have to travel more than a couple of decades into the past before they were the dominant force in Major League Baseball.