
Sixty years ago, America was at war.

That one was very different, and one of those differences was the way baseball reacted. This time around, no one from the major leagues has taken any part in the fighting, and certainly no one will now that it’s winding down. It is unlikely that anyone from the minor leagues will take any part either (if any minor league players in the Guard have been called up, I haven’t been able to find any mention of it).

There are a number of ways to look at how much difference the military service of ballplayers made to the quality of the leagues at a given time. One of the simpler ways is to compare the aggregate statistics of players coming into the major leagues with the aggregate statistics of the players going out. For instance, this line…


                Leaving         Entering
                PA      EQA     PA      EQA
AL 1935-36      3687    .225    4852    .240

…shows that the players who were in the American League in 1935, but not in the major leagues at all in 1936, combined for 3687 plate appearances and hit for a .225 EQA, adjusted for park and league. Big surprise–the players who are getting replaced are below-average hitters (an average major leaguer, for any given season, has an EQA of .260). The players who were in the AL in 1936, but did not play in the majors in 1935, combined for 4852 plate appearances and a .240 EQA. The newcomers played more often and hit better than the ones they replaced.
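
For those who want to replicate this, here is a minimal sketch in Python of how such a comparison can be run, assuming per-player, per-season records with park- and league-adjusted EQA already computed. The record layout, toy numbers, and function names are mine, for illustration; this is not the actual code behind these tables.

    # Compare players who left a league after year1 with those who entered
    # in year2. Records are (player_id, year, PA, EQA) and should cover all
    # of the majors, so "not in the majors at all" can be detected.

    def cohort(records, year, absent_year):
        """Total PA and PA-weighted EQA of players who appeared in `year`
        but nowhere in the majors in `absent_year`."""
        present = {pid for pid, yr, _, _ in records if yr == absent_year}
        group = [r for r in records if r[1] == year and r[0] not in present]
        total_pa = sum(pa for _, _, pa, _ in group)
        avg_eqa = sum(pa * eqa for _, _, pa, eqa in group) / total_pa
        return total_pa, avg_eqa

    # Toy data (values made up): player C replaces player B.
    records = [
        ("A", 1935, 600, .280), ("A", 1936, 620, .275),   # stayed
        ("B", 1935, 350, .220),                           # left after 1935
        ("C", 1936, 450, .245),                           # entered in 1936
    ]
    print(cohort(records, 1935, 1936))   # leaving:  (350, 0.22)
    print(cohort(records, 1936, 1935))   # entering: (450, 0.245)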

That was a typical result prior to the war:


                Leaving         Entering
                PA      EQA     PA      EQA
AL 1935-36      3687    .225    4852    .240
AL 1936-37      4350    .231    5941    .241
AL 1937-38      3924    .225    5849    .245
AL 1938-39      3484    .226    7714    .254
AL 1939-40      3475    .225    3977    .236
AL 1940-41      1883    .225    5418    .243

Average         3467    .226    5625    .243

NL 1935-36      5313    .244    5701    .251
NL 1936-37      6607    .232    7520    .243
NL 1937-38      4304    .238    5168    .259
NL 1938-39      4747    .252    6988    .240
NL 1939-40      4265    .240    5613    .235
NL 1940-41      5005    .240    4860    .248

Average         5040    .241    5975    .245

The replacements in the AL always played more, and hit better, than the ones they replaced. The NL story is a little more mixed: there are a couple of years where the replacements were worse than the ones who left, and one (1940-41) where the newcomers actually got fewer PAs than the players they replaced. The NL in the pre-war years appears to have been more aggressive about replacing below-average players, as its de facto replacement level was about 15 points of EQA higher than the AL’s; that works out to about .021 runs per out. Consequently, more players were rotated out in the NL (since the bar to stay was higher, more players fell under it). The AL was losing about 7.3% of its plate appearances, and 5.3% of its total equivalent run production, in a typical year, while annual NL losses were 10.8% and 8.8%. Think of that as the baseline attrition level going into the war.
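
Where does that .021 come from? EQA is built to convert back into a run rate. A quick check, assuming the standard published conversion of runs per out = 5 * EQA^2.5 (my reading of the EQA definition, not a figure quoted in this article):

    # EQA is scaled so that runs per out is approximately 5 * EQA ** 2.5.
    def runs_per_out(eqa):
        return 5 * eqa ** 2.5

    # Fifteen points of EQA near the two leagues' replacement levels:
    print(runs_per_out(.241) - runs_per_out(.226))   # ~0.021 runs per out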

Pearl Harbor was bombed during the 1941-42 off-season. While a few players were called to service in 1941 in anticipation of a war (notably Hank Greenberg), there was no general call-up until after Pearl. Even then, it took time to build up the forces, and so many players played at least part of the 1942 season before getting called, which means they won’t show up in these stats, since a player had to be completely absent in the adjacent year to count. You can see some effect in 1942, though:


                Leaving         Entering
                PA      EQA     PA      EQA
AL 1941-42      8536    .259    6777    .246
NL 1941-42      7364    .246    6657    .250

The number of plate appearances lost between ’41 and ’42 is more than double the baseline attrition rate in the AL, and the quality of the players lost was basically league average; the AL lost between 17 and 18% of its PA and EQR from the year before. The replacements put up typical replacement numbers, perhaps playing a little more often than usual, but they were nowhere near as good as the ones they replaced. The story in the NL was less extreme. Losses were higher than in previous years, but only by about 50%, and the quality of the players lost was not appreciably different from normal. Their replacements actually did a little better.

The story changed by the time the season began in April 1943. Between April 1942 and April 1943 the U.S. Army alone (i.e., not counting the Navy or other service branches) increased in size from 2.7 million men to 6.8 million. More than 300,000 men joined the army every month from July ’42 to April ’43, mostly via the draft, with inductions peaking at over 500,000 in October alone (see http://carlisle-www.army.mil/cgi-bin/usamhi/DL/showdoc.pl?docnum=347, pages 57-58). The playing time lost between ’42 and ’43 reflects that buildup:


                Leaving         Entering
                PA      EQA     PA      EQA
AL 1942-43      15347   .266    12636   .256
NL 1942-43      12834   .261    10311   .248


The AL lost 33% of its plate appearances, and 34% of its equivalent runs, as the departing cohort was above average in hitting. Ted Williams and Joe DiMaggio were the two biggest stars to go, but they were far from alone. NL losses once again were not quite as severe, with “only” 28% of their PA and EQR departing. The difference between how well the departing regulars and their replacements hit is deceiving, because in each case we are measuring against a league average of .260, even though the 1943 leagues had taken a huge qualitative hit. I’ll get back to this later, but keep in mind for now that a .260 in 1943 is probably not as good as a .260 from 1942.

The military continued to induct new personnel throughout 1944 and 1945, but at much reduced rates of only about 100,000 a month. The major leagues continued to see large talent drains from their already depleted base; each year the replacements were worse than those who left, and a feedback loop was driving the competitiveness of the leagues steadily downwards. AL losses continued to be of higher relative quality than NL losses, although NL quantities were larger:


                Leaving         Entering
                PA      EQA     PA      EQA
AL 1943-44      10846   .264    8373    .247
NL 1943-44      13794   .260    8791    .245

AL 1944-45      10429   .270    7601    .253
NL 1944-45      11019   .257    8895    .248


The lost plate appearances amounted to 23% and 22% in the AL, and 30% and 24% in the NL, with comparable losses in equivalent runs. Both leagues were also getting substantially older, as men in their 30s and 40s were more likely to be classified 4-F (unfit for military service).

The war ended in 1945, and most of the player-soldiers came back. They took back their jobs, relegated most of their replacements to entries in trivia contests, and dramatically brought the talent level back up:


                Leaving         Entering
                PA      EQA     PA      EQA
AL 1945-46      15426   .251    24328   .264
NL 1945-46      15222   .246    24527   .263


The plate appearances lost by each league exceeded the losses in any year during the war, but this time the replacements more than made up for them. The players coming into the majors in 1946, who had not played at all in 1945, actually outnumbered those who stayed on. Players from the 1945 AL made up only 47.1% of the league’s 1946 plate appearances; in the NL, the holdovers from 1945 had only 46.3% of the plate appearances. There are only three other times in baseball history when a major league failed to return 50% of its own PA from the previous year (not counting leagues that folded completely). After the 1878 season, the NL cut two of its six franchises, but then added four new ones for 1879, meaning that half of the teams were entirely new to the league. And in 1890, the establishment of the Players’ League decimated the rosters of both the NL and the AA, but that’s a story for another article.
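
The holdover share is straightforward to compute from the same kind of per-player records used earlier; a minimal sketch (again with my own record layout, not the code actually used here):

    # Fraction of a league's year2 PA taken by players who also appeared
    # in year1. Records are (player_id, year, PA, EQA) for one league.
    def holdover_share(records, year1, year2):
        veterans = {pid for pid, yr, _, _ in records if yr == year1}
        year2_pa = [(pid, pa) for pid, yr, pa, _ in records if yr == year2]
        total = sum(pa for _, pa in year2_pa)
        held = sum(pa for pid, pa in year2_pa if pid in veterans)
        return held / total   # e.g., ~0.471 for the AL, 1945 -> 1946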

Assessing how much the quality of the leagues changed during these transitions requires a different approach. The numbers above are meant as a broad overview of the situation: the migratory patterns, if you will, of major league players during the war. While it certainly appears that more talent was going out than coming in, to really measure that we need to look at the players who stayed, not the ones who left. We need to look at how each player’s productivity changed relative to the league average, an especially important point here, since the very nature of the game changed during the war. Wartime material restrictions meant that genuine rubber was unavailable for something as mundane as a baseball center (especially since Malaya, the primary supplier, was in Japanese hands), so balata was used instead; balata is a similar substance, in that it is made from a natural latex, but it is not nearly as elastic as rubber. League offensive rates dropped sharply in 1942, then came back a bit as ball manufacturers adjusted their techniques to try to put some liveliness back in the ball, while the stolen base was briefly re-established as a weapon.

We also need to account for a player making up different proportions of the total population. In the statistics above, it didn’t matter if a player only had one plate appearance in 1942 and 675 in 1943, but now it will. Each player’s statistics are going to be scaled so that they contribute an equal number of plate appearances to each year, and the scale will be the lesser of his plate appearances in year 1 and year 2. The hypothetical player above would have all of his 1943 statistics multiplied by 1/675. So the process is:

  • Identify every player who played in both the leagues you’re trying to measure.

  • Get their EQA (or other stat, as long as it is a rate statistic, adjusted for league and park effects) for each league.

  • Set the weight to be the lesser of the player’s PA in league 1 and league 2, so that his EQA is weighted equally in each total.

  • Sum the weighted EQAs of all the players in league 1 and league 2.

If the weighted average EQA in league 1 is higher than that of league 2, we assume that league 1 was easier than league 2.
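
Here is a minimal sketch of those four steps in Python, assuming each league is a mapping from player id to a (PA, EQA) pair; the data layout and names are mine, for illustration:

    # Matched-player comparison between two leagues (for example, the AL
    # of 1943 as league1 and the AL of 1944 as league2). Each league maps
    # player_id -> (PA, EQA), with EQA already park- and league-adjusted.
    def league_comparison(league1, league2):
        common = league1.keys() & league2.keys()   # step 1: played in both
        w_total = w_eqa1 = w_eqa2 = 0.0
        for pid in common:
            pa1, eqa1 = league1[pid]               # step 2: each league's EQA
            pa2, eqa2 = league2[pid]
            w = min(pa1, pa2)                      # step 3: lesser PA as weight
            w_total += w
            w_eqa1 += w * eqa1                     # step 4: weighted sums
            w_eqa2 += w * eqa2
        avg1 = w_eqa1 / w_total
        avg2 = w_eqa2 / w_total
        return avg1, avg2, avg1 / avg2   # ratio > 1: league2 was harder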

Here’s how this approach steps through the leagues a year at a time, from 1939-46:


              EQA1    EQA2   Ratio         EQA1    EQA2   Ratio
1939-40 AL   .2706   .2679   1.010   NL   .2696   .2667   1.011
1940-41 AL   .2676   .2684   0.997   NL   .2689   .2658   1.012
1941-42 AL   .2679   .2671   1.003   NL   .2681   .2660   1.008
1942-43 AL   .2626   .2654   0.989   NL   .2643   .2677   0.987
1943-44 AL   .2621   .2693   0.973   NL   .2666   .2723   0.979
1944-45 AL   .2616   .2642   0.990   NL   .2667   .2690   0.991
1945-46 AL   .2740   .2575   1.064   NL   .2732   .2638   1.036


“EQA1” refers to the first year, “EQA2” to the second year. If the ratio is greater than 1, it implies the league was getting harder; a ratio below 1 indicates the league got easier. The biggest drop takes place between 1943 and 1944, and a huge rebound in 1946 brings everything back. If you assume that both the American and National Leagues started 1939 with identical .260 EQAs, the following time-standardized EQAs are implied (shown to four digits with the decimal point dropped, each season obtained by multiplying the previous season’s value by the ratio above):


     1939    1940    1941    1942    1943    1944    1945    1946
AL   2600    2626    2618    2626    2597    2527    2502    2662
NL   2600    2629    2660    2681    2647    2591    2568    2660
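
The “multiplying in sequence” step is just a chain of the year-to-year ratios; a quick sketch that reproduces the AL row above (the NL works the same way):

    # Chain the AL ratios from the previous table, starting both leagues
    # at an assumed .260 EQA in 1939.
    al_ratios = [1.010, 0.997, 1.003, 0.989, 0.973, 0.990, 1.064]

    eqa, series = 0.260, [0.260]
    for r in al_ratios:
        eqa *= r
        series.append(eqa)

    print([round(x, 4) for x in series])
    # [0.26, 0.2626, 0.2618, 0.2626, 0.2597, 0.2527, 0.2502, 0.2662]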


Ideally, this would use multiple years to build the ratings. However, the direct one-year-to-the-next transition is almost always the largest single component of any such series, and it is a lot easier to explain and show; the expanded calculation would only change the details, not the conclusion. The players of 1944-45 were playing in a league that was 10 EQA points easier than what existed at the start of the war. Ten points of EQA, in terms of runs per out, is roughly 10%. So a player who was worth 100 runs at the start of the war, avoided the draft, and didn’t change otherwise would have been worth about 110 runs in 1944 or 1945, a full extra win. ERA is essentially runs per out, so 10% is also a reasonable estimate for how much the average pitcher would have benefited from playing in the quality-depleted environment, which again means the pitcher would rate as one win better over 200 innings of work. For comparison’s sake, the difference between Triple-A and the majors right now is around 30 EQA points, or about 30% in runs. So the depletion of WWII, while dramatic, did not come close to reducing the majors to Triple-A level.
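
The 10% figure can be checked with the same run-rate conversion used earlier (again, assuming runs per out = 5 * EQA^2.5):

    # Ten points of EQA around the league average, expressed in runs:
    def runs_per_out(eqa):
        return 5 * eqa ** 2.5

    print(runs_per_out(.260) / runs_per_out(.250) - 1)   # ~0.10, about 10%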

That’s something to consider when discussing player performances from the war years. Was Hal Newhouser’s 1945 season “tainted” because the top talent in the league was gone? Of course it was, but even with 10% off the top it remains a great season, as demonstrated in part by his no-excuses 1946. Snuffy Stirnweiss dropped from a .314 EQA in 1945 to .256 in 1946; is that really as good as he was? League quality can only explain about 15 of the 60 points. The rest would have to be attributable to other things, such as injuries, or skills that were remarkably well suited to take advantage of what was in many ways a return to dead-ball conditions. And so it goes for Bobby Doerr, and Tommy Holmes, and everyone else who had a big year during the war. There were real quality deficits which gave them an edge, but that doesn’t mean they didn’t have legitimately outstanding seasons just the same.

Thank you for reading

This is a free article. If you enjoyed it, consider subscribing to Baseball Prospectus. Subscriptions support ongoing public baseball research and analysis in an increasingly proprietary environment.
