I’ve been on a little bit of a park factor kick as of late.

Last week I checked whether groundball pitchers are less affected by park factors than flyball pitchers are, a theory premised on the idea that park factors are driven largely by outfield dimensions. This turned out not to be the case. Months before that was a little foray into park factors and baserunning attempt and success rates, checking whether home teams get some of their inherent advantage from knowing how the ball bounces in their yard better than their visiting opponents do. Again, the theory did not pan out.

I’m back for more. At the heart of what I was trying to do last time was the idea that park dimensions are a central part of park factors. But because I was trying to take something absolute–the outfield dimensions–and map it to something relative–the park factor–via groundball/flyball numbers, it occurred to me that there would be problems with that kind of analysis when dealing with multiple seasons. Even single seasons can be messy when the Mariners do annoying things like open a new park in the middle of July.

As some parks change and new ones open, park factors change. The Oakland Coliseum used to have a bit of a reputation as a pitchers’ park, but in the last few seasons it’s actually started to play as a slight hitters’ park. There are many possible reasons for the change: the construction of Mt. Davis (the monstrosity that replaced the outfield bleachers) when the Raiders moved back to town, earthquakes, general changes to wind patterns, and–perhaps most probable–the construction of other parks in the league that changed how the Coliseum is perceived in relation to them.

The big problem in finding some truth is that so many new parks have opened in the last 15 years that park factors haven’t had any chance to steady themselves. Each year, a new park or two changes the mix. So if we were trying to determine if the construction of Al Davis’ monument to litigation had an effect on the Coliseum’s park factor, we would first have to correct for all the other parks being built around the league during that time. Mapping any specific features to the park factor is made significantly more difficult because even those parks that don’t change have park factors that do.

This raises the questions: can we determine how the new parks are changing the overall league performance level? Are some of the offensive numbers we’ve seen lately a result of a series of hitter-friendly ballparks being introduced around the league?

I would wager that to some extent we can answer these questions by looking at how the park factors of established parks have changed as the new ones are introduced. The only problem is that so few parks have gone unchanged over the past 20 years that the control group is very small. Counting Mile High Stadium and Coors Field as one park, and disregarding the Expos’ foray down to Puerto Rico, there have been 19 new parks introduced in the last 16 seasons, not to mention the changes to the Oakland Coliseum and the massive renovations to Angel Stadium. To further complicate matters, Tampa Bay and Minnesota switched from AstroTurf to FieldTurf, Comerica Park and Kauffman Stadium have both moved their fences, and both Sox teams have renovated parts of their parks, either adding or removing seats. You can probably guess which team added and which removed.

Over the last 20 years, the only parks that have remained virtually unchanged are Shea Stadium, Yankee Stadium, Dodger Stadium (though they’re adding new seats this winter), Wrigley Field (discounting the addition of the lights) and Busch Stadium. Clearly there have been small changes to many of these parks, but nothing large enough to affect park factors that much.

To start, let’s look at how the park factors of these five stadiums have been affected by the introduction of new parks to the league. I’m using the same method for park factor calculation as in previous articles: the combined stats of a team and its opponents at home, divided by the combined stats of the team and its opponents on the road, both adjusted for playing time. Also, because four of the steady parks are in the NL, it’s going to be significantly easier to determine the effect of new parks in that league than in the AL. Finally, I’ll be using one-year park factors. Usually three- or five-year park factors are preferred because they are less affected by year-to-year variation in things like weather and luck; the larger sample brings them closer to the “true” park factor. With four steady parks in the NL, though, one-year factors aren’t so bad: the sample is effectively four times as large, and the immediate effects of newly opened parks should be easier to see.
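As a rough sketch of that calculation (the numbers below are made up for illustration, not actual team totals), a one-year run park factor might be computed like this:

```python
def run_park_factor(home_rs, home_ra, home_games, road_rs, road_ra, road_games):
    """One-year run park factor: combined scoring (team plus opponents)
    per game at home, divided by the same rate on the road. Dividing by
    games played is the playing-time adjustment."""
    home_rate = (home_rs + home_ra) / home_games
    road_rate = (road_rs + road_ra) / road_games
    return home_rate / road_rate

# Hypothetical season: 81 home and 81 road games.
pf = run_park_factor(400, 380, 81, 390, 410, 81)  # (780/81) / (800/81) = 0.975
```

The yearly averages that follow would then simply be the mean of these one-year factors across the steady parks in each league.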

Here are the average park factors for Wrigley Field, Dodger Stadium, Shea Stadium and Busch Stadium from 1985-2004 for runs:

The first new parks introduced in the NL after 1985 were Pro Player Stadium and Mile High Stadium, both in 1993. The group of steady parks hovered around 1.00 or a little higher until about 1993, then came crashing down to .95 and lower. Intuitively this makes sense: the introduction of an extreme hitters’ park in Denver would make other parks appear to favor pitchers more. There’s only one problem: that drop from 1.02 to .94 happened in 1992, the year before those parks were introduced. On the flip side, there was a similar downward adjustment in 1987 (from .99 to .94) with no apparent changes in parks, so it’s possible that the crash of 1992 was similar randomness. What cannot be ignored is that the park factors for Mile High Stadium were 1.46 and 1.28, and Pro Player’s were 1.02 and 1.17. Thus we can fairly safely say that the continued downward trend after 1993 was likely due to the new parks.

The next new NL park was Turner Field in 1997, carved from the Olympic track stadium in Atlanta. The next year, Bank One Ballpark joined the mix, followed by Minute Maid and SBC Park (née Enron and Pac Bell) in 2000. The replacement of Fulton County Stadium–long thought a hitters’ park–with the more neutral Turner Field is borne out by the numbers here, as the steady parks jumped from .90 to .97 upon its introduction in 1997. In 1998, they dropped back to .94 when the BOB opened. Then in 2000, after climbing back to .99, they dropped all the way to just under .90 with Enron and Pac Bell. Finally, at the end of the timeframe, they’ve climbed again as Great American Ball Park, PETCO Park and Citizens Bank Park have opened.

That’s 11 new parks in 12 seasons. Although the trend may appear to move downward in the chart, removing all seasons prior to 1992 (the year before the new parks began opening) yields a trendline that’s virtually flat. In the NL, the collective opening of all those new parks does not appear to have significantly affected the park factors of those that remain. With a few more years of data, the results would certainly be more conclusive, but it doesn’t look like all the new parks can be blamed for the offensive surge of the 1990s in the NL.
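The flat-trendline claim is easy to check with a simple least-squares slope. Here’s a minimal sketch, using illustrative yearly averages rather than the actual values behind the chart:

```python
def ols_slope(xs, ys):
    """Least-squares slope of ys on xs; a value near zero means no sustained trend."""
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

# Hypothetical average one-year factors for the steady NL parks, 1992-2004
# (invented for illustration, not the published data).
years = list(range(1992, 2005))
avg_pf = [0.94, 0.95, 0.93, 0.94, 0.90, 0.97, 0.94, 0.99,
          0.90, 0.92, 0.93, 0.95, 0.96]
slope = ols_slope(years, avg_pf)  # essentially zero: no trend
```

Volatile series like these can show a spurious slope when a high or low starting year is included, which is why dropping the pre-1992 seasons matters for the comparison.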

As mentioned, in the AL only Yankee Stadium had gone virtually unchanged since 1985, but Yankee Stadium’s one-year park factor alone would likely carry too much noise to draw any real conclusions. So I’ve added Fenway Park, Kauffman Stadium and the Metrodome, since all of them were mostly unchanged until 2004. Looking at those four parks yields the following:

Things look much the same here. There was a steady decline until about 2000, then a decent increase. The AL shows a crash in 2004, but that year can be thrown out since we’ve conceded that some of the parks we’re using physically changed that year. SkyDome, New Comiskey and Camden Yards don’t appear to have had much effect, but when Jacobs Field and The Ballpark in Arlington (now Ameriquest Field) were introduced in 1994, the steady parks showed a sharp decline. The factors stayed fairly even until about 2000, when Comerica opened; since then, they’ve been right back where they were in the late 1980s.

As a last pass, here are park factors for both sets again, but looking at home runs rather than runs:

The NL park factors–besides being all over the place–may be trending slightly upward as the newer parks have been introduced in 1993 and later. For the most part, there doesn’t seem to be much change one way or the other. The AL park factors, on the other hand, have shown a steady upward trend over the last 20 years despite dips in 1993 and 1999. If the steady parks’ home-run factors are rising without any physical changes, the parks they’re compared against must have become relatively less homer-friendly. Assuming that’s an indication of the effects of the other parks in the league, it’s actually more difficult to hit home runs in the AL now than it was 10-15 years ago.

There are a number of assumptions built into this analysis. Primarily, there’s the assumption that if all parks remained exactly the same, their park factors would not change, or would at least fluctuate within a narrow range. Additionally, by keeping each league’s group of steady parks separate, I’ve ignored the possible effects of interleague play on park factors. Most likely, this would make NL parks look slightly more pitcher-friendly, since games in AL parks include the DH instead of the pitcher batting. In fact, a quick check of the data shows that in 1997 the AL park factors dropped for both runs and home runs while the NL’s jumped significantly. This adjustment would have to be explored further before drawing any more substantial conclusions.

On the whole, these park factors appear to show an overall increase in run scoring in the NL: the steady parks’ factors have decreased while their home-run rate has held steady, if volatile, implying the rest of the league has become more run-friendly. By contrast, it appears harder to hit home runs in the AL, while run scoring has returned to late-1980s levels after a dip through most of the 1990s. While it would be nice to blame the new, smaller or irregular ballparks for the increase in scoring over the past 10 years, it’s difficult to find support for that in these numbers.