
We have all seen the merits Major League Baseball teams found in sabermetrics in its infancy. Michael Lewis's famous bestseller, Moneyball, showed readers how the Athletics used statistical analysis to win the American League West repeatedly despite a minuscule payroll, at a time when few teams were using sabermetrics. In recent years, however, many organizations have seen that competitive advantage shrink as more teams have caught on and begun to target the same players the A's were once able to acquire so easily.

Although sabermetrics has clearly helped teams like the Red Sox and Rays become dominant in recent years, plenty of sabermetrically inclined teams have been disappointments. Perhaps most famously, the Mariners were highly touted going into this season for using sabermetrics to assemble an excellent defensive team, but they have instead fallen deep into last place in the AL West. Dave Cameron of FanGraphs.com has endured much criticism for ranking the Mariners organization sixth in probability of winning a championship in the coming years heading into 2010. Amid that criticism, I was reminded of something Joe Sheehan wrote in his final Prospectus Today column in December:

As far as the Diamondbacks and Indians go, I’m open to the idea that I’m systematically overrating "good" organizations, as I seem to miss on those teams to the high side with some frequency. I’ve certainly been accused of bias regularly, and I think there’s a case to be made that I have to be more careful about falling in love with a GM, a front office or a particular team’s offseason, and take a skeptical eye with teams that, in my mind, have a certain progressive seal of approval.

That makes a good deal of sense at the qualitative-analysis level, but what if overrating “progressive” teams occurred at the quantitative level? What if even PECOTA was overrating sabermetric teams?

I set out to answer this by surveying current and former Baseball Prospectus staff members and interns, in an attempt to pin down exactly which teams lean sabermetric. I had my own guesses, but I left them out because I did not want to bias the test results. I sent out the following survey:

“I want to know how much you perceive different major-league teams to use sabermetrics. I’m going to list all 30 teams, and I want you to label each 1-4 for its sabermetric use over the last five years (2006-2010), to the best of your knowledge, where:

  • 1 = Does not use sabermetrics in decision making
  • 2 = Uses sabermetrics occasionally, but not as a regular part of their decision making process
  • 3 = Often uses sabermetrics to run their team
  • 4 = Employing sabermetrics is a regular part of decision-making for the team

If a team switched from being very non-saber to very saber-utilizing at some point between 2006 and 2010, just average out the results and split the difference. Please just mark a number 1, 2, 3, or 4 next to each team in this order.”

I got 13 responses, and although everyone seemed to have a slightly different definition of what it means to be sabermetric, the answers were still usable once standardized. To remove differences in scale from person to person, I rescaled each respondent's ratings to a common standard deviation and a mean of 2.5.
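That standardization step can be sketched like this (a minimal sketch with made-up ratings, not the actual survey responses):

```python
import numpy as np

# Made-up 1-4 ratings from two hypothetical respondents for five teams.
ratings = np.array([
    [4, 3, 2, 1, 2],  # respondent A: uses the whole scale
    [3, 3, 2, 2, 3],  # respondent B: clusters near the middle
], dtype=float)

# Rescale each respondent to the same spread and a mean of 2.5, so that
# voters who cluster their answers count as much as voters who spread them.
row_mean = ratings.mean(axis=1, keepdims=True)
row_sd = ratings.std(axis=1, keepdims=True)
common_sd = ratings.std(axis=1).mean()  # one shared spread for everyone
adjusted = (ratings - row_mean) / row_sd * common_sd + 2.5

# Average across respondents to get each team's consensus rating.
consensus = adjusted.mean(axis=0)
```

After this rescaling, every respondent's ballot has the same mean and spread, so no single voter's scoring habits dominate the averages.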

Then I averaged out everyone’s answers to get the following ranking of teams by their perceived sabermetric usage:

| Rank | Team | Saber Usage |
|------|------|-------------|
| 1 | Red Sox | 3.93 |
| 2 | Rays | 3.79 |
| 3 | Athletics | 3.70 |
| 4 | Indians | 3.57 |
| 5 | Mariners | 3.29 |
| 6 | Rangers | 3.20 |
| 7 | Padres | 3.14 |
| 8 | Yankees | 2.98 |
| 9 | Diamondbacks | 2.91 |
| 10 | Pirates | 2.81 |
| 11 | Cardinals | 2.52 |
| 12 | Blue Jays | 2.47 |
| 13 | Brewers | 2.37 |
| 14 | Angels | 2.32 |
| 15 | Nationals | 2.31 |
| 16 | White Sox | 2.25 |
| 17 | Rockies | 2.20 |
| 18 | Dodgers | 2.15 |
| 19 | Tigers | 2.13 |
| 20 | Cubs | 2.06 |
| 21 | Orioles | 2.06 |
| 22 | Mets | 1.98 |
| 23 | Twins | 1.98 |
| 24 | Braves | 1.95 |
| 25 | Phillies | 1.92 |
| 26 | Marlins | 1.91 |
| 27 | Reds | 1.88 |
| 28 | Giants | 1.77 |
| 29 | Astros | 1.72 |
| 30 | Royals | 1.70 |

Then I took each team's average winning percentage from 2006-2010 and ran a quick correlation against its sabermetric-usage rating. The correlation was a notable, if not exceptional, .10. Removing the Pirates, who only picked up sabermetrics when Neal Huntington took over as general manager in 2007 in an effort to pull the organization out of the cellar, bumps that correlation up to .15. This is despite sabermetric usage tilting toward low-payroll teams (a .18 correlation between payroll rank, from 1 to 30 with 1 being the highest payroll, and adjusted sabermetric usage). In fact, the amount by which teams exceed the wins predicted by payroll rank alone has a .27 correlation with sabermetric-usage rating.
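Those figures are plain Pearson correlations; the payroll-adjusted version can be sketched like this (made-up numbers for six hypothetical teams, not the article's data):

```python
import numpy as np

# Made-up values for six teams: payroll rank (1 = highest payroll),
# adjusted sabermetric-usage rating, and 2006-10 winning percentage.
payroll_rank = np.array([1, 4, 9, 15, 22, 28], dtype=float)
saber_usage  = np.array([3.0, 3.9, 3.8, 2.5, 3.1, 1.7])
win_pct      = np.array([.590, .560, .510, .495, .505, .420])

# Straight correlation between usage and winning percentage.
r_usage_wins = np.corrcoef(saber_usage, win_pct)[0, 1]

# Expected winning percentage from payroll rank alone (least-squares line),
# then the residual: how far each team beats its payroll-based expectation.
slope, intercept = np.polyfit(payroll_rank, win_pct, 1)
residual = win_pct - (slope * payroll_rank + intercept)

# How payroll-adjusted overperformance tracks sabermetric usage.
r_residual_usage = np.corrcoef(residual, saber_usage)[0, 1]
```

The residual step matters because payroll and sabermetric usage are entangled: correlating the payroll-adjusted overperformance, rather than raw wins, isolates what usage adds beyond spending.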

However, I also took the difference between each team's PECOTA-projected winning percentage and its actual record from 2006-10, a solid measure of how much PECOTA over- or underrates a team. This measure, too, correlated with sabermetric usage at .27. In other words, a meaningful fraction of PECOTA's error over the past five years can be explained by PECOTA systematically overrating teams that use sabermetrics.
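The over-projection measure itself is just projected wins minus actual wins, correlated against usage; here is a sketch using five teams' figures from this article (Indians, A's, Rays, Angels, Twins):

```python
import numpy as np

# Saber-usage rating, average PECOTA-projected wins, and actual wins
# for five teams, taken from the table in this article.
usage     = np.array([3.57, 3.70, 3.79, 2.32, 1.98])
projected = np.array([87.2, 84.0, 84.0, 82.6, 81.0])
actual    = np.array([77.2, 80.3, 81.5, 91.6, 88.4])

# Over-projection: positive when PECOTA expected more wins than materialized.
over_projection = projected - actual

# A positive correlation means the projection systematically flatters
# the saber-leaning teams in this (deliberately small) sample.
r = np.corrcoef(over_projection, usage)[0, 1]
```

On these five cherry-picked rows the correlation is strongly positive; across all 30 teams it comes out to the .27 quoted above.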

Consider the following table, combining the adjusted sabermetric-usage rating with the average payroll from 2006-10 (in millions of dollars), the average PECOTA-projected wins, and the average actual wins from 2006-10 (treating each team's final 2010 winning percentage as equal to its current winning percentage).

| Team | Saber Usage | Average 2006-2010 Payroll ($M) | PECOTA Projected Wins | Actual Wins* | Over-projection |
|------|-------------|-------------------------------|-----------------------|--------------|-----------------|
| Indians | 3.57 | 68.2 | 87.2 | 77.2 | 10.0 |
| Pirates | 2.81 | 45.8 | 72.4 | 63.5 | 8.9 |
| Diamondbacks | 2.91 | 69.5 | 84.4 | 76.5 | 7.9 |
| Orioles | 2.06 | 82.3 | 73.6 | 66.1 | 7.5 |
| Nationals | 2.31 | 60.3 | 71.8 | 66.2 | 5.6 |
| Athletics | 3.70 | 63.6 | 84.0 | 80.3 | 3.7 |
| Cubs | 2.06 | 127.3 | 87.4 | 84.0 | 3.4 |
| Mets | 1.98 | 130.3 | 88.0 | 85.0 | 3.0 |
| Brewers | 2.37 | 78.6 | 84.0 | 81.1 | 2.9 |
| Rays | 3.79 | 52.5 | 84.0 | 81.5 | 2.5 |
| Braves | 1.95 | 92.4 | 85.0 | 82.9 | 2.1 |
| Royals | 1.70 | 67.3 | 69.8 | 67.9 | 1.9 |
| Mariners | 3.29 | 103.8 | 76.8 | 75.1 | 1.7 |
| Red Sox | 3.93 | 150.3 | 93.0 | 92.7 | 0.3 |
| Dodgers | 2.15 | 117.2 | 86.0 | 86.4 | -0.4 |
| Tigers | 2.13 | 117.1 | 83.6 | 84.7 | -1.1 |
| Astros | 1.72 | 101.4 | 76.6 | 77.9 | -1.3 |
| Yankees | 2.98 | 214.9 | 94.6 | 96.5 | -1.9 |
| Giants | 1.77 | 95.2 | 77.2 | 79.2 | -2.0 |
| Reds | 1.88 | 72.4 | 77.0 | 79.5 | -2.5 |
| Phillies | 1.92 | 117.6 | 87.4 | 90.0 | -2.6 |
| Cardinals | 2.52 | 100.5 | 82.6 | 85.3 | -2.7 |
| Rockies | 2.20 | 72.6 | 78.4 | 83.3 | -4.9 |
| Padres | 3.14 | 58.7 | 77.2 | 82.1 | -4.9 |
| Rangers | 3.20 | 74.6 | 77.2 | 82.4 | -5.2 |
| Marlins | 1.91 | 35.2 | 74.4 | 80.3 | -5.9 |
| White Sox | 2.25 | 104.3 | 77.0 | 83.2 | -6.2 |
| Blue Jays | 2.47 | 82.5 | 76.4 | 83.1 | -6.7 |
| Twins | 1.98 | 74.5 | 81.0 | 88.4 | -7.4 |
| Angels | 2.32 | 114.2 | 82.6 | 91.6 | -9.0 |

*Actual Wins treat each team's final 2010 winning percentage as equal to its winning percentage as of August 29.

The correlation can be seen pretty clearly in the table. PECOTA routinely overrates the Indians and A's, perhaps the organizations best known for sabermetrics, as well as teams like the Diamondbacks and Pirates, who have certainly used sabermetrics in their decision making. On the other hand, PECOTA keeps selling the Angels and Twins short, franchises routinely criticized for not utilizing sabermetrics.

Certainly PECOTA is not expressing a preference for teams like the Indians; it has no idea which team Moneyball or Mind Game is about. PECOTA simply knows the data it receives. But teams that use sabermetrics heavily work from much the same data, so they are likely to favor the same players PECOTA favors, and therefore to overrate the same players PECOTA overrates.

More concerning for those sabermetric teams is that they are missing on the same players PECOTA misses on. In Moneyball, Billy Beane repeatedly threatened to fire all of his scouts, but he never actually did. Now one of the more sabermetrically knowledgeable general managers, the Blue Jays' Alex Anthopoulos, has recently expanded his pro scouting staff and doubled his amateur scouting staff, significantly increasing the money put into scouting in an attempt to gain ground in that area, much as the early-2000s A's gained ground by using sabermetrics to acquire neglected players with high OBPs. The Jays have decided to marry sabermetrics with scouting, using both as the foundation for building a winner in the AL East, the hardest division in baseball in which to do so.

This is exactly what teams need to do. It has become quite clear that being sabermetrically savvy does not guarantee a competitive team. Certainly the Rays would not be vying for the best record in baseball on one of the lowest payrolls without a staff of brilliant sabermetricians, but there are plenty of saber-utilizing teams nowhere near playoff contention. Teams need both sabermetrics and scouting to cover themselves on multiple fronts. Otherwise, a weak scouting staff will not catch the players the sabermetricians are overrating.

That is not to say sabermetrics is unnecessary. Sabermetric teams are still doing better overall than teams that rely on it less, but the market inefficiencies sabermetrics revealed a decade ago are no longer as large, and teams now need to approach roster building from multiple angles. Anthopoulos's Blue Jays may prove to be the next generation: teams that pair strong scouting with strong sabermetrics to build a winner. PECOTA projected the Jays to win only 72 games this year, a number they are close to eclipsing with a month remaining in the season. No one saw Jose Bautista hitting over 40 home runs; PECOTA projected just 18. Whether that was good luck, a strong scouting staff, a good sabermetric staff, or all of the above, finding the players the rest of the league is missing remains essential, and marrying scouting with sabermetrics is the best formula for getting the job done.