If you follow the link you will hear Dan Patrick – now of Sports Illustrated – interview George Karl (hat tip to Andy Feinstein of Denver Stiffs – formerly FireGeorgeKarl.com). In the course of this conversation Dan Patrick reiterated a common critique of the Denver Nuggets: Denver doesn’t play defense.
Denver allowed 107 points per game, a mark that ranked 29th in a 30-team league. So it’s easy to see why people think Denver has problems on defense.
Denver is a “Good” Defensive Team
At least, it would be easy to see if you didn’t know one basic fact about basketball. Some teams play at a fast pace while others take it slow. This one basic fact means that to properly evaluate a team’s offense and defense we need to consider points scored and allowed per possession. In other words, we have to consider offensive efficiency and defensive efficiency.
The Nuggets averaged 103.1 possessions per game, the fastest pace in the NBA. Given this quantity of possessions, the Nuggets had a 103.7 defensive efficiency [(107.0 / 103.1)*100]. This mark ranks 12th in the 30-team NBA. In sum, Denver was actually an above average defensive team in 2007-08.
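The pace adjustment above is simple enough to sketch in a few lines of Python. The per-game figures (107.0 points allowed, 103.1 possessions) come from the text; note that because those inputs are already rounded, the computed value can differ from the quoted 103.7 in the last decimal place.

```python
# Pace adjustment: convert points per game into points per 100 possessions.

def efficiency(points_per_game: float, possessions_per_game: float) -> float:
    """Points scored (or allowed) per 100 possessions."""
    return points_per_game / possessions_per_game * 100

# Denver's 2007-08 figures, as given in the text.
denver_def_eff = efficiency(107.0, 103.1)
print(denver_def_eff)
```

The same function works for offense: pass in points scored per game instead of points allowed.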
If you have read The Wages of Wins you have seen the argument that teams should be evaluated in terms of offensive and defensive efficiency. And such an argument is not unique to The Wages of Wins. It can be also found in the writings of John Hollinger and Dean Oliver. And I believe Oliver notes that this idea goes back decades. Yet apparently, it’s completely lost on Patrick.
Okay, Patrick is a sportswriter. Sometimes (as is often noted here), sportswriters get it wrong (see the vote for Kevin Durant for Rookie of the Year or Kobe Bryant for MVP). But what is truly amazing about the Patrick-Karl interview is that after Patrick asserts that Denver is a “bad” defensive team (where “bad” is defined as below average), Karl doesn’t disagree. One would think that Karl, who has to have heard that teams play at different speeds, would have quickly told Patrick that he was wrong. Once you adjust for pace, Denver is a “good” defensive team (where “good” is defined as above average). This, though, doesn’t happen. Is it possible that Karl doesn’t understand offensive and defensive efficiency?
Karl stated in the interview that Denver is beginning the process of preparing for next season. That process involves an evaluation of where this team is at, and what needs to be done to make it better. If you start such a process, though, with a distorted view of what is “good” and “bad”, it seems unlikely that you are going to end the process with an improved product.
Introducing the Distortion Score
To be fair to both Patrick and Karl, NBA observers do commonly refer to points scored and allowed, rather than offensive and defensive efficiency. And as has been noted in many places (including The Wages of Wins), this common practice does distort our view of a team.
How much of a distortion do the common metrics create? To answer this question, I ranked each NBA team according to the following metrics:
- Points Allowed
- Points Scored
- Defensive Efficiency
- Offensive Efficiency
I then calculated the difference (labeled Defensive Difference) between the team’s defensive efficiency rank and its points allowed rank. Likewise, I also calculated the difference (labeled Offensive Difference) between the team’s offensive efficiency rank and its points scored rank. With these differences in hand, I calculated the team’s Distortion Score. This is determined as follows:
Distortion Score = Absolute Value of Defensive Difference + Absolute Value of Offensive Difference
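The calculation can be sketched directly from the definition. A minimal illustration in Python, using Denver’s ranks from the text (29th in points allowed, 12th in defensive efficiency; 2nd in points scored, 11th in offensive efficiency); the function name is mine, not an established metric implementation:

```python
# Distortion Score: how far a team's per-game scoring ranks diverge
# from its pace-adjusted efficiency ranks.

def distortion_score(pts_allowed_rank: int, def_eff_rank: int,
                     pts_scored_rank: int, off_eff_rank: int) -> int:
    defensive_difference = def_eff_rank - pts_allowed_rank
    offensive_difference = off_eff_rank - pts_scored_rank
    return abs(defensive_difference) + abs(offensive_difference)

# Denver 2007-08: |12 - 29| + |11 - 2| = 17 + 9
print(distortion_score(29, 12, 2, 11))  # 26
```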
The results for 2007-08 are reported in Table One.
As Table One reports, the team with the largest Distortion Score is the Denver Nuggets. As noted, the team is ranked 29th in points allowed but 12th in defensive efficiency. So Denver’s defense is very underrated. On offense it has the opposite problem. In terms of points scored the Nuggets rank 2nd in the league. In terms of offensive efficiency, though, the team is only ranked 11th. Yes, Denver is above average offensively. Just not as far above average as its points scored per game would suggest.
When we add together 17 (the Defensive Difference) and 9 (the Offensive Difference), we see a Distortion Score of 26. And as noted, this leads the league.
Denver, though, is not the only team whose performance is distorted by points scored and points allowed. The Indiana Pacers ranked 26th in points allowed, but were 15th in defensive efficiency. On offense, the Pacers ranked 7th in scoring per game, but 18th in offensive efficiency. So the standard metrics tell us that the Pacers were very good on offense and very bad on defense. The pace-adjusted measures tell us that Indiana was actually better on defense than on offense.
Indiana was not the only team where the efficiency measures tell us the opposite story from what we see when we just look at scoring averages. Like Indiana, points scored and allowed tell us that Seattle and Charlotte were better on offense while the efficiency metric says the teams were stronger on defense.
For Portland, New Orleans, Dallas, Toronto, Washington, and Atlanta, we see the opposite story. These teams look like they are better on the defensive side of the ball. But when we look at team efficiency, we see that each of these teams is actually stronger on the offensive side of the ball.
Why This Matters
The purpose of the Distortion Score is to highlight how much the traditional measures of points per game distort our assessment of individual teams. And by that measure, the view of Denver is the most distorted.
Let me close by noting again why all this is important. If a decision-maker wishes to improve an organization, the decision-maker must first know what is wrong. If your metrics, though, distort your strengths and weaknesses, then they really don’t help much.
For Denver fans, let’s hope that George Karl understood the problem and simply didn’t feel like correcting Dan Patrick on his show. If Karl truly doesn’t grasp the difference, though, it looks like Denver’s efforts to improve this summer are off to a very bad start.
Our research on the NBA was summarized HERE.
Wins Produced, Win Score, and PAWSmin are also discussed in the following posts:
Finally, A Guide to Evaluating Models contains useful hints on how to interpret and evaluate statistical models.