
It matters where you start: adjusting Lightning Corsi ratings for zone starts

I’m continuing to look at the Lightning’s team and player metrics as we near the end of the season.

Occasionally, I get to surfing around the hockey stats areas of the web. The site of choice this week has been Driving Play, where just before the start of this season, they began looking at the effect of faceoff locations on Corsi ratings. And while you might not be surprised to discover that offensive zone faceoffs are good for your Corsi ratings (and defensive faceoffs are bad for them), the effect is actually enormous.

League-wide, there are between 15 and 20 times more events that get counted into Corsi (shots on goal, missed shots, and blocked shots) on shifts beginning with faceoffs than on other shifts. And, of course, shifts that begin with offensive zone faceoffs generate vastly more Corsi events for, while shifts that begin with defensive zone faceoffs generate vastly more Corsi events against. So players’ Corsi ratings are affected a great deal by how their coaches tend to use them.
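For anyone newer to the stat, here’s a minimal sketch in Python of how those events get tallied for and against over a shift. The shift data and event names are invented purely for illustration; they aren’t BTN’s actual data format.

# Corsi counts all shot attempts: shots on goal, missed shots, and blocked shots.
CORSI_EVENTS = {"shot_on_goal", "missed_shot", "blocked_shot"}

def corsi_for_shift(events_for, events_against):
    """Return (Corsi events for, Corsi events against) for one shift."""
    cf = sum(1 for e in events_for if e in CORSI_EVENTS)
    ca = sum(1 for e in events_against if e in CORSI_EVENTS)
    return cf, ca

# A shift starting with an offensive zone faceoff tends to pile up events for...
print(corsi_for_shift(["shot_on_goal", "missed_shot"], []))   # (2, 0)
# ...while a defensive zone start tends to pile up events against.
print(corsi_for_shift([], ["blocked_shot", "shot_on_goal"]))  # (0, 2)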

If we could adjust a player’s Corsi rating to account for how frequently they start their shifts in the offensive zone, we could get more information about how they are doing compared to some hypothetical “average player.” And that’s what the Zone Start Adjusted Corsi measure is intended to do.

Essentially, Driving Play did a bit of work that I’m not going to repeat here (you can read their articles here and here), and they came up with a formula that cleaves pretty closely to the actual distribution of this stat and uses data that’s readily available from behindthenet.ca.

Simplified Zone Start Adjusted Corsi =
Corsi/60 - [(Offensive Zone Start % - 50) * 0.18]

What this formula indicates is that for every percentage point above (or below) 50%, the Corsi of an average player on an average team will increase (or decrease) by 0.18. The authors didn’t indicate how they arrived at 0.18 Corsi events per percentage point, so I don’t have any way to judge the accuracy of that. I’m pretty sure I would have trouble understanding the explanation, anyway. So I’m trusting that this is a good number. If anyone knows of a better one–or can explain this one–please let me know.
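To make the arithmetic concrete, here’s a small sketch of the formula in Python. The example players and numbers are hypothetical, and the 0.18 coefficient is simply the one Driving Play arrived at, as discussed above.

def zone_adjusted_corsi(corsi_per_60, o_zone_start_pct, coeff=0.18):
    """Simplified Zone Start Adjusted Corsi: dock 0.18 for every
    percentage point of offensive zone starts above 50%, and credit
    0.18 for every point below 50%."""
    return corsi_per_60 - (o_zone_start_pct - 50.0) * coeff

# Hypothetical sheltered player: +5.0 Corsi/60 with 60% offensive zone starts
# loses 10 * 0.18 = 1.8, leaving roughly +3.2.
print(round(zone_adjusted_corsi(5.0, 60.0), 2))   # 3.2

# Hypothetical defensive specialist: -2.0 Corsi/60 with 40% starts
# gains 1.8, improving to roughly -0.2.
print(round(zone_adjusted_corsi(-2.0, 40.0), 2))  # -0.2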

I applied this formula to all of the Lightning players with 10 or more games this season, just to see if anyone’s performance stood out in either a good or bad way. Here’s the spreadsheet (in Google Docs). I’m not going to reproduce the whole thing in this post, but I do want to talk a bit about what I saw.

First, I ran this using both raw Corsi (CorsiOn in the parlance of BTN) and Relative Corsi (CorsiRel). [Note: I explained more about these two stats in February.] I did this partly because it wasn’t clear which of the two numbers Driving Play was using, and partly to see if there were any big discrepancies between the two statistics. If, for instance, a player moved up or down the list by a fair amount, and there’s no real explanation for the difference, then there might be a problem with the formula.

I ranked the players by each measure I was looking at, CorsiRel, CorsiOn, Zone Adjusted Corsi (ZACOn), and Zone Adjusted Relative Corsi (ZACRel), and took an average of those four rankings. In all of these measures, offensive players (those with higher zone start rates) should be clustered near the top and defensive players should be clustered near the bottom. If that doesn’t happen, there’s something wrong.
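For the curious, here’s roughly what that rank-and-average step looks like in Python. The players and values below are placeholders rather than the actual spreadsheet numbers, and rank 1 goes to the best (highest) value in each measure.

# Placeholder data; the real numbers live in the linked Google Docs spreadsheet.
players = {
    "Player A": {"CorsiRel": 10.2, "CorsiOn": 8.0,  "ZACRel": 7.5,  "ZACOn": 5.1},
    "Player B": {"CorsiRel": 1.3,  "CorsiOn": -2.4, "ZACRel": 3.0,  "ZACOn": 0.2},
    "Player C": {"CorsiRel": -5.6, "CorsiOn": -9.1, "ZACRel": -4.0, "ZACOn": -7.7},
}
measures = ["CorsiRel", "CorsiOn", "ZACRel", "ZACOn"]

ranks = {name: {} for name in players}
for m in measures:
    # Sort descending so the highest value in each measure gets rank 1.
    ordered = sorted(players, key=lambda p: players[p][m], reverse=True)
    for rank, name in enumerate(ordered, start=1):
        ranks[name][m] = rank

for name, player_ranks in ranks.items():
    avg_rank = sum(player_ranks.values()) / len(measures)
    print(name, player_ranks, "average rank:", avg_rank)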

In both Zone Adjusted Corsi (ZACOn) and Zone Adjusted Relative Corsi (ZACRel), Marc-Andre Bergeron (Avg Rank of 2.5) is one of the top Corsi players on the team. He has been near the top of the Corsi lists all year, despite having played in only one game since the middle of January. He is second on the team in CorsiRel, fifth in ZACOn, and third in ZACRel. In raw CorsiOn he trails only Pierre-Cedric Labrie (who has played only 14 games at an average of 5.9 minutes of ice time, a small sample). So far so good for our measure.

Most of the players tended to rank within a few spots on all of the lists. Bruno Gervais, for instance, ranked between 2 and 5 on all of the lists (average 3.75), and Adam Hall ranked between 19 and 22 on all of the lists (average 19.75). This is more or less what I expected. Gervais is used offensively, while Hall is used defensively.

In general, the more a player has played, the more consistent his rankings were, and the closer his average was to his rank in Zone Starts. Trevor Smith, for instance, hasn’t played much, and his numbers were all over the place:

Trevor Smith (Rank / Number)
CorsiRel: 12 / 0.9
CorsiOn: 22 / -16.1
ZACRel: 7 / 4.54
ZACOn: 21 / -12.46
Avg Rank: 15.5
O-Zone Start %: 29.8 (rank 23)
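As a quick sanity check, the adjusted figures in that table fall straight out of the formula when it’s applied to Smith’s unadjusted BTN numbers:

# Smith's zone-start adjustment: (29.8 - 50) * 0.18 = -3.636, so subtracting
# it adds roughly 3.64 to each of his unadjusted numbers.
adjustment = (29.8 - 50.0) * 0.18
print(round(0.9 - adjustment, 2))    # 4.54, matching his ZACRel
print(round(-16.1 - adjustment, 2))  # -12.46, matching his ZACOn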

The disparity between these numbers may be a function of the small amount of playing time he has had (16 games at 10.95 TOI/60). The randomness inherent in shot metrics may simply not have had time to smooth out yet, as it will once a player puts in more time. It is also important to note that the disparity is there in the input data from BTN, and probably isn’t a function of the zone adjustment formula.

In addition, players’ Zone Adjusted Relative Corsi rankings tended to be closer to their unadjusted Relative Corsi rankings than their Zone Adjusted CorsiOn rankings were to their unadjusted CorsiOn rankings. In other words, players tended to move around less when their CorsiRel was adjusted for zone starts than they did when their raw Corsi was adjusted. This is in keeping with the greater variability of raw Corsi than CorsiRel.
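One rough way to put a number on that “moving around” is to average how far each player’s rank shifts once a measure is zone-adjusted. A quick sketch, again with placeholder ranks rather than the actual spreadsheet values:

# Each entry is (rank before adjustment, rank after adjustment) for one player.
corsi_rel_ranks = {"A": (2, 3), "B": (12, 11), "C": (20, 19)}
corsi_on_ranks  = {"A": (5, 1), "B": (22, 16), "C": (10, 18)}

def mean_rank_shift(rank_pairs):
    """Average absolute change in rank after the zone-start adjustment."""
    shifts = [abs(before - after) for before, after in rank_pairs.values()]
    return sum(shifts) / len(shifts)

print("CorsiRel average shift:", mean_rank_shift(corsi_rel_ranks))  # 1.0
print("CorsiOn average shift:", mean_rank_shift(corsi_on_ranks))    # 6.0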

So, since the formula seems to create rational results that coincide with what we see in the real world, we can draw some conclusions about how Lightning players are performing compared to that hypothetical average player on an average team.

Players outperforming their zone start rate:

In addition to Trevor Smith, we have

Player: Zone Start % (Rank) / Avg ZA Rank
P. C. Labrie: 37.9 (16) / 1
Ryan Shannon: 43.6 (14) / 3
Victor Hedman: 36.8 (18) / 8
Mike Commodore: 43.9 (13) / 6

Both Labrie and Smith are, as noted above, probably as high on this list as they are due to their smaller amounts of ice time. Mike Commodore’s inclusion is surprising, however, and bears further examination. [Note that his statistics include his time in Detroit, where he was used more sparingly and in different situations. That might be enough to skew the results.]

Players underperforming their zone start rate:

Player: Zone Start % (Rank) / Avg ZA Rank
Keith Aulie: 44.6 (12) / 24
Brendan Mikkelson: 59.5 (3) / 14.5
Brett Clark: 45.4 (11) / 18.5
Brian Lee: 50.5 (8) / 13
Brett Connolly: 49.4 (9) / 14

Among these players are two (Lee and Aulie) whose statistics include time on other NHL teams, possibly skewing the results somewhat. I would think that if there is such an effect, it would be strongest for Lee, who played twice as many games with the Ottawa Senators as he has with Tampa Bay. Aulie’s season is split about half and half between the Toronto Maple Leafs and the Lightning at this point. I don’t see any obvious sample issues with the other three players, however, although Connolly’s up-and-down ice time may have affected his stats to some extent. At this point, though, he has 63 games played and is averaging just over 10 minutes per game.

I am not surprised to see more defensemen “underperforming” than forwards and more forwards “overperforming” than defensemen. Defensemen tend to have lower Corsi numbers than forwards in any case, and of the six defensemen on these two lists, only Mike Commodore and Brett Clark can be considered veteran blueliners. I would expect to see young defensemen making more mistakes than more experienced ones. Hedman, as we have determined in other instances, is an exceptional player. He is also facing tougher competition than any defenseman on the team besides Eric Brewer.

Some final notes about zone adjusted Corsi. While it takes into account both time on ice and zone starts, it does not take into account what kind of competition players are facing. And it doesn’t control for the team’s play in general. Teams that on the whole give up more shots than the average (hello, Lightning!) will have players whose Corsi ratings are lower across the board, and we shouldn’t pin the totality of a team’s bad defensive play on individual players.

These two metrics do give us a more contextualized look at player performance, however. And the more information we have in evaluating players, the better our assessments will be.
