Assessing chance quality has been a hot topic in the football analytics community lately and, though there are some differing opinions, one thing that most people seem to agree on is that it is fundamentally harder to score when using your head compared to using your feet. Anyone who’s watched or played the game will likely agree with this and so do I, but what I’ve come to realise lately is that headed chances are often rated unfairly, mostly due to a lack of more granular data.
The main inspiration for this insight came from my colleague Dave Willoughby (@donceno) a while back when he, as is his custom, challenged my rating of a certain headed goal, causing me to realise how my work with Expected Goals models had made me biased towards headers. He encouraged me to explore the true value of a headed chance using the detailed data we collect at Stratagem, and so here we are.
Before you continue reading, please be sure to check out Dave’s great blog post on how we currently rate all chances here.
As he stated in the above post: context is key. Indeed, when assessing headers this tends to be especially true, but is often overlooked due to the lack of more detailed data. So what type of data does Stratagem capture in order to better understand the context of a chance, besides the usual variables like chance location?
1. Defensive Players
How many defending players (including the goalkeeper) were there between the ball and the goal? This shouldn’t be thought of as a straight line between the ball and goal, but as more of a funnel shape moving out from the ball to the frames of the goal, incorporating every defender who could influence the strike with a block or by exerting positional pressure.
2. Attacking Players
How many attacking players were there between the ball and goal? Again, this would include every attacking player who could influence the strike, like causing a deflection on the original shot, moving into the goalkeeper’s line of sight or blocking him from making his dive.
3. Defensive Pressure
How much defensive pressure was the player taking the shot under? This is rated on a scale of 0 (no pressure at all) to 5 (significant physical pressure) and is often a key element missing from most models. It is important because it takes into account the defenders' positions and their ability to hinder the strike from hitting the target.
These variables are factored in alongside more qualitative ones, such as the quality of the pass, to produce the chance ratings Dave described in his post. I'm sure you all have a good feel for the general effect these variables have on the quality of a headed chance – but let's quantify them to explore how they all interact.
Using a basic logistic regression algorithm, I've created a simple Expected Goals model just for headers, based on the above variables together with chance location. Location is represented by whether or not the chance was inside the "prime" goal-scoring area just in front of goal, what we at Stratagem call the "C1 Zone". The data covers all Stratagem league seasons starting in 2016, up to the 21st of November.
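For illustration, here is a minimal sketch of such a header-only model in Python. The real Stratagem chance records are proprietary, so the data below is synthetic: the column names, effect sizes and outcomes are all made up purely to show the shape of the approach.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for real chance records: each row is one headed
# chance, and `goal` is 1 if it was scored.
rng = np.random.default_rng(0)
n = 2000
headers = pd.DataFrame({
    "defensive_pressure": rng.integers(0, 6, n),  # 0-5 pressure scale
    "defensive_players": rng.integers(0, 6, n),   # defenders between ball and goal
    "attacking_players": rng.integers(0, 6, n),   # attackers between ball and goal
    "c1_zone": rng.integers(0, 2, n),             # 1 if inside the prime C1 zone
})

# Fake outcomes built so that pressure and defenders hurt the chance,
# while attackers and a C1 location help it (coefficients are invented).
logit = (-1.5
         - 0.3 * headers["defensive_pressure"]
         - 0.4 * headers["defensive_players"]
         + 0.2 * headers["attacking_players"]
         + 1.0 * headers["c1_zone"])
headers["goal"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

features = ["defensive_pressure", "defensive_players",
            "attacking_players", "c1_zone"]
model = LogisticRegression().fit(headers[features], headers["goal"])

# Goal Expectancy is simply the predicted probability of a goal.
headers["xg"] = model.predict_proba(headers[features])[:, 1]
```

On real data the fitted coefficients would quantify exactly how much each contextual variable moves the Goal Expectancy of a header.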
There are few surprises here, as most of you certainly would have guessed that Goal Expectancy decreases with more Defensive Pressure and more Defensive Players. In addition to this, the fact that the number of Attacking Players between the ball and goal at the point of strike has a positive effect makes sense as well, because they usually block the goalkeeper’s view and, unlike defenders, won’t try to hinder the strike. It’s also clear that headers are very sensitive to chance location, with the prime C1 zone enjoying a far higher Goal Expectancy throughout.
So it seems that the model and the above variables’ impact on Headed Goal Expectancy makes “football sense”. Nice. However, the main point to take away from this simple model is how much variation there is between the different scenarios, even from the same location. Take the most common scenario inside the C1 zone for example: Defensive Pressure 1, Defensive Players 1 and Attacking Players 0.
As can be seen from the green line in the lower-left plot, this chance would have quite a high Goal Expectancy: with the player heading the ball under very little pressure and with only 1 player between him and goal (normally the goalkeeper), the model would give him about a 43% chance to score. However, if there were instead 5 Defensive Players blocking the goal, the chance to score would drop to about 16%. Keeping the Defensive Players at 1 but increasing the Defensive Pressure to the maximum of 5 would also see the chance of a goal drop, this time to about 29%.
In the above plot with all variables ranging from 0 to 5, the model gives a Goal Expectancy range of 0.09 to 0.74 inside the C1 zone, while outside it would range from 0.04 to 0.50. This is significant variation that a model without these variables simply wouldn’t be able to pick up on.
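Mechanically, reading a scenario's Goal Expectancy off a fitted logistic model just means plugging the variable values into the logistic function, which is how the scenario-by-scenario figures above are produced. The coefficients below are made-up placeholders, not the fitted values behind the plots:

```python
import math

# Hypothetical coefficients for illustration only.
coef = {
    "intercept": 0.3,
    "defensive_pressure": -0.15,
    "defensive_players": -0.35,
    "attacking_players": 0.10,
    "c1_zone": 1.0,
}

def goal_expectancy(pressure, defenders, attackers, c1):
    """Logistic transform of the linear predictor for one scenario."""
    z = (coef["intercept"]
         + coef["defensive_pressure"] * pressure
         + coef["defensive_players"] * defenders
         + coef["attacking_players"] * attackers
         + coef["c1_zone"] * c1)
    return 1 / (1 + math.exp(-z))

# Vary one variable at a time, as in the plots:
base = goal_expectancy(1, 1, 0, 1)       # pressure 1, 1 defender, C1 zone
crowded = goal_expectancy(1, 5, 0, 1)    # 5 defenders instead of 1
pressured = goal_expectancy(5, 1, 0, 1)  # max pressure instead of 1
```

With any sensibly fitted model, `crowded` and `pressured` come out well below `base`, reproducing the kind of spread described above.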
Like I’ve already mentioned though, when Stratagem analysts rate chances they take not only the above variables into account, but evaluate the whole context of the situation. As a result, using these chance ratings should provide us with a better model. From now on, whenever I mention Goal Expectancy I will refer to a new model, based only on the chance ratings mentioned in Dave’s blog.
So who is creating the best headed chances? To limit the scope of the analysis, I'll take a closer look at the two leagues I've worked with the most: Sweden's Allsvenskan and Norway's Tippeligaen. Before looking at Headed Goal Expectancy, though, we have to take into account the number of headed chances each side created. Stronger teams tend to create more chances overall, and this should be reflected in their number of headed chances as well, regardless of playing style, so we'll have to normalise. I've therefore created a metric I'll call Header Focus: the number of headed chances created divided by the total number of open play chances created (defined here as any chance that is not a penalty kick or a direct free-kick). This gives an insight into which teams focus more on headers as part of their attack.
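As a sketch, the Header Focus calculation is just a ratio per team; the chance counts below are invented for illustration (only Häcken's ratio is chosen to roughly match the ~7% quoted later):

```python
import pandas as pd

# Illustrative per-team chance counts (not real season totals).
chances = pd.DataFrame({
    "team": ["Norrköping", "Häcken", "Odd"],
    "headed_chances": [60, 15, 55],
    # Open play chances exclude penalties and direct free-kicks.
    "open_play_chances": [230, 220, 210],
})

chances["header_focus"] = (
    chances["headed_chances"] / chances["open_play_chances"]
)
```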
Looking at the Header Focus for both leagues in the 2016 season, I’d say it passes the eye test. The teams at the top are all known to cross a lot and use target players up front, while the teams at the bottom end tend to keep the ball on the ground more. Häcken really stands out here with only about 7% of their open play chances coming from headers, which makes sense given Peter Gerhardsson’s preferred style of play and the type of players they have in the squad.
When looking at the mean Goal Expectancy per headed chance, a clear pattern becomes visible: Norwegian teams tend to create better headed chances, with only three Swedish teams inside the top half. Of course, it could also be looked at the other way, in that Norwegian teams are worse at defending headed chances.
The top teams in Norway, Odd and Rosenborg, are among the best, but surprisingly Start, who finished rock bottom, sit in sixth here. It's also a bit shocking to see Djurgården so far down given their high Header Focus, indicating that they've relied more on quantity than quality when it comes to headed chances. Aalesund and Haugesund seem to have done the opposite, with low Header Focus but high Goal Expectancy chances. Finally, Häcken, Östersunds and Sundsvall don't seem to care about headers at all, combining low Header Focus with poor mean Goal Expectancy.
Given how the leagues share the same structure, we can also easily compare raw total numbers. The Norwegians top the chart here as well, with the same top three as with mean Headed Goal Expectancy. However, there are a few Swedish teams in the top of this ranking, and Norrköping and Djurgården’s high Header Focus clearly saw them able to rack up high total Headed Goal Expectancy numbers.
It's maybe a bit startling to see bottom teams like Tromsø, Start and Gefle well inside the top half here, as they would be expected to struggle offensively. But it does make some sense: these poorer teams may look to more long balls, crosses and headed chances to drive their attack, as logic would suggest these are easier for such teams to create chances from than playing their way through the opposition defence to set up high-expectancy chances on the ground.
Putting all three metrics into one scatterplot really drives the point home in terms of how the two leagues differ when it comes to headed chances. The teams in Sweden's Allsvenskan seem to be worse at creating high quality headed chances (note again that the inverse could also be true and that they could simply be better at defending), while their Header Focus varies a lot, from the lowest (Häcken) to the highest (Norrköping).
In Norway’s Tippeligaen, however, teams’ Header Focus varies less and the numbers are generally higher, while mean Goal Expectancy per headed chance varies more, with most of the teams able to create high quality chances. So generally speaking, Norwegian teams seem to have a more deliberate focus on headed chances, as they both produce higher quantities and better overall quality.
The size of the dots in this plot represents the total Headed Goal Expectancy for each team and demonstrates how a team needs to perform well in at least one of the two underlying metrics to produce high numbers overall. Out of all the teams, only Odd combined a high Header Focus with a high mean Goal Expectancy, indicating a deliberate tactical plan to use headers as an important part of their attack.
Finally, looking at individual players' Headed Goal Expectancy per 90 minutes, we see that the top ten is entirely made up of strikers, which makes sense as they are usually positioned in front of goal at the receiving end of crosses. The Norwegians occupy the top three spots here as well, but what really stands out is just how dominant Odd's target man Olivier Occéan has been, with a value of 0.3 Headed Goal Expectancy per 90. In fact, his total Headed Goal Expectancy of about 8 made up over 70% of the team's total production.
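For reference, the per-90 figure is simply a player's total headed Goal Expectancy scaled by minutes played. The minutes below are an assumption chosen to be consistent with the quoted figures, not Occéan's actual playing time:

```python
# Assumed season figures, for illustration only.
total_headed_xg = 8.0   # total headed Goal Expectancy over the season
minutes_played = 2400   # assumed minutes played

# Scale to a per-90-minutes rate.
xg_per_90 = total_headed_xg / minutes_played * 90
print(round(xg_per_90, 2))  # 0.3
```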
That’s it for now, but I will certainly dig deeper into headed chances now that there are more questions to answer than before I started this piece. To be specific, I wish to investigate how much Headed Goal Expectancy each team concedes, who is best at converting headed chances and finally look into the big issue I uncovered: just why are the Norwegian teams so much better at producing high quality headed chances? Hopefully we’ll find out!
Alexander Tanskanen (@zorba138)