What does the “Collingwood 11” really mean? #AFL

This week there has been a lot of lamentable confusion and conjecture around the reporting of illicit drug positive test rates among AFL players. HPN has also noticed that there hasn’t necessarily been a lot of fact involved in the scrum of opinion. So what are the facts around drug use in Australia, and how does it relate to football clubs? Good question, imaginary reader. Primarily, HPN will tackle three commonly raised opinions aired this week:

  1. drug use is on the rise generally in Australia,
  2. the reported drug use in football clubs is very high, and
  3. drug use by AFL players is higher than among other young men.

Background

Hair testing of AFL players in the offseason is a new policy in the AFL, and appears to be a data collection exercise to observe baseline rates of illicit drug use. The testing has been stated to be voluntary and was supported by players through their union (the AFLPA). This information is necessary to have as part of a health-centred drug policy, especially since the AFLPA and AFL appear to want to use the data to help protect and support players’ physical and mental health.

The clear expectation was that this data collection would remain private, and it’s a shame that it has not been kept that way. The situation harkens back to Channel 7’s acquisition and attempted publication, in 2007, of Hawthorn players’ medical records relating to drug use.

However, what has been found is that usage by players is probably lower than the background rate for their non-football-playing young male peers. The headline claim is that 25% of Collingwood’s list (11 of 44) tested positive in hair tests designed to cover their off-season period. This is not a revelation of anything shameful, exceptional or even particularly dangerous.

Is illicit drug use in society on the rise?

Some of the commentary this week gravely linked these test results to a supposed drug use epidemic in broader society. David King’s comment was an example here, claiming “There is no one in the world that says the use of drugs is on the decline.”

This is an incorrect perception. Use rates for illicit drugs in Australia have likely declined since the late 1990s, and at the very worst have remained steady.

First up, a headline grab. A full third of Australians have used illicit substances (or rather, have been willing to report such use to a survey). For this post we’ll be using the Australian Institute of Health and Welfare’s data from the National Drug Strategy Household Survey, which is based on about 20,000 respondents.

[Chart: proportion of Australians reporting lifetime illicit drug use, NDSHS]

That fact alone should make us question whether it’s a particularly big deal if a quarter of a football team, on its private downtime in the off-season, is found to have done so too.

It should be noted that the bulk of this usage rate is cannabis use. If we break down lifetime use by substance, it looks like this:

[Chart: lifetime illicit drug use by substance, NDSHS]

Cannabis almost entirely drives overall reported usage rates, with no other illicit drug having ever been used by much more than 10% of the Australian population. What we observe here is that heroin and methamphetamine use have dropped in recent years, while hallucinogens and misused pharmaceuticals have moved up and down with some volatility. In particular, the numbers who have ever tried cocaine and ecstasy have risen.

However, lifetime use is a slow and lagging indicator, since people tend to live for a long time. If we look just at recent use (i.e., in the last 12 months) of some of the major drugs, a slightly different picture emerges:

[Chart: recent (last 12 months) use of major illicit drugs, NDSHS]

Here we can see ecstasy use actually declining as an ongoing recreational activity after peaking in the 2007 survey. The lifetime rates have still gone up as more people in total have used it, but recent users don’t show that same pattern. Cocaine use here also looks like it may have steadied.

Alleged use by footballers is in line with, or lower than, use by other young men

The tables above all talk about the whole of the population, but young men have the highest use rates for every illicit substance. If we look at use rates for young men specifically, we can see that 32% of men aged 20-29 admitted to using an illicit substance in the last year.

Hair testing should be considered a good measure of total illicit drug use in the AFL. It has something like a three-month window of detection and therefore covers an entire off-season. It’s also reasonable to assume that footballers confine pretty much all their annual illicit drug use to the brief off-season period, if only because there’s random in-season testing and the major illicit drugs are all prohibited by WADA in-competition.

The only conclusion to be drawn, then, is that Collingwood’s reported annual illicit drug use rate of 25% of their list (11 of 44) is lower than the reported use rate among the general population of young Australian men. If the majority of other clubs have lower positive test rates than that, then they are doing even better.

[Chart: recent illicit drug use among men aged 20-29, NDSHS]

Now it may be that the composition of use of illicit drugs is different for football players than for the general population. They may be using less cannabis and more ecstasy or cocaine, for instance. The reporting of specific substances has been vague (and perhaps deliberately ambiguous) in that regard.

But even if so, it’s probably not a terribly big problem or particularly harmful (it’s actually probably less harmful than alcohol, really, but that’s a topic for another day). The bottom line is that if clubs are relatively happy with the piss-sinking binge of Mad Monday, there doesn’t appear to be a lot of reason for being more concerned that players are, say, hitting party drugs at a music festival.

It’s possible that the AFL is rife with heroin and methamphetamine abuse, of course. But surely it’s more likely that most use during the off-season is more like the typical use observed in surveys of other people – cannabis and party drugs used occasionally, situationally and sporadically.

Conclusion

Our point here has been simply to try to contextualise the sort of numbers being thrown around, and to counter some of the “illegal and evil” rhetoric some well-meaning pundits have given us.

Football players testing positive for unspecified illicit substances during their off-season isn’t cause for a big moral panic. At least a third of Australians admit to having used illicit drugs in AIHW surveys. More 20-29 year old men outside football clubs used illicit drugs in the last 12 months than those within them.

Footballers and the AFL are right to treat illicit drug use as a physical and mental health issue and a professional performance issue, similar to how they regard recreational drinking. If clubs and playing groups are concerned about the impact of drugs or alcohol on performance and want to voluntarily agree to pursue a zero-tolerance approach even out of season, that’s their right and their business.

But for the public, and especially for people in the football media, we need to calm down.

HPN’s Statistical 2016 AFL Ladder Prediction #afl #aflstats

We’ve already run our eye over the 2016 Super Rugby season, and the current NRL season, so HPN decided to throw caution to the wind and attempt to predict the 2016 AFL season.

Like last season’s prediction, this projection shouldn’t be read as exact gospel on how a team will perform in 2016, but more as a general indicator of whether they should improve or decline compared with last season. The method deliberately constrains itself to a few measurable things and, most notably, entirely omits new recruits and departures from consideration.

The method relies more on the simple than the statistically complex, but it did a reasonable job of predicting the movement of teams last season (with a few notable exceptions). After last year’s results, we’ve made a few tweaks to the formula that we hope will increase its accuracy. To kick things off, here’s what we came up with:

[Table: HPN’s projected 2016 AFL ladder]

To compile this prediction, we have relied upon three measures: strength of team, strength of schedule and expected improvements/declines of each list due to aging.

Strength of Team

To account for the true strength of each team, we take their 2015 performance as the key indicator and make a couple of minor adjustments.

Firstly, we looked at each team’s results in close games last season. After an analysis of all close games within the last three years, we found that there appears to be no real discernible pattern to a team’s performance in close games.

Close games averaging out to 50:50 means that teams who performed exceptionally well or poorly should expect to be subject to regression to the mean. We’ve given each team “normalised” luck in projecting their 2016. For example, Sydney, who went 4-1 in close games last year, can expect to do 1.5 games worse in such contests if their luck regresses to the mean.
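For the curious, here’s a minimal sketch of that adjustment in code; the 4-1 record is Sydney’s example from above, and everything else is illustrative:

```python
# A minimal sketch of the close-game "luck" adjustment described above,
# assuming close games are effectively coin tosses.

def luck_adjustment(close_wins: int, close_losses: int) -> float:
    """Expected change in wins if close games regress to a 50:50 split."""
    games = close_wins + close_losses
    expected_wins = games / 2           # the coin-toss assumption
    return expected_wins - close_wins   # negative = the team was lucky

# Sydney went 4-1 in close games, so mean regression costs them 1.5 wins:
print(luck_adjustment(4, 1))  # -1.5
```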

Strength of Schedule

Recall that we used Strength of Schedule to assess the 2016 fixture’s difficulty in this post. Here we’ll use that same assessment to adjust expected wins for each team. Here’s a nice little table to jog memories:

[Table: Strength of Schedule impacts, 2015 and 2016]

For example, Adelaide can expect to do nearly a full win worse this year, all things being equal, just by virtue of moving from a slightly soft draw to one of the roughest in 2016.

Effect of aging

Finally, we will look at the age profile of each list, and what teams will benefit (or suffer) due to the model aging curve. Some time ago, we calculated the expected output for each age cohort of AFL player.

In these projections we’ve applied this output curve to all players on each list with at least 15 career games under their belt who were at the club last year, to measure the expected improvement from a pure “cohort ageing effect” in their continuing players. Below is a chart of that age effect in relative terms, for each team in the league:

[Chart: relative age cohort effect, by team]

Noteworthy here is that Carlton appear to have a continuing list that is already old enough to expect less improvement than most other teams. They’re in a category here with the likes of contenders North Melbourne, Fremantle, Hawthorn and Sydney. That suggests either that Carlton are in a deep hole with ageing dud players and a lot of renewal to do, or that their list has underperformed and could do a lot better just by virtue of its experience.

Last year, we believe we underestimated this effect, and as such we’ve adjusted our formula. Fundamentally, we believe that a 2% relative increase in output should provide more than a 2% increase (about half a win) in total wins. This year, using the age effect we derived earlier, we applied the effect to each player on the senior list at the club. We have used the maximum senior list size of 40, as each club is able to have that many players on the senior list at any given time. The formula is as follows:

Season Age Effect = (Number of players on senior list ^ (1 + Relative Age Effect)) – Number of players on senior list 
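
In code, the formula works out like this (a minimal sketch; the ±2% relative age effects are illustrative figures, not any particular club’s numbers):

```python
# The season age effect formula above, as stated. A club's "relative age
# effect" is the fractional change in expected output from list ageing.

SENIOR_LIST_SIZE = 40  # maximum senior list size per club

def season_age_effect(relative_age_effect: float,
                      list_size: int = SENIOR_LIST_SIZE) -> float:
    """Extra (or fewer) expected wins from the ageing of a continuing list."""
    return list_size ** (1 + relative_age_effect) - list_size

# An outlier club with a +2% relative cohort effect gains roughly three wins,
# while a -2% effect costs a similar amount:
print(round(season_age_effect(0.02), 2))   # ~3.06
print(round(season_age_effect(-0.02), 2))  # ~-2.84
```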

Where last year the maximum impact of the age cohort effect was about half a win, this year it’s around two or three wins for the outliers. We’ll see if that tracks better over the year.

What we don’t measure

Please note here that our age effect is a measure only of continuing players – new players and departing players are excluded from this effect. Indeed, we have made no attempt to measure the impact of new recruits in this projection as a whole. This leaves a big Dangerfield-sized hole in our simple model, which must necessarily be filled in by subjective judgement beyond its scope.

Similarly, we do not account for improvement coming from players with fewer than 15 games under their belt. There’s a decent chance some of these recruits (Callum Mills or Jacob Weitering, for instance) make a difference, but again, that’s beyond our scope.

The Essendon problem (and other last steps)

The most extreme example of this blindness to player change is Essendon. Thanks to their suspensions, Essendon are essentially a different team in 2016 to what they were in 2015, so they present an interesting issue. Do we take their value as based upon the 2015 season, or instead impute a more realistic base value?

As our initial take on the suspensions noted, Essendon, in raw terms, lost half of their output from the 2015 AFL season. Going back to our data on cohorts of continuing experienced players, no other team underwent anything close to this amount of turnover of key players in the offseason:

[Chart: offseason turnover of experienced continuing players, by team]

The simplest approach for us is to assume that Essendon will be roughly half as strong in 2016 as they were last year, and to use that estimate of list strength in place of their actual 2015 record.

Yes, they’ve replaced the suspended 12, but, coming from outside the AFL, those replacements should be expected to perform similarly to the weakest players already in the system and offer basically no improvement. This is rough, and not exactly “best practice”, but it is the fairest result we think we can achieve.

Finally, we have adjusted all of the results to make sure they represent a full AFL season of 198 games, which just involved some slight rounding at the last step and means the numbers won’t quite add up.
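For concreteness, here’s a sketch of one way such an adjustment could work. Our exact last step isn’t worth labouring over, so treat the rescale-then-round approach below, and the team inputs, as an illustrative assumption rather than the literal procedure:

```python
# A sketch of scaling projected win totals so they sum to the 198 wins
# available in a season (18 teams x 22 games / 2 = 198 matches), then
# rounding. Input values are placeholders, not actual projections.

def normalise_wins(projected: dict[str, float],
                   total_wins: float = 198.0) -> dict[str, float]:
    """Scale projected wins so the league total matches the wins on offer."""
    scale = total_wins / sum(projected.values())
    return {team: round(wins * scale, 1) for team, wins in projected.items()}
```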

Risers

Improvement on 2015 wins, rounded to the nearest half win

Port Adelaide (+3 wins)

We had Port in this column last year too, which didn’t work out well for us at the time. Port have a relatively soft draw this year after what turned out to be a murderer’s row of double-ups last year (that said, they rolled Hawthorn and lost to Carlton, so the draw isn’t everything). Like the other sides on this list, they are a team that should be improving with age. We have them riding these factors back into the top 4.

Collingwood (+2.5 wins)

Collingwood lost a lot of close games last year, winning just one of the seven decided by less than two goals. They also have an ever-improving list, one of the faster projected improvers in the league. They recruited a number of young contributors in the offseason, but even without that being factored in, HPN predicts that they will do substantially better than last year and push for a spot in the finals (a cutoff we have at 13 wins for the first time ever).

GWS (+2.5 wins)

GWS are in an enviable position right now – a team on the rise like few others, in spite of a tougher draw. They have depth at almost every position, and more former first-round picks than they know what to do with. We see their rise being meteoric, with this the season their side makes the leap into the eight and potentially the four. The major complication is the loss of Cam McCarthy (likely for the season at this stage) and Jeremy Cameron for the first month. If they can do enough to replace that output for the first month, they’ll be on the right track for this year.

 

St Kilda (+2.5 wins)

Firmly into their rebuilding project, St Kilda look to build on their unexpected improvement last season by moving another notch up the ladder. The Saints have an interesting blend of aging stars and emerging talent, but whether the blend is there this year remains to be seen.

 

Melbourne (+1.5 wins)

On pre-season form, the Dees should be firmly pencilled into the eight, but as we’ve written previously, the pre-season means jack. Melbourne have a promising young list, and should be a contender to make the next step to finals contention. However, right now we have them at the head of the bottom pack of teams, along with the Saints.

Brisbane (+1.5 wins)

Brisbane have a lot of issues, but their midfield is pretty good, at least on paper. Their projected improvement here is almost entirely driven by developing youth. If they can get some semblance of a tall target up forward, find a working defence, do a bit better in close games (or at least get into positions to be in close games) and keep their best 22 on the park, they should improve this year. It won’t be pretty, but it’ll be better than the ugliness that came before.

Gold Coast (+1.5 wins)

To misquote Rodney Dangerfield, the Suns got no respect last year (from opposing teams). They stank the joint up, due to injuries and indiscretions. Given those things, we can’t get a subjective read on how they’ll do this year. Not a clue. It’s hard to see them doing worse than last year, though, especially with a couple of teams that seem destined for the bottom of the ladder.

 

Sliders

Decline from 2015 wins, rounded to the nearest half win

Fremantle (-4 wins)

Fremantle were the best team in close games last season but, as we said earlier, they are unlikely to repeat this performance. In tight games last year (decided by less than 12 points), Freo went 5-1, which made a large contribution to their minor premiership. The effect of ageing also hits the Dockers, as players such as Sandilands and Pavlich head closer to the end. We have them on the edge of the eight, but by a very small margin.

North Melbourne (-4 wins)

North’s predicted decline in 2016 happens for almost exactly the same reasons we have predicted a decline for Fremantle above. Unsustainably good in close games? Check. An ageing team? Check. But North have the added disadvantage of a tough draw. This may seem a little lower than most predictions have for the Roos, but everything seems to be indicating a slide.

Sydney (-2.5 wins)

The final team in the old and lucky trilogy. All three teams could outperform expectations, but if you’re looking for sliders from last year’s finals series these three teams are a good place to start.

Essendon (-2 wins)

¯\_(ツ)_/¯

Adelaide (-1 win)

Adelaide slides slightly due to their tough draw and their record in close games last year, but should fight with Collingwood, Fremantle and Sydney for a spot in the finals. Note again that we have not incorporated any Dangerfield-related variables into this projection.

Geelong (-1 win)

Geelong were really lucky in close games last year, which masked their decline slightly. Their offseason recruitment is widely expected to help their performance this year, and that’s something we simply have not incorporated into this simple projection. In addition, the benefit from the maturation of their list should be only about middle of the road. The factors we do include suggest that they shouldn’t be expected to do much better than they did last year, so any improvement will have to come from their new headline recruits.

Hawthorn (-1 win)

Hawthorn are getting older, and their draw is always going to be tough as long as they continue to finish at the top of the ladder. That is the weight of success. They will continue to do well.

HPN’s Statistical 2016 NRL Ladder Prediction – Or how stats can’t account for off-field incidents #NRL

[Image: HPN’s predicted 2016 NRL ladder]

If you’ve clicked on this article from another website (such as Twitter or Reddit), you will have likely seen the lead image as a thumbnail. Yes, the one with the much maligned Roosters as the eventual minor premiers. No, we aren’t trolling.

Even as I type this, the odds of a dominant season by the Roosters fall by the wayside. In the off-season the side lost Michael Jennings, James Maloney, Nathan Stapleton and Roger Tuivasa-Sheck, with Mitchell Pearce under a continual cloud from his brief stint in short film. In their stead, the Roosters were able to recruit Joe Burgess (an England rep), Dale Copley (an emerging Maroon) and journeyman hooker Ian Henderson. While that’s a decent replacement haul, it’s not exactly like for like. At the time of writing, Sportsbet have the Roosters as just the 10th favourite for the minor premiership. We at Hurling People Now respectfully disagree with this dramatic fall from grace, but perhaps not entirely.

Before digging ourselves further into a hole, let’s explain how we got to such a point.

Methodology

How HPN predicts ladders can be explained pretty simply in three parts:

  1. Strength of team last year;
  2. Strength of schedule this year; and
  3. Influence of “lucky results” last year.

Let’s break those down now.

Strength of Team

Contrary to common perception, there is a limited amount of movement in any team’s 25-man primary squad. As indicated in the chart below, the vast majority of every NRL team’s 2015 squad remains in place for 2016.

[Chart: 2016 NRL players lost, by team]

In general terms, 77% of each team’s squad has stayed in place from last season. Roughly speaking, this indicates that a team’s performance last season should be generally indicative of their performance this season. This simple analysis doesn’t account for the quality of players lost or gained; however, the relatively small salary cap should mitigate this factor.

In addition to last year’s win/loss records, we’ve also added Pythagorean win/loss, an expected win/loss measure. Essentially, this evaluates how many games a team “should” have won based on the points they scored and conceded. This previous (AFL) post explains the theory in more detail, and we’ve borrowed the exponent from Tony Corke at the Matter of Stats blog (1.89, for those playing at home).
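For those who want the nuts and bolts, here’s a minimal sketch of that calculation with the 1.89 exponent. The points for/against figures are hypothetical, not actual 2015 NRL numbers:

```python
# Pythagorean expected wins, using the Matter of Stats exponent of 1.89.

PYTHAGOREAN_EXPONENT = 1.89

def pythagorean_wins(points_for: float, points_against: float,
                     games: int = 24) -> float:
    """How many games a team "should" have won on points scored/conceded."""
    attack = points_for ** PYTHAGOREAN_EXPONENT
    defence = points_against ** PYTHAGOREAN_EXPONENT
    return attack / (attack + defence) * games

# A team outscoring opponents 600-500 over a 24-game season "should"
# have won about 14 games:
print(round(pythagorean_wins(600, 500), 1))  # ~14.0
```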

[Table: team strength based on 2015 win/loss and on Pythagorean expectations]

Strength of Schedule

The 24-game NRL season means that each team plays nine teams twice, and six teams just once. This creates an inherently inequitable draw by design. But how uneven is it this year, and who benefits from it? The tables below take 2015 records and tell us the strength of each team’s 2016 draw (both standard win-loss and Pythagorean win-loss), expressed as its impact on expected wins versus a fair draw.
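As a rough sketch of how an impact figure like that can be derived (the linear approximation and the hypothetical opponent ratings below are our simplifications, not the exact weighting behind the tables):

```python
# A simplified strength-of-schedule impact: compare the average 2015 win
# rate of a team's 24 scheduled opponents against the league-average .500
# opponent a perfectly fair draw would serve up.

def schedule_impact(opponent_win_rates: list[float], games: int = 24) -> float:
    """Expected wins gained (+) or lost (-) versus a fair draw."""
    avg_opponent = sum(opponent_win_rates) / len(opponent_win_rates)
    # Linear approximation: each point of extra opponent strength costs
    # the same amount of win probability in every game.
    return (0.5 - avg_opponent) * games

# A slate of opponents averaging a .560 record costs about 1.4 wins:
print(round(schedule_impact([0.56] * 24), 2))  # -1.44
```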

First, here it is based on win-loss records:

[Table: Strength of Schedule – win-loss records from 2015 applied to 2016 draw]

And then based on the Pythagorean expectation calculations:

[Table: Strength of Schedule – Pythagorean win-loss from 2015 applied to 2016 draw]

New Zealand have been handed a terrible draw this year, with very few repeat games against the 2015 cellar dwellers. On the other hand, Canberra and Gold Coast have been given a relatively easy run, starting out with about a 1.5-win head start on the Warriors on the draw alone (given equal strengths of their own teams).

Influence of “lucky results”

As outlined by the great Bill Barnwell in his analysis of NFL results, games decided in that code by less than a score can generally be considered coin-tosses when it comes to the final outcome. A team that wins a large share of its close games in one year has registered an extreme or outlier result and is relatively unlikely to repeat it the next year, and vice-versa with a large number of close losses. The simplistic (and not always correct) name for this tendency of an extreme result to be closer to the average next time is “regression toward the mean”.
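To put a number on just how extreme such records are under that coin-toss view, here’s a quick probability check (the 5-1 record is illustrative):

```python
# How often would a fair coin produce a lopsided close-game record?
from math import comb

def prob_at_least(wins: int, games: int) -> float:
    """P(at least `wins` heads in `games` fair coin flips)."""
    return sum(comb(games, k) for k in range(wins, games + 1)) / 2 ** games

# Winning 5 or more of 6 coin-toss games happens only ~11% of the time,
# so a 5-1 close-game record is an outlier that's unlikely to repeat:
print(round(prob_at_least(5, 6), 3))  # 0.109
```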

On the assumption that a team should win around half of their close games, we’ve adjusted their prediction for those results. Here’s how each team did in close games last year and what happens if their luck regresses to the mean instead:

[Table: 2015 close-game records and mean-regression adjustments, by team]

Interestingly the Roosters probably should have won even more games last year than they did. However, it’s the Green Machine that gets the biggest benefit from this measure, having lost a slew of games by a try or less. The Cowboys and Sharks slide down the ladder on this measure, having won several more close games than they lost.

Summary

This method of prediction isn’t perfect – it’s quite simplistic in nature and is probably best used as an indicator of how teams will go compared to last season, rather than as a holistic view of every factor feeding into a team’s final performance.

According to our model, the Roosters shouldn’t do as badly as most punters think, the Raiders should be a bolter into the eight, and there will be a logjam at the edge of the finals. The model seems to indicate that there is a clear top tier (Roosters, Brisbane, North Queensland and Melbourne), followed by eight teams separated by very little.

If you want to look at potential holes in the model, they would be centred on the high turnover of personnel at Gold Coast, Manly, Parramatta and Newcastle, which makes their performance a little more unpredictable. Note that none of those four teams made the finals last year.

One last time – the HPN prediction of the 2016 NRL ladder:

[Image: HPN’s predicted 2016 NRL ladder]