The opening round of Super Rugby this past weekend was a classic example of how business intelligence fails.
Predicting the first round of any sporting event is always a bit of a gamble – largely because one is looking at historical data.
In many cases this can be a sound strategy. In the absence of form, which cannot be demonstrated until the players are on the field, I made my predictions based on two principal factors – the hypothetical strength of the respective teams on paper and, where this was inconclusive, which team had home ground advantage.
What a mistake!
Historical form had no role to play in the weekend's results.
Game one: the seven-time champion Crusaders, with multiple All Blacks in their ranks, were whipped at home by the perennial wooden-spoonists, the Rebels. Only 2% of punters on Superbru predicted this result.
This set the tone for the weekend.
The Lions conspired to lose at home to the Hurricanes – a result predicted by only 35% of punters.
The highly fancied Sharks lost at home to an (on paper) much weaker Cheetahs team in a result predicted by only 6% of punters, while the Bulls were outplayed at home by the Stormers in another upset (predicted by 31%).
Two of the seven games played ran to form, with the classy Brumbies and Chiefs winning their matches against the Reds and the Blues – the only two games I predicted correctly. The final game of the weekend saw last year's champions, the Waratahs, humbled at home by a ragtag Force team in a result predicted by only 6% of punters.
Too often business intelligence fails due to missing or inaccurate data, faulty assumptions or prior bias. This weekend's Super Rugby was a classic example of a historical track record being insufficient to predict the results. [Tweet this]
Is the data you use for business decision-making current, complete and accurate?
Image sourced from http://en.wikipedia.org/wiki/List_of_Super_Rugby_champions