Why would software developers think that business leaders want a dashboard like the one on their racing car, speedboat, aeroplane or moped?
Just because these vehicles also have something called a ‘dashboard’ doesn’t mean that the analogy is transferable to a business setting.
Rotational speedometers like the one below are the worst culprits. This example is from an IT Management Software tool. The business leaders here might be CIOs but these dials could be comparing any measure with a target specification.
Just look at how realistic the needles are, how the 3D dials pop off the page, and there’s even a digital LCD read-out. Wow, sexy!
Except this visualisation could be better in at least 10 ways:
- The use of circular dials consumes valuable screen or page real estate.
- The life-like rendering of the dials is pure ‘chart junk’ which doesn’t communicate any data.
- The dominant use of red on the dials diverts the eye from the measure and would make everything look ‘bad’ even without any exceptions.
- Providing both an analogue and a digital read-out is duplication.
- The leading zero on the digital read-out is superfluous, and the two decimal places make an improbable claim of accuracy.
- Incredibly, there are only 9 segments for every 20% of the measure, so each ‘tick’ on the scale represents 2.22%, making it impossible to read the measure or the target thresholds.
- We can’t directly compare rotational angles to see how one SLA is better than another.
- A comparison of any value against a crude threshold doesn’t take into account natural variation of the system the measure is observing.
- The reporting of % SLA compliance obscures the direct evidence of the underlying measure and an implied target, e.g. the % of transactions with an Order to Cash time of less than 60 days.
- There’s no sense at all of how this measure is changing over time.
Some of these things we can fix visually. Here’s a quick example using the same data.
Each measure is placed on a bullet chart, as advocated by Stephen Few. This packs multiple measures together for a higher-density display and allows direct comparison between them.
With a simple ranking we can now see that Order to Cash is the ‘worst’ in this snapshot. There’s minimal use of colour – just a red target line for the measures falling below it – and we can see at a glance that this is currently all of them.
The business leaders still don’t know whether these services are getting better or not. Even if Internet Banking exceeds the SLA in the next period, we wouldn’t know whether this was luck, nor what is likely to happen in the future.
We don’t know whether the blanket SLA targets are achievable, or what they are based upon for each service. These insights are entirely masked by a simplistic story of ‘SLA failure’. Could this judgement have consequences for a service provider? If so, we can imagine how that might distort the narrative and lead to a hunt for external causes rather than a search for shifts in systemic performance.
Some things can’t be fixed visually without more information such as the underlying SLA measure or time series data. Some are symptoms of a ‘contractual specification’ mindset such as the point comparison of % SLA achievement against an arbitrary target.
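One way to make natural variation visible, given the time series data, is a simple process behaviour (XmR) chart: plot each period’s value against limits derived from the average moving range, and only treat points outside those limits as signals. A minimal sketch, with invented monthly values standing in for a real SLA measure:

```python
# XmR (individuals) chart limits: distinguish routine variation from a
# real shift in performance. The monthly values are made up for illustration.
values = [91.2, 93.5, 89.8, 92.1, 90.4, 94.0, 88.9, 92.7]

mean = sum(values) / len(values)

# Moving ranges: absolute difference between consecutive periods
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# Standard XmR natural process limits: mean +/- 2.66 * average moving range
upper = mean + 2.66 * mr_bar
lower = mean - 2.66 * mr_bar

print(f"mean={mean:.1f}  natural limits=({lower:.1f}, {upper:.1f})")
# A point outside (lower, upper) signals a systemic change worth
# investigating; anything inside it is noise, however far from the target.
```

On this view a single period above or below an arbitrary SLA threshold tells us nothing; only a point beyond the natural process limits, or a sustained run, is evidence that the system itself has changed.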
Should leaders be expected to steer their strategy by looking at a visual car crash?