Since the release of the 2015 Citi report into the occupational health and safety (OHS) performance of the companies in the ASX200 stock exchange rankings, this blog has received many requests for a copy of the report to assist in benchmarking performance. Performance indicators for OHS remain contentious and difficult, but this does not need to be the case.
Citi’s recent report stated that key performance indicators (KPIs) should meet three needs:
- “internal monitoring for continuous improvement to reduce incidents;
- benchmarking and sharing lessons within the industry; and
- transparent disclosure to stakeholders.”
The first point – continuous improvement – is an established element of OHS and has been emphasised in OHS management standards for many years. The second reflects the need of corporate executives to compete and, where possible, build a collegiate knowledge base. The third point is likely to be the sticking point. The type of information provided to shareholders is well established and relies on production, profit and governance. Introducing OHS into this consideration is problematic: it not only introduces new information, data and metrics but will also change the interpretation of the traditional measures.
This change represents essential growth in the maturity of the company and, more importantly, in the maturing of its relationship with shareholders.
The quest for useful OHS KPIs has been a long one. In 2001 Bureau Veritas and the Safety & Environmental Risk Management (SERM) Rating Agency Ltd launched their BV-SERM rating process. In a synopsis provided to SafetyAtWorkBlog at the time, the companies said that
“Companies however, see safety, environmental and social performance benchmarking as a painful process – especially when it highlights deficiencies in management effectiveness, exacerbated by the lack of credible internal data and poor collection systems.
A recent workshop on ‘Benchmarking – painful or beneficial’, run by Bureau Veritas and SERM Rating Agency as part of AccountAbility’s ‘Benchmarking Conference 2001’ (in March 2001) showed how companies – whether large listed or SMEs – also feel pain when benchmarking damages their market strength such as reduced sales when customers switch suppliers, being ‘screened out’ from investment portfolios, and possibly being ‘whitelisted’ (removed from the list of bidders for new contracts).” (links added)
These fears and suspicions remain for small and medium-sized businesses and need to be recognised as legitimate by OHS professionals. The synopsis argued that the BV-SERM rating process could remediate these fears:
“…. successful benchmarking must capture better performance, by highlighting trends and how a company’s operating environment is evolving. It should also encourage better stakeholder engagement, based around ‘dialogue-decide-deliver’ rather than ‘decide-defend-deny’. Equally, performance benchmarking should guide internal buy-in and acceptance that change is necessary, a crucial precursor to generating continuous business improvement.
Benchmarking at its best can truly facilitate business improvement by enabling ‘best practice’ to be identified, measured and emulated. It can help companies by:
- focusing on their external business environment;
- improving process efficiency;
- promoting a climate conducive to change; and
- encouraging performance goal-setting.
But it must also be ‘fit for purpose’ such that the outputs provide companies with a generally accepted ‘health-check’ of their progress towards corporate sustainability.” (formatting changed for readability)
That this was stated in 2001 shows just how slow progress has been in this area. [The phrase “‘dialogue-decide-deliver’ rather than ‘decide-defend-deny’” is a particularly attractive summary of OHS aims and management flaws.]
Australian companies should witness a quicker pace of change in OHS performance reporting due to the due diligence provisions in the Work Health and Safety (WHS) laws.
The four bullet points can be taken, in more recent OHS terminology, as, respectively:
- examining the societal context in which the company operates or in which its services or goods are provided;
- improving productivity, safety, health and wellbeing, as well as continuously improving operational processes;
- incorporating a change management strategy (some may emphasise leadership, others innovation); and
- developing goals and performance measures based on lead indicators.
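To make the last point concrete: a lead indicator measures preventive activity rather than counting injuries after the fact. A minimal sketch of one common example, the proportion of planned safety inspections actually completed; the quarterly figures below are invented for illustration only:

```python
# Toy lead-indicator sketch: lead indicators track preventive effort
# (here, completion of planned safety inspections), in contrast to lag
# indicators such as injury counts. All figures are hypothetical.

def inspection_completion_rate(completed: int, planned: int) -> float:
    """Percentage of planned safety inspections actually completed."""
    if planned == 0:
        return 0.0
    return 100.0 * completed / planned

# Invented quarterly figures: (inspections completed, inspections planned).
quarters = {"Q1": (18, 20), "Q2": (15, 20), "Q3": (20, 20)}

for quarter, (done, planned) in quarters.items():
    rate = inspection_completion_rate(done, planned)
    print(f"{quarter}: {rate:.0f}% of planned inspections completed")
```

A falling completion rate flags a problem months before it could surface in injury statistics, which is the point of preferring lead over lag indicators.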
In 2005 the Office of the Australian Safety and Compensation Council released a Guidance on the Use of Positive Performance Indicators (PPIs). It was of its time: it encouraged companies to use PPIs but failed to specify which PPIs should be applied as a standard for performance and comparison across Australian companies. Many OHS regulators are nervous about stipulating OHS requirements, as stipulation can be interpreted as overriding employers’ OHS obligations and legislative compliance, but in the area of comparable OHS performance measures such specificity is required.
Identifying specific KPIs or PPIs has no impact on OHS compliance. What it does is provide consistent, valid and readily understandable performance measures. This offers strong benefits to shareholders, stakeholders, OHS regulators and others in specific industry sectors. In terms of some of the fears expressed in the quotes above, the response should be “so what?”
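One way to make “consistent, valid and readily understandable” concrete is to fix the formulas. A minimal sketch using the widely reported lost-time and total-recordable injury frequency rates (injuries per million hours worked); the formulas are the standard ones, but the company names and figures are invented for illustration:

```python
# Standardised OHS frequency-rate KPIs: the same formula applied to every
# company makes the numbers directly comparable. Rates are per 1,000,000
# hours worked; all company figures below are hypothetical.

def ltifr(lost_time_injuries: int, hours_worked: float) -> float:
    """Lost Time Injury Frequency Rate per million hours worked."""
    return lost_time_injuries * 1_000_000 / hours_worked

def trifr(recordable_injuries: int, hours_worked: float) -> float:
    """Total Recordable Injury Frequency Rate per million hours worked."""
    return recordable_injuries * 1_000_000 / hours_worked

# Invented annual figures for two companies reporting the same KPI set.
companies = {
    "Company A": {"lost_time": 3, "recordable": 12, "hours": 2_400_000},
    "Company B": {"lost_time": 1, "recordable": 9, "hours": 1_500_000},
}

for name, d in companies.items():
    print(f"{name}: LTIFR={ltifr(d['lost_time'], d['hours']):.2f}, "
          f"TRIFR={trifr(d['recordable'], d['hours']):.2f}")
```

Because both companies report against the same denominator, a shareholder or regulator can compare them at a glance, which raw injury counts do not allow.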
Listing a company’s OHS performance indicators allows that company to express pride in its safety achievements. Those who are not at the same level are given a goal that they can strive to match or better. This would also provide some legitimacy to the currently tired and lazy nonsense of “best practice”: everyone could see local examples of it.
Once OHS KPIs are specified by the (hopefully national) OHS regulator, companies should be offered a period of, perhaps, five years to amend their performance management systems. The Government could also consider mandatory reporting of these standardised OHS KPIs. This may seem intrusive, but mandatory reporting would ensure co-operation, and this co-operation could be described as part of an economy-wide due diligence on OHS.
The recent award for Annual Reports provides optimism on OHS performance reporting, although even the annual reports analysed varied both in how the information was presented and in the level of information provided. The momentum for improvement in OHS performance measurement and reporting is clear. The last few analyses by Citi illustrate the financial imperatives; the increased attention to OHS reporting in Annual Reports indicates a gentler but still effective pressure. Both of these changes would be helped by some clear parameters advocated (mandated?) by OHS regulators.
This is a complex area but not something that will resonate with many smaller organisations, where risk is often greatest. This will remain the issue unless safety-related “PPIs” trickle down the supply chain and through the contractor network to inform those smaller organisations that sit in that space.
Interesting that the term “continuous improvement” keeps coming into the discussion. Obviously TQM principles are something that many organisations know because they speak to productivity and the bottom line. TQM meets OHS? Now that’s a thought. Hardly a new one, but a good entry point for safety performance as OHS struggles to find its rightful place in business.
Having a general consensus on useful positive performance indicators (PPIs) could be handy – not sure having mandatory ones is good. There is unlikely to be much disagreement about how good it is to have clever PPIs (I would have thought). But for mine, the PPIs need to be nicely aligned with need, if only because a PPI is gunna cost something to measure. Also I reckon it’s important for the whole enterprise to see the bonuses of a PPI (i.e. managers and staff). Going for a mandatory set may not deliver good things.
F’rinstance, what about a company that has super diligent responses to incident reports, but fails to have a database clever enough to interrogate those responses such that they can be used for a PPI? The company (including staff) makes its call on not measuring the efficacy of the incident responses, but a wee bit of analysis shows top-of-the-class responses. Should the company get “marked down” because they failed to have a proper PPI sorted? I’m inclined to think not.
Roger, losing ready access to brownie points for doing well on incident responses ain’t too clever. And yes, auditors like to have “happy numbers” at their fingertips, but surely what matters most is that the job is getting done properly? (This is all conjectural. I have had punters who are doing great with stuff like incident responses but just don’t have the resources to get a PPI monitoring system together.)
Col Finnie
finiOHS