Annual Reports are crucial corporate documents that should reflect the financial and organisational health and achievements of a company. Only recently, in Australia, have Annual Reports been assessed for indications of occupational health and safety (OHS) other than fatalities. Some of that analysis of injury data has appeared in an article in the Safety Science journal entitled “Safewash! Risk attenuation and the (Mis)reporting of corporate safety performance to investors” – an article that deserves careful consideration.
According to the article’s abstract, the authors – Sharron O’Neill, Jack Flanagan and Kevin Clarke – found
“…. firms in hazardous industries provided significantly higher rates of disclosure, in terms of both number of performance measures and frequency of reporting, than firms in less hazardous industries. Yet higher disclosure did not translate to adequate reporting on measures needed to demonstrate the prevention of serious injury. Instead, there was a strong reliance on variations of highly aggregated frequency rates often criticised in academic and practitioner safety literature.” (page 114)
Annual Reports are increasingly part of the state of knowledge of the OHS profession, OHS regulators and the community about workplace safety matters and the organisational cultures of, particularly, large listed companies.
The Corporate Social Responsibility (CSR) movement, primarily in the United States, generated many of the formats and reporting concepts in Annual Reports several decades ago and led to some of the supplementary reports mentioned below, but CSR never caught on to the same extent in Australia. CSR is discussed further in the research report.
The quote above shows that even in large corporations, high consequence injury prevention is not the focus of their official public statements. Many of the companies assessed would profess, “safety first” or “safety is our number one priority” or “zero harm” while trying to justify these clichéd slogans with unreliable “aggregated frequency rates”.
Injury data is increasingly accepted, including by these researchers, as an ineffective measure of safety performance, yet the content of Annual Reports continues this focus.
The research suggests confusion in the corporations’ OHS reporting:
“… evidence appeared to demonstrate strategic efforts to reduce the visibility of high-consequence safety system failures over time, thereby attenuating investor perceptions of occupational safety risk.” (page 114)
Although Annual Reports are expected to include far more OHS information than in previous decades, it seems that companies continue to treat these reports as financial marketing. It is assumed that readers are only looking for information on a company’s financial performance, but this is not the case. Those investors who assess performance on a much broader definition are, according to the quote above, being provided with a skewed “perception of occupational safety risk.”
In a July 2015 SafetyAtWorkBlog article about Citi’s Safety Spotlight report, I wrote:
“It is useful to note that 94 of the 127 companies …. reported LTIFR and/or TRIFR as indicators of safety performance. Citi acknowledges that TRIFR “may be a better measure of harm.”
The need for a consistent metric is shown in the report’s footnotes by the plethora of frequency rates. A sample includes
- All Injury Frequency Rate
- Recordable Case Rate
- Recordable Injury Frequency Rate
- Recordable Case Injury Frequency Rate
- Lost Workday Case Frequency Rate
Some of the titles may be exactly the same metric but even when looking at Annual Reports it is difficult to tell. Frequency rates are, largely, a lag indicator but even consistency in these measures would greatly help the comparison of OHS performance.”
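For all the variation in names, the rates listed above share the same basic arithmetic: a count of qualifying incidents normalised against hours worked. A minimal sketch (the figures are hypothetical, and the choice of normaliser is an assumption; one million hours is common in Australia, while US OSHA-style rates use 200,000 hours):

```python
def frequency_rate(incident_count: int, hours_worked: float,
                   normaliser: float = 1_000_000) -> float:
    """Generic injury frequency rate: incidents per `normaliser` hours worked.

    LTIFR counts lost-time injuries; TRIFR counts all recordable injuries.
    The formula is identical; only the injury definition and the
    normaliser change between the variants listed above.
    """
    return incident_count / hours_worked * normaliser

# Illustrative (hypothetical) figures for one reporting year
hours = 2_400_000          # total hours worked across the workforce
lost_time_injuries = 3     # injuries causing at least one lost workday
recordable_injuries = 12   # all recordable (medically treated) injuries

ltifr = frequency_rate(lost_time_injuries, hours)   # 1.25
trifr = frequency_rate(recordable_injuries, hours)  # 5.0
print(f"LTIFR: {ltifr:.2f}  TRIFR: {trifr:.2f}")
```

The sketch makes the comparability problem concrete: two companies reporting a “Recordable Injury Frequency Rate” of 5.0 may be using different injury definitions or different normalisers, and the headline number gives no way to tell.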
O’Neill, Flanagan & Clarke report on the shareholder criticisms of BP following the Deepwater Horizon disaster and quote a report from the investor coalition:
“The lack of key performance indicators and benchmarks to measure progress towards addressing risk make it difficult to judge the effectiveness of the steps the company has taken to date, or to gauge future preventative measures. . . The board must create and implement long term initiatives that are robust and substantive, with regular and consistent public reporting that enables shareholders to benchmark progress and assess performance.” [CBIS, 2011, pp. 2, 7, emphasis added] (page 115)
The researchers also report criticism of the practice of removing OHS performance data from the Annual Report into supplementary reports dealing with “sustainability” or other concerns that a business may see as secondary to the financials.
The research report offers some structure to those safety professionals in charge of corporate OHS performance measures. The researchers focus on three elements:
- Occurrence measures,
- Productivity measures, and
- Severity measures.
(The last of these supports the undervalued Class I, II & III categories designed by Geoff McDonald.)
The researchers say that
“Together, occurrence, productivity and severity measures offer a complementary suite of injury performance information for investors. Just as productivity measures alone give little insight into either the prevalence or severity of work-related injury, similarly, a single severity rate provides no insight into either the number of injury occurrences or, critically, instances of damaging and costly Class 1 injury.” (pages 116-117)
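The “smoothing” effect the researchers describe can be illustrated with hypothetical numbers: two reporting years can produce an identical aggregated frequency rate while hiding very different severity profiles. A sketch, assuming made-up injury counts and identical hours worked in both years (the class labels follow McDonald’s severity categories, not the researchers’ data):

```python
# Hypothetical injury records for two reporting years.
# Class 1 denotes permanently-damaging injuries.
year_a = {"class_1": 1, "class_2": 4, "class_3": 15}  # one permanent-damage injury
year_b = {"class_1": 0, "class_2": 5, "class_3": 15}  # none

hours_worked = 4_000_000  # assumed identical exposure in both years

def trifr(injuries: dict, hours: float) -> float:
    """Aggregated recordable injury frequency rate (per million hours)."""
    return sum(injuries.values()) / hours * 1_000_000

# The single aggregated rate is identical in both years (5.0),
# so the aggregate alone cannot distinguish them.
print(trifr(year_a, hours_worked))  # 5.0
print(trifr(year_b, hours_worked))  # 5.0

# A severity measure reported alongside the occurrence rate
# exposes what the aggregate hides: a Class 1 injury in year A.
print(year_a["class_1"], year_b["class_1"])  # 1 vs 0
```

This is the researchers’ point in miniature: an occurrence or productivity measure on its own, or a single severity rate on its own, leaves the Class 1 picture invisible; only the suite together is informative.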
OHS professionals could benefit from understanding the “readership” of their workforce, and how workers read communications in newsletters, pre-start meetings and toolbox talks, as well as the changing and broadening readership of Annual Reports. Such analysis is likely to provide more “valid” safety performance data and to increase workers’ ownership of safety.
It is hoped that the productivity perspective in the research report was considered by Australia’s Productivity Commission in its recently completed deliberations on workplace relations.
The practical application of the research findings included three main points:
“…the quality of non-financial performance information was generally poor as evidenced by inconsistent terminology, a lack of stability, reliability and completeness of reported performance data and a lack of consistency and comparability in both the choice of reported measures and their underpinning measurement methodologies…” (page 125)
“…contemporary injury accounting methods appear to aggregate, rather than distinguish between, high severity (high consequence) and low severity (low consequence) incidents, effectively ‘smoothing’ reported variations in injury performance and allowing managers to attenuate stakeholder perceptions of OHS system/control failure and OHS risk.” (page 125)
“…the research provides evidence supporting the potential for mandatory reporting to significantly improve the quality of corporate OHS disclosures to external stakeholders.” (page 125)
The last point will be a red flag to many corporate executives: “Mandatory reporting? WTF!” But most OHS regulatory requirements have been generated as a result of inactivity or procrastination by companies and business operators on safety matters. Establishing nationwide uniformity of OHS performance measures, an issue on the table for several decades, would have prevented the suggestion of mandatory reporting from ever arising. Perhaps this research will provide the spark for a renewed industry focus on clear, effective and consistent performance indicators.
This research report deserves careful consideration and broad discussion, particularly while it is available on-line for free. It echoes many of the performance and reporting issues that this blog has reported on over the last couple of years but with much more depth and authority.