Monitoring Asset Management Strategy Execution with KPIs

April 21, 2015
Using the ISO 55000 standard as a reference, this article outlines how KPIs can be effectively deployed so that the asset management organization will not simply be efficient, but also effective for the greater organization.

Key performance indicators (KPIs) are recognized by most industry practitioners as the motivator of an efficient, data-driven organization. Rightly so; however, KPIs are often developed arbitrarily, without considering their effect on the greater organization or their alignment with the organization’s strategy and objectives.

For the indicators to be truly valuable, they must be meaningful and applicable as well as tied to organizational objectives and ultimately an organizational strategy. Michael Porter, renowned strategy professor at the Harvard Business School, wrote that operational effectiveness is not a strategy[i]. What he means is that high performing organizations, those that deliver best-in-class KPIs, often fall short because that performance is not tied to an effective organizational or business strategy. His point is that efficiency for its own sake, or efficiency without strategic effectiveness, is not a sustainable effort.

So what does this have to do with asset management? In January 2014, ISO 55000[ii] was introduced as the defining international standard that provides a framework for asset management systems. The core of the standard is to require asset management organizations to align activities to objectives and objectives to strategies and measure performance accordingly. It seems the writers of the standard understood Porter’s sentiments.

This article outlines these relationships and explains how KPIs can be effectively deployed so that the asset management organization will not simply be efficient, but also effective for the greater organization.

The Framework

Terms like strategy, objective, plan, and policy are at risk of being ambiguously applied depending on the organization and the situation. Sometimes, even within a single organization, one group’s strategy is another’s objective. So before examining these relationships, we must first establish a consistent vocabulary. Conveniently, from an asset management perspective, ISO 55000 provides a framework for this.

The figure below represents the elements outlined in the ISO standard. To be clear, the organizational objectives are those of the company as a whole. These are the objectives of the business. The strategic asset management plan (SAMP) is a documented plan that “specifies how organizational objectives are converted to Asset Management objectives.” This is the action plan of how the asset management function will support the company’s goals and objectives. Feeding the SAMP are the asset management objectives, asset management plans, asset management policy, and performance evaluation.

Asset management objectives are simply the objectives of asset management as required to support the SAMP. The asset management plans are the action plans to meet the objectives, and the asset management policy is closely related to an organizational mission. Per ISO, it is the “intentions and directions” of the asset management function to meet its objectives. Performance evaluation is the act of measuring, monitoring, and analyzing the performance of the asset, the asset management function, and the asset management system. This covers not only the technical performance of the physical asset portfolio, but also the performance of the processes within the asset management system.

The asset management system is outlined as all of the business processes and interactions that take place within the organization related to asset management. This includes both internal asset management processes as well as the relationships to other business functions within the company.


Clearly, within the ISO standard there is a well thought-out framework to ensure that asset management activities are specifically aligned with company objectives. This is a real source of value from the standard as it constrains the asset management function from diverging from the greater organization. This framework ensures that the performance being indicated, measured, and analyzed is that which is required to meet not only the asset management objectives, but also the objectives of the company.

Key Performance Indicators

Now that the role of Key Performance Indicators (KPIs) has been established within the context of ISO 55000 and asset management functions, it is important to more comprehensively discuss their tactical design and application.

KPIs can be defined as “a set of measures that help managers evaluate a company’s…performance and help spot the need for change…”[iii] These measures act as the manager’s insight to the organization’s progress towards meeting its goals set out by the strategy and objectives.

For a KPI to be meaningful, it should contain, at a minimum, these four components: objective, source, performance criteria, and action plan.

Objective: Derived from the SAMP and ultimately the organizational objectives, this is the objective or result that is desired. The KPI must clearly measure performance as it relates to meeting or achieving the objective. A best practice is to explicitly define this relationship when reporting the results of the KPI because it will ensure that all stakeholders understand the significance of the metric.

It is also important to note that the relationship between objectives and KPIs is not necessarily 1:1. For example, if an asset management objective is to meet a certain cost per unit threshold, there may be many KPIs that monitor performance such as overall maintenance spend, contractor spend, first pass quality, and OEE.

ISO 55001 Section 9.1 provides further guidance. It states that organizations “shall evaluate and report on asset performance, asset management performance including financial and non-financial…, and the effectiveness of the asset management system.” In other words, organizations must measure objectives related not only to technical performance of physical equipment but also the performance of the supporting activities and processes necessary to meet the requirements of the SAMP and ultimately deliver value to the organizational or company objectives.

Performance Criteria: The KPI should have clear performance criteria, or definitions. Stakeholders must understand what constitutes success and more importantly at what point a deviation requires action. Again, this must be specifically tied to the objective the metric is evaluating. The goal is to meet the strategic objective and deliver the results, not achieve arbitrary “world class” or benchmarked results. The upper limit of the performance criteria should be realistic, but also aggressive enough to ensure the objectives will in fact be met if the metric remains satisfactory. The lower limit should provide enough opportunity for managers to implement actions to recover performance without falling short of the overall objective. If the lower limit is set too low, by the time action can be taken it may be too late to adequately recover.

ISO 55001 Section 9.1 states that “the organization shall ensure that its monitoring and measurement enables it to meet the requirements.” In other words, performance levels for KPIs exist to ensure that strategic objectives are met.

Source: KPIs are only as robust as the data and information from which they are created. Each KPI should have a clear data integrity review and a plan to ensure it is based on accurate information. Additionally, each KPI should have a clear owner with responsibility to update, monitor, and document it.

Section 9.1 states that “the organization shall retain appropriate documented information as evidence of the results of monitoring, measurement, analysis and evaluation.”

Reporting, Actions, and Analysis: The metric must be visible to all of the necessary stakeholders in order for actions and analysis to take place. Results should therefore be reported and discussed with all of the necessary stakeholders, both those directly and indirectly involved in the management of the metric. However, simply monitoring the KPI provides no actual value to the organization. It is the actions taken to maintain compliance and correct deviations that ensure the objectives are met. There should be a specific plan of action with clear ownership for each deviation or variance in the metric. Further analysis should also be undertaken to ensure the root cause of each deviation is understood.

ISO 55001 section 10.1 provides specific guidance on how organizations should manage non-conformities and continuous improvement as it relates to asset management performance.
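The four components above can be captured in a simple record. The sketch below is illustrative only: the field names, threshold values, and classification logic are assumptions for the sake of the example, not prescriptions from ISO 55000.

```python
from dataclasses import dataclass

@dataclass
class KPI:
    """Minimal KPI record covering the four components discussed above."""
    name: str
    objective: str      # the SAMP-derived objective this KPI supports
    source: str         # data source and owner responsible for integrity
    lower_limit: float  # point below which the objective is in jeopardy
    upper_limit: float  # target that, if maintained, meets the objective
    action_plan: str    # owner and plan for responding to deviations

    def evaluate(self, value: float) -> str:
        """Classify a measured value against the performance criteria."""
        if value >= self.upper_limit:
            return "on target"
        if value >= self.lower_limit:
            return "action required"  # recoverable deviation
        return "objective at risk"

# Example: an equipment uptime KPI tied to a production objective
uptime_kpi = KPI(
    name="Equipment uptime",
    objective="Meet monthly production volume",
    source="CMMS downtime log (owner: reliability engineer)",
    lower_limit=0.90,
    upper_limit=0.95,
    action_plan="Reliability engineer initiates RCA on any deviation",
)
print(uptime_kpi.evaluate(0.93))  # -> action required
```

Note how the two limits encode the guidance above: the upper limit is the level that ensures the objective is met, while the lower limit leaves managers room to recover performance before the objective itself is compromised.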

Leading vs. Lagging Performance Indicators

When implementing KPIs it is critical to understand the difference between leading and lagging indicators, and to employ a combination of the two to most effectively monitor the status of the intended results.

Leading Indicator: A leading indicator provides insight into the status of an objective or outcome prior to a change or disruption to that outcome. It provides advanced insight and is a predictor of results.

Lagging Indicator: A lagging indicator provides feedback on an objective only after the objective is either missed or met. It allows for analysis to improve performance and identify root causes of deviations but rarely will it provide opportunity to make adjustments in time to maintain desired outcomes.

An objective to produce a certain number of units per month illustrates the application of leading and lagging indicators. Suppose that, in order to produce the desired number of units, the asset must be online for 95% of the production period. Equipment uptime with a threshold of 95% is an obvious KPI for managing and tracking performance against the plan. However, this metric only provides insight to the manager after the required 95% uptime is missed and production is lost. It is therefore a lagging indicator.

Suppose the plant reliability engineer is responsible for ensuring and improving equipment uptime. The engineer may apply tools such as root cause analysis to identify corrective actions to minimize equipment defects. As improvements are made to the asset and additional failures are avoided, uptime and ultimately production output can be sustained. The manager can monitor corrective actions implemented as a predictor of future uptime and take action before the objective is compromised. Therefore it is considered a leading indicator.

Note the importance of monitoring both leading and lagging indicators. The leading indicator in the example is in reality only assumed to be a predictor of future performance. Perhaps, based on previous experience, there was a correlation between corrective actions and future performance. Most of the time it will be an accurate predictor, but can it be guaranteed that the appropriate actions were identified or that the associated analysis was flawless? No, so it is necessary to monitor the lagging performance of equipment uptime as well. The lagging indicator provides the most visibility into the status of the desired outcome. Employing both leading and lagging indicators enables a manager to confirm that the right activities are in place to meet the objective, and to become aware when the objective is in danger of being missed.
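Continuing the uptime example, the two indicators can be monitored side by side. The figures and the leading-indicator target below are invented for illustration; only the 95% uptime threshold comes from the example above.

```python
# Lagging indicator: uptime over a completed production period
scheduled_hours = 720.0   # e.g., a 30-day month of planned operation
downtime_hours = 43.2     # equipment downtime logged in the CMMS
uptime = (scheduled_hours - downtime_hours) / scheduled_hours
uptime_ok = uptime >= 0.95  # the 95% threshold from the objective

# Leading indicator: corrective actions completed vs. planned this period
actions_planned = 12
actions_completed = 10
action_rate = actions_completed / actions_planned
leading_ok = action_rate >= 0.80  # assumed target, not from the standard

print(f"Uptime: {uptime:.1%} ({'met' if uptime_ok else 'missed'})")
print(f"Corrective-action completion: {action_rate:.0%}")
```

Here the leading indicator looks healthy while the lagging indicator shows the 95% objective was missed, which is exactly the situation described above: the predictor alone cannot guarantee the outcome, so both must be watched.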

An Application

The importance of effectively aligning performance metrics to organizational and asset management strategic objectives is best illustrated with an example.

In this scenario, the asset manager is interested in benchmarking best-practice metrics for his maintenance organization. He has read books and attended a conference to investigate metrics and KPIs used in other organizations and industries. Finally, he decides he would like to maximize daily maintenance schedule compliance to ensure the planning group is effective, track mean time between maintenance (MTBM) to monitor the performance of the preventive maintenance program, and reduce contractor costs to lower overall labor costs per unit, all well-established and widely used metrics. He sets aggressive, “world class” targets and implements a scorecard.

The first few months go by and he reviews the metrics with the plant manager. What happened? All are in the red. Not only that, during that time his new focus on daily schedule compliance resulted in conflicts with the operations manager over production scheduling. Additionally, he had to bring in contractors to help with opportunity work during operating schedule deviations which increased his labor cost per unit. Finally, his mean time between maintenance was greatly reduced, also as a result of the schedule deviations and opportunity maintenance.

How could the attempt to achieve “best practice” outcomes monitored by widely accepted metrics result in such conflict within this organization? In short, the outcomes desired by this manager were generic and not aligned to the strategic objectives of his company. High daily schedule compliance, high MTBM, and low contractor costs are all characteristics of high volume, low cost, and minimally diversified operations. However, organizational objectives at his company were low lead time, highly flexible, and highly diversified operations. In order to meet these objectives, the production schedule was very flexible and the operation was required to react to orders quickly to meet low lead time requirements.

The metrics put in place resulted in asset management activities and objectives opposed to the organizational objectives. To support the desired outcomes of the company, the asset manager should have focused on flexibility and metrics such as mean time to repair (MTTR), and moved schedule compliance to a weekly review rather than a daily one. In this case contractor costs may be an appropriate metric to monitor and manage; however, the labor flexibility required to support the operation may not allow thresholds to be “best in class.”

Although this is a very simple example and in real life would be minimally consequential, it illustrates the importance of the interactions and alignment as prescribed in ISO 55000. Asset management functions are in place to add value to the organization, as defined by that organization, not to be efficient for their own sake.

Will McNett is a Senior Reliability Engineering Subject Matter Expert at Life Cycle Engineering (LCE). He is an accomplished maintenance and reliability professional with extensive experience in energy, manufacturing and mining. You can reach Will at [email protected].

[i] Porter, Michael. “What Is Strategy?” Harvard Business Review, November-December 1996.

[ii] ISO 55000:2014; ISO 55001:2014; ISO 55002:2014.

[iii] Chase, Jacobs, and Aquilano. Operations Management for Competitive Advantage. McGraw-Hill, 2004.
