Using Metrology Fundamentals in Calibration to Drive Long-Term Value

Posted by Chuck Boyd on May 30, 2018 12:28:26 PM


 

This article discusses some critical items to address in a calibration program grounded in sound metrology fundamentals, without requiring a complete overhaul of the program. Properly calibrated process control instrumentation provides high-quality process control, keeps the process operating to design specifications, and prevents the process from being stressed as it compensates for inaccurate measurement data feeding the DCS. These benefits may be challenging to quantify and attribute to any one of the suggested changes, but conversely, implementing the changes should not place an extraordinary burden on resources.

Introduction

The science of metrology is seemingly calm on the surface but has extensive depth of very technical concepts. Metrology is the science of measurement and incorporates aspects of many diverse fields such as mathematics, statistics, physics, quality, chemistry and computer science, all applied with a little common sense. Because metrology work is interspersed with other job duties, many rely on knowledge of metrology, but only a small percentage understand the science intimately. Maintenance stakeholders in a power plant typically have diverse educational backgrounds, and most, if not all, of their metrology knowledge is learned on the job.

Many calibration programs are based on the minimal definition of calibration: comparing an instrument’s measurement to a known standard, followed by documentation of the results. With the lean staffing levels typical of power plant maintenance groups today, it’s natural for these programs to evolve out of expediency. This expediency, and the lack of defined metrologist roles on these staffs, inhibits the development of a program that includes metrology fundamentals above and beyond what is needed to get the job done.

The typical Electrical & Instrumentation (E&I) Manager has responsibility over a wide range of electrical equipment and instrumentation supporting plant operations. The Manager’s purview includes management of E&I department staff, safety and environmental leadership, preventive maintenance, predictive maintenance and critical repair activities, and working with internal and external groups on special projects, among many other areas of accountability. While instrument calibration is a critical activity, completing the task requires a relatively small share of that attention. Some instrument calibration requirements are critical to maintaining compliance with environmental regulations defined by the Environmental Protection Agency, such as the Mercury and Air Toxics Standards (the MATS rule) and regulation of greenhouse gases (GHGs) under the Clean Air Act; with the employer’s responsibility to protect workers from hazardous conditions defined by the Occupational Safety and Health Administration; and with reliability standards defined by the North American Electric Reliability Corporation. Nuclear-fueled generators must, of course, comply with additional regulatory requirements, but outside of these requirements, natural gas and coal-fueled generators are self-governed with regard to the balance of their instrumentation.

At a minimum, a competent calibration program ensures that instruments are calibrated on a schedule, that the results are documented for audit purposes, and that measurements are traceable. A competent calibration program helps maintain safety and production, but calibration is also a matter of profitability. Instruments that measure more accurately can improve safety, allow increased energy production, and reduce stress on equipment. Unfortunately, the nature of these benefits presents a major challenge: unlike a metric such as labor savings, they are extremely difficult to quantify when justifying the cost of change.

When instruments are calibrated with a traceable standard and the results are documented, many consider this adequate and see no need for change. This position is bolstered by the very nature of how maintenance is scheduled. Months of planning go into an outage, and when the time to execute arrives, with tight resources and tight schedules, the work must be accomplished as expeditiously as possible. All unnecessary steps must be eliminated so as not to jeopardize the outage schedule. Adding steps to the process therefore seems counter-intuitive, even when those steps are necessary to improve the process. E&I leadership must have the foresight to implement strategic change in order to realize the benefits of improvement.

Implementing metrology-based principles does not have to be a dramatic change. A substantial positive impact to quality can be realized by making some adjustments and tweaking the existing calibration program. These changes are easy to implement and simultaneously will reinforce a culture change focusing more on metrology aspects of calibration. Metrology as a science has an immense number of elements to consider, but initially focusing on the following areas will provide huge strides in building and maintaining a calibration program that provides confidence in measurement accuracy for process control instrumentation:

  • Measurement tolerance and pass/fail determination
  • Test strategy including hysteresis
  • Maintaining acceptable Test Uncertainty Ratios
  • Securing information assets

 

Measurement tolerance and pass/fail determination

The calibration tolerance assigned to each instrument is the defining value used to determine how much measurement error is acceptable. The tolerance should rely heavily on the requirements of the process, not on what the instrument is capable of performing. Ideally, the tolerance is a parameter set during process development, where the effect of variation is measured. Unfortunately, there is no hard-and-fast formula for developing a tolerance value; it should be based on some combination of the process requirement, the manufacturer’s stated accuracy of the instrument, the criticality of the instrument and its intended use. Care should be taken not to set the tolerance too tight, as doing so forces the measurement to be unnecessarily accurate.

Tolerance can be stated as a unit of measure, a percentage of span, or a percentage of reading. During calibration, it is critical to mathematically calculate the error value in order to determine pass/fail status. This calculation is an additional step in the process, and particularly with tolerances defined as a percentage of span or reading, the mathematics invites opportunities for errors. As calibration programs evolve, this aspect of the process can get relegated to the calibration technician’s discretion. There have been occasions where the pass/fail decision was based on technician experience, gut feel, or another technician’s opinion. This practice provides questionable results: although the resulting calibration certificate may show the measurement as within tolerance, that result was never mathematically confirmed, yet the instrument is recorded as within tolerance in perpetuity. More importantly, plant operators could be making decisions based on wrong data. This method of determining pass/fail should be disallowed, and the prohibition enforced either procedurally, by requiring recording of the error limits as well as the calculated error, or programmatically, by entering the inputs into a computer-based system that indicates pass/fail automatically.
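As a sketch of the programmatic approach, the pass/fail decision for a tolerance stated as a percentage of span reduces to a simple, repeatable calculation. The function name and the instrument values below are hypothetical illustrations, not a Beamex API:

```python
def pass_fail(applied, reading, span, tolerance_pct_of_span):
    """Compute measurement error as a percentage of span and
    compare it against the assigned tolerance."""
    error_pct = abs(reading - applied) / span * 100.0
    return error_pct, error_pct <= tolerance_pct_of_span

# Hypothetical example: 0-200 kPa transmitter, tolerance 0.5 % of span.
# Applied input is 100.0 kPa; the instrument reads 100.8 kPa.
error, ok = pass_fail(applied=100.0, reading=100.8, span=200.0,
                      tolerance_pct_of_span=0.5)
# error = 0.4 % of span, so this test point passes (0.4 <= 0.5)
```

Recording both the calculated error and the limit it was compared against gives the auditable, mathematically confirmed result the paragraph above calls for, rather than a judgment call.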

 

Figure: Pass/fail determination.

 

Test strategy including hysteresis

Hysteresis errors occur when the instrument responds differently to an increasing input than to a decreasing input, and they are almost always caused by mechanical friction on some moving element (see Figure 1). These errors rarely can be rectified simply by making calibration adjustments; they typically require replacement of the instrument or correction of the mechanical element causing the friction. Hysteresis is a critical error because of the probability that the instrument is failing.

 

Figure 1. Hysteresis error.

 

Most calibration test strategies include a test point at zero (0%), a test point at span (100%), and typically at least one more test point at mid-range (50%). This 3-point test can be considered a good balance between efficiency and practicality during an outage. However, the only way to detect hysteresis is to use a testing strategy that includes test points going up the span and test points coming back down the span. Critical in this approach is that the technician not overshoot a test point and reverse the source signal, approaching the test point from the wrong direction. Technicians should be instructed to return to the previous test point and approach the target point again from the proper direction.
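An up/down test sequence can be generated mechanically so that neither the technician nor the calibration software has to improvise the point order, and the rising and falling readings at each shared point can then be compared directly. This is an illustrative sketch with an assumed five-point spacing, not a prescribed procedure:

```python
def up_down_points(points_pct=(0, 25, 50, 75, 100)):
    """Points up the span, then back down: 0, 25, 50, 75, 100, 75, 50, 25, 0."""
    up = list(points_pct)
    return up + up[-2::-1]

def hysteresis_pct_of_span(up_readings, down_readings, span):
    """Hysteresis at each shared test point: difference between the
    rising and falling readings, expressed as a percentage of span."""
    return [abs(down - up) / span * 100.0
            for up, down in zip(up_readings, down_readings)]

# Hypothetical 0-100 kPa transmitter, readings at the 25/50/75 % points
# taken while increasing, then while decreasing, the input:
errors = hysteresis_pct_of_span([25.1, 50.2, 75.1], [25.7, 50.9, 75.6], 100.0)
# largest hysteresis here is about 0.7 % of span, at the 50 % point
```

A 3-point up-only test would never surface these differences; only the up/down sequence makes the rising-versus-falling comparison possible.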

 

Maintaining acceptable Test Uncertainty Ratios

Measurement uncertainty is an estimate of the error associated with a measurement; in general, the smaller the uncertainty, the higher the accuracy of the measurement. The uncertainty of the measurement standard (i.e., the calibrator) is the primary factor considered, along with potential errors introduced in the calibration process, in estimating the calibration uncertainty, typically stated at a 95% confidence level (k=2). The comparison between the accuracy of the instrument under test and the estimated calibration uncertainty is known as the Test Uncertainty Ratio (TUR).

Confidence that an instrument measurement is within tolerance requires knowing the uncertainty of the standard used to calibrate it. Once the tolerance is defined, a good rule of thumb is that the uncertainty of the measurement standard should not exceed 25% of the acceptable tolerance. This 25% equates to a TUR of 4:1: the standard used is four times more accurate than the instrument being checked. With today's technology, a TUR of 4:1 is becoming more difficult to achieve, so accepting the risk of a lower TUR of 3:1 or 2:1 may have to be considered.
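The TUR check itself is simple arithmetic once the instrument tolerance and the calibration uncertainty are expressed in the same units. A minimal sketch with hypothetical numbers:

```python
def tur(instrument_tolerance, calibration_uncertainty):
    """Test Uncertainty Ratio: instrument tolerance divided by the
    expanded calibration uncertainty (same units, typically k=2)."""
    return instrument_tolerance / calibration_uncertainty

# Hypothetical temperature loop: tolerance +/-0.5 degC, calibrator
# uncertainty +/-0.1 degC (k=2) -> TUR of 5:1, better than the 4:1 rule.
ratio = tur(0.5, 0.1)
meets_rule_of_thumb = ratio >= 4.0
```

Running this check for each standard-and-tolerance pairing is exactly the fit-for-purpose exercise discussed below: if the ratio falls under 4:1, the shortfall is at least known and the risk can be accepted deliberately rather than by default.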

Another challenge in many plants is legacy measurement standards. These standards had acceptable measurement uncertainty compared to the process control instruments of their day, but yield very low ratios today, having survived multiple automation system upgrades over the years without replacement. Automation suppliers continue to evolve their technology toward ever more accurate measurement capability, to the point where some plants may struggle to achieve even a 1:1 TUR. Whether the standards used in the calibration program are fit for purpose should be determined by confirming each unit’s uncertainty, defining the tolerances, and using the two values to mathematically calculate the TUR. This exercise provides confidence that the standard being used is sufficient for the measurement being made.

 

Figure: Test Uncertainty Ratio (TUR).

 

Securing information assets

Information is one of a company’s most valuable assets, but less so if activity is not fully documented, organized efficiently, and easily accessed by those who need the information. Falling short on these characteristics costs the business in the time and resources required to gather information, and in the errors and mistakes made due to inaccurate, incomplete, or outdated data. These costs are magnified for calibration data, where defined metrology-based parameters directly impact the quality of process control. For a sustainable calibration program, there must be a point of reference to serve as a definition or benchmark for applying metrology principles.

Harnessing this type of information should be a top priority, as metrology data clearly provides a competitive advantage in higher-quality calibration work and more efficient execution. An exacerbating circumstance is the loss of the personnel who develop and manage the calibration program and hold its knowledge base, whether they leave the company or make internal job changes. This includes the phenomenon of the aging workforce, with baby boomers leaving at an accelerated rate. With the exit of experienced, skilled workers, critical knowledge slowly drains from E&I groups. The industrialized world is transitioning into what is known as the knowledge economy: the concept that success is increasingly based on effective utilization of knowledge as the key resource for competitive advantage. As highly skilled workers are replaced with much less experienced ones, this captured knowledge will be critical to making the newcomers productive as quickly as possible.

 

Conclusion

Calibration and metrology carry inherent complexities. Metrology is a specialized and highly technical subject, and metrology subject matter experts make up a fraction of the overall population of maintenance personnel in the power generation industry. While the desire to maximize electrical output requires maximum availability, reliability and efficiency of the plant, the drive in the industry is to reduce costs, in part by running lean, with little chance of a dedicated metrologist role. The health and accuracy of measurement and control devices directly impacts the plant’s reliability and up-time, so the resolve to make quality improvements to the calibration program is justified.

Transforming the calibration program doesn’t have to be a resource-intensive, immense undertaking. Even in the absence of a dedicated, formally managed and resourced project, a high-performing calibration program can be implemented progressively by strategically focusing on specific weaknesses. The effort will require some dedicated time from select E&I stakeholders to see the initiative through, but weak areas of the metrology program and their corrective actions can be documented to show progress.

The specific subject areas highlighted in this paper were selected because they are often overlooked, based on Beamex experience working with various power plants. Corrective action taken on these areas will provide solid strategic improvement in measurement accuracy and enhance the plant’s ability to control its process within design limits. Failure to address these areas will continue the plant on a trajectory that will incur avoidable cost due to additional stress on the plant and lost revenue due to substandard heat rate.

 

Download this article as a PDF file:

Metrology fundamentals in calibration to drive long-term value - Beamex White Paper 

 


 


 

 

Topics: Calibration process


Written by Chuck Boyd

Chuck Boyd, Business Development Director, has been in software development, system implementation and business development roles for the past 30+ years. He has been in Business Development and Client Advocacy roles for the past 20 years, the most recent 10 years with Beamex managing key global and strategic accounts, market development in various regions of the U.S. and currently strategic market development within the Power and Energy sector. Chuck holds a Bachelor of Science degree in Computer Information Systems from DeVry Institute of Technology and previously spent over 10 years as a Business System Analyst on software development projects. In this role he functioned as a communication conduit between business and IT, working with key executives and users to translate business

