Beamex Blog

Beamex blog provides insightful information for calibration professionals and technical engineers, as well as for potential and existing Beamex users. The blog posts are written by Beamex’s own calibration and industry experts or by guest writers invited by Beamex.


Calibration Process Savings Calculator [Online calculator]

Posted by Heikki Laurila on Aug 23, 2024


Are you ready to uncover potential savings within your calibration processes?

In this blog, we’re introducing our new Calibration Process Savings Calculator - a powerful online tool designed to help you estimate how much you could save by upgrading to a modern, digitalized calibration ecosystem.

With decades of experience working with customers globally, we've honed our expertise to identify areas where significant savings can be achieved. A highly effective calibration ecosystem not only saves you time but also reduces operational (OPEX) costs. 

Once you complete the calculator, you'll receive estimated monetary savings for both 1-year and 5-year periods.

Naturally, a modern calibration ecosystem offers numerous other benefits beyond just time and money savings.

Access the Calibration Process Savings Calculator here >>

 

How does the calculator work?

The calculator walks through the whole calibration process with questions on the number of instruments and calibrations, work order generation, process instrument data management, transmitter types, calibration procedure management, scheduling of calibrations, calibration execution, documentation of calibrations, management of calibration results, and so on.

For each question, you can select from predefined answer options. Based on your inputs, the calculator estimates your potential monetary savings. In the end, these savings are summed up to give you a clear picture of your total potential savings for 1 and 5 years.
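For the technically curious, the logic behind a calculator like this can be sketched roughly as follows. The Python example below is purely illustrative: the question names, answer options, hourly rate, and per-answer savings values are invented assumptions, not the figures used by the Beamex calculator.

# Minimal, illustrative sketch of a calibration savings calculator.
# All question names, answer options, and savings values below are invented examples.

HOURLY_RATE = 60.0  # assumed fully loaded technician cost, EUR per hour

# Hours saved per calibration for each answer option (hypothetical values)
SAVINGS_PER_ANSWER = {
    "work_order_generation": {"manual": 0.25, "partly_automated": 0.10, "automated": 0.00},
    "documentation": {"pen_and_paper": 0.50, "spreadsheet": 0.30, "documenting_calibrator": 0.00},
    "result_management": {"paper_archive": 0.20, "standalone_software": 0.10, "integrated_software": 0.00},
}

def estimate_savings(answers: dict, calibrations_per_year: int) -> dict:
    """Sum the estimated hours saved per calibration and scale to 1- and 5-year totals."""
    hours_saved_per_calibration = sum(
        SAVINGS_PER_ANSWER[question][choice] for question, choice in answers.items()
    )
    yearly = hours_saved_per_calibration * calibrations_per_year * HOURLY_RATE
    return {"1_year": yearly, "5_years": 5 * yearly}

if __name__ == "__main__":
    answers = {
        "work_order_generation": "manual",
        "documentation": "pen_and_paper",
        "result_management": "paper_archive",
    }
    print(estimate_savings(answers, calibrations_per_year=2000))
    # -> {'1_year': 114000.0, '5_years': 570000.0}

The real calculator covers far more of the calibration process than these three questions, but the principle is the same: each answer maps to an estimated saving, and the savings are summed and scaled to 1-year and 5-year periods.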

Give the calculator a try!

 

Additional benefits


Upgrading to a modern calibration ecosystem offers numerous advantages beyond financial savings, including:

  • Reduces the risk of human errors related to manual work
  • Improves the quality and integrity of calibration data
  • Helps achieve regulatory compliance and audit readiness
  • Increases the safety and well-being of your employees 
  • Enhances process efficiency and reduces downtime

In many cases, these benefits can be more crucial than the savings themselves and might be the primary reason for updating your calibration ecosystem.

 

What customers say

 

 

Many companies have found a better way with Beamex – read their success stories here.

 

Contact our experts

Want to discuss your results and learn how to achieve these savings? Our calibration experts are here to help. Please contact us for more information.

Contact our experts >>

 

Calibration process savings calculator - Beamex

 

 

 

 

Topics: Calibration, Calibration process, Calibration software, Calibration management

Revolutionizing calibration services with technology - case story [webinar]

Posted by Heikki Laurila on Jul 04, 2024

In today's fast-paced world, managing calibration services efficiently is critical. Douglas Calibration Services faced a significant challenge: a mountain of paperwork that not only slowed them down but also caused immense stress among their technicians.

During a recent webinar, Douglas Calibration shared their transformative journey from a paper-based system to a fully digital, cloud-based solution with Beamex LOGiCAL Calibration Management Software.

In this blog post, we’ll delve into the highlights of that webinar and how innovative software helped them reduce stress, improve efficiency, and achieve remarkable results. Below you can find an executive summary of the webinar and a link to the full webinar recording.

Watch the full webinar recording >>

 

Webinar agenda:

  • 0:00 - Aidan Farrelly from the Beamex UK office starts the webinar, introduces the agenda and presenters, and discusses the unique challenges calibration service companies face.
  • 10:20 - Case story: Richard O’Meara from Douglas Calibration shares their success story.
  • 31:20 - Ville Lassila from Beamex Customer Success runs an online demonstration of Beamex LOGiCAL Calibration Management Software.
  • 41:30 - Antti Mäkynen, Product Manager at Beamex, discusses feedback from service companies and the roadmap ahead.
  • 57:30 - Q&A session
  • 1:21:30 - End of recording

 

The Challenge: Overwhelmed by Paperwork

Douglas Calibration Services faced significant challenges with their traditional paper-based system. With over 90 technicians and more than 130 clients, the company was drowning in paperwork. Technicians struggled with manual data entry, duplication of work, and the physical handling of documents, leading to stress, inefficiency, and errors. The primary issues included:

  • Stress and Staff Retention: Technicians were overwhelmed by the backlog of paperwork, leading to burnout and high turnover rates.
  • Efficiency and Compliance: The manual process was time-consuming and error-prone, affecting the company's compliance and efficiency.
  • Client Satisfaction: Delays in processing and delivering results frustrated clients, impacting overall satisfaction.

 

Revolutionizing calibration services with technology

 

The Solution: Transitioning to Digital with Beamex LOGiCAL

Recognizing the need for change, Douglas Calibration Services embarked on a digital transformation journey. In 2013, they developed an initial access database to manage calibrations, which laid the groundwork for more advanced solutions. By 2022, they had implemented Beamex LOGiCAL, a comprehensive software solution that revolutionized their operations:

  • Instant Data Access: LOGiCAL provided instant access to all calibration data, reducing delays and errors.
  • Seamless Collaboration: The platform facilitated better collaboration among technicians and office staff.
  • Paperless Operations: Transitioning to digital reports eliminated the need for physical paperwork.
  • Improved Efficiency: Automation of processes resulted in faster turnaround times and enhanced productivity.

 

Impact and Results

The implementation of LOGiCAL brought about significant improvements in various aspects of Douglas Calibration Services’ operations:

  • Staff Workload Reduction: The digital system reduced technicians' workload by 40%, allowing them to leave work on time without pending tasks.
  • Enhanced Compliance: Compliance improved dramatically, with error rates dropping from 3.5% to 1.2%, and the goal is to achieve less than 1%.
  • Client Satisfaction: Clients began receiving same-day results, increasing satisfaction and trust in the company’s services.
  • Operational Efficiency: Internal report reviews increased by 800%, and office staff could handle all digital calibration reports efficiently.
  • Positive Feedback: There was a significant rise in positive feedback from clients, reflecting the enhanced service quality.

 


 

Key Takeaways

The journey of Douglas Calibration Services from a paper-based system to a fully digital, efficient operation offers several key lessons:

  1. Embrace Technology: Implementing the right software can transform operations, reduce stress, and improve efficiency.
  2. Focus on Staff Well-being: Reducing workload and improving processes can significantly enhance staff retention and satisfaction.
  3. Client-Centric Approach: Faster and more accurate service delivery boosts client satisfaction and loyalty.
  4. Continuous Improvement: Regular audits and validations ensure ongoing compliance and process improvements.

 

Conclusion

Douglas Calibration Services’ successful transition to a digital system with LOGiCAL showcases the immense potential of technology in streamlining operations and enhancing service quality. By addressing their challenges head-on and adopting innovative solutions, they set a benchmark for the calibration industry. Software, indeed, became an invaluable tool in their quest for efficiency and excellence.

 

Watch full webinar recording

Thank you for taking the time to read about this journey. For a more in-depth look, watch the full webinar recording. We hope Douglas Calibration's experience inspires other businesses facing similar challenges to explore innovative solutions and improve their workflows. You can read the Douglas Calibration case story here.

Watch the full webinar recording >>

 

You could save too!

As you saw, Douglas Calibration Services has significantly benefited from adopting Beamex LOGiCAL Calibration Management Software, particularly in terms of time and money savings. The transition to a digital system has streamlined processes, reduced errors, and enhanced overall efficiency. To see how much your organization could save with similar improvements, try our Calibration Savings Calculator.

 

Take the next step

If you want to discuss how calibration technology could revolutionize your calibration processes, talk to our calibration experts. They're here to help you. Please contact us for more information.

Contact our experts >>

Learn more on Beamex LOGiCAL Calibration Management Software >>

Request a demo to experience Beamex LOGiCAL conveniently in an online meeting.

 

More webinars

View our online webinar library here >>

 

 

Topics: Calibration process, Calibration software, Calibration management, Digitalization

How to get your boss to buy you a new calibrator

Posted by Heikki Laurila on Jun 19, 2024


When you work with something, it's so much easier if you have the proper tools, right?

The same goes for calibration – if calibration is your job, you want the best tools to make your work easier and help you get more done. Modern calibrators ensure your calibrations are accurate, give you less to carry, are easy to use, offer automated functions, and so on.

However, when you ask your boss to buy you a new calibrator, you need good arguments. Often, what is important to you may not be as important to your boss. So, you need to be clever and speak “boss language”, presenting the arguments that are important to your boss!

In this blog, I look at how you can talk to your boss to convince them to get you that new, shiny calibrator. Let's dive in and unlock the secrets to getting that "yes!"

 

First, let’s look at the needs of calibration technicians. Then, I list some of the things that typically matter to bosses and managers - the decision makers. Finally, I’ll discuss how you should present your arguments to your boss to get approval for buying your new calibrator.

 

What matters to calibration technicians

Let’s briefly look at the things that normally matter the most to the people who use the calibrators. Often, they are calibration technicians or calibration engineers.

  • Less to carry – The calibrator should be multifunctional, so that you don’t need to carry several separate tools with you out in the field.
  • Easy to use – You need to do many different jobs and use many different systems and tools, so the calibrator should be quick to learn. You don’t necessarily use the calibrator every day, so it must be easy to use.
  • Accurate – Good accuracy is naturally a must-have. You can’t calibrate and adjust field instruments properly if your calibrator is not accurate enough. Field instruments are improving and getting more accurate, and so should your calibrators.
  • Automation – If your calibration tools can automate part of your work, that is a great time saver.
  • Automatic documentation – Since you need to document the calibration you do, it is great if the calibrator can do the documentation automatically so you don’t need to play with pen and paper.

 

What matters to the bosses/managers

Of course, the priorities and well-being of calibration technicians are important to any manager. Still, the things that matter most to managers are usually different from what matters to technicians.

Typically, the following things are important to managers:

  • Costs and ROI – Making sure that the operational costs do not exceed budgets. And that any new investments provide a good return on investment (ROI).
  • Productivity and efficiency – Doing more with less. There seem to be fewer resources everywhere, but you still need to get more and more done.
  • Digitalization – It is very difficult to find a plant these days that doesn’t have digitalization initiatives ongoing.
  • Reliability and downtime reduction – Making sure that processes run reliably, and any downtime is minimal.
  • Regulatory compliance – It’s important to ensure that processes are compliant with all relevant standards and regulations.
  • Safety and risk management – The safety of workers (and customers) and risk management are important.
  • Training and skills development – Employees need to be trained to make sure their skills stay up to date.
  • Data quality and integrity – The quality and integrity of calibration data needs to be ensured.
  • Sustainability and environment – Environmental considerations such as waste and energy reduction and effluent monitoring need to be taken into account in operations.

 

How to convince your boss

As you saw, there are some differences between the things that matter to technicians and the things that matter to bosses. So how do you talk to your boss to get approval to buy your new calibrator?

Obviously, you still have the reasons that matter most to you, but you need to focus on the things that are important to your boss.

Your discussion topics could include the following:

  • Productivity and efficiency – Highlight that the automated features of the new calibrators make you and your team more productive and efficient, so you get your job done faster and better.
  • Data quality and integrity – Using modern documenting calibrators will automate documentation, not only making your job more efficient, but also improving the quality of data. This is because it reduces the human errors always present in manual documentation.
  • Costs and ROI – Although new calibrators will always come with a price tag, improved efficiency ensures a good ROI and short pay-back time.
  • Digitalization – New modern documenting calibrators are an important first step towards digitalizing your calibration processes. Every boss loves digitalization! In the future, you can combine those documenting calibrators with calibration management software, and you have digitalized your calibration ecosystem and turned it paperless! Down the road, your calibration software can be connected to your CMMS system to digitalize and automate your work order delivery as well.
  • Regulatory compliance – A digitalized calibration ecosystem makes it easier to comply with quality standards and regulations. It also makes any audits so much easier.
  • Training and skills development – New calibrators with a modern user interface are easier to use and easier for new workers to learn.
  • Safety and risk management – If you need to work in hazardous areas, having intrinsically safe calibrators makes the work much safer. It also makes it more efficient, as you don’t need hot work permits or to carry gas detectors like you do with regular calibrators.

 

Summary

So, there you have it! Getting your boss to buy a new calibrator isn't about listing all the cool features you want. It's about understanding what matters to them and framing your arguments in a way that speaks to their priorities. By focusing on the right things, you'll make a strong case for why this investment makes sense for the whole team. Use these tips and you’ll be able to speak your boss's language, bringing you one step closer to working with that shiny (hopefully green) new calibrator. Good luck! Please let me know how it goes!

If you are ready for some commercial content, please read on.

 

Discover your potential savings!

Convincing your boss to invest in a new calibrator is easier when you can demonstrate significant time and cost savings. Use our Calibration Savings Calculator to see how much your organization can save. By inputting a few details about your current calibration processes, you can uncover the potential financial benefits and efficiency improvements.

Start calculating your savings now and make a compelling case for your new calibrator!

Access Calibration Process Savings Calculator >>

 

Look no further for the new calibrator!

So where do you find that dream calibrator that fulfills all the arguments listed above?

Well, I’m glad you asked! :-)

Check out the Beamex MC6 family of calibrators – a series of advanced, truly multifunctional calibrators designed to digitalize and revolutionize your calibration work!

The Beamex MC6 family includes:

  • MC6 Advanced Field Calibrator and Communicator: The all-in-one solution for versatile field calibration. It combines advanced process calibration functionality with a built-in communicator, making it ideal for on-the-go calibration tasks. Its high accuracy and robust design ensure reliable performance in various field conditions. 
  • MC6-Ex Intrinsically Safe Advanced Field Calibrator and Communicator: Safe and reliable for hazardous areas. Designed to meet stringent safety standards, it ensures precise calibration in potentially explosive environments. The MC6-Ex is indispensable for industries requiring strict safety protocols. 
  • MC6-T Multifunction Temperature Calibrator and Communicator: Specialized in precise temperature calibration. It offers unique features for accurate automated temperature measurements, making it indispensable for temperature-critical applications. Its multifunctionality and ease of use make it a valuable tool for any calibration task. 
  • MC6-WS Workshop Calibrator and Communicator: Optimized for comprehensive workshop calibration. It provides extensive calibration capabilities in a stationary setup, making it perfect for detailed and routine calibration tasks in the workshop. Its high accuracy and automated features enhance efficiency and reliability. 


Beamex MC6 family of calibrators

 

Common key features and benefits of the MC6 family

  • Multifunctionality: Calibrate pressure, temperature, electrical signals, and more.
    • Your benefit: Reduces the need for multiple devices, simplifying your toolkit and saving space. Carry less!
  • High accuracy: Ensure precise calibration with industry-leading performance.
    • Your benefit: Achieves reliable and consistent results, meeting rigorous industrial standards.
  • User-friendly interface: Navigate effortlessly with an intuitive touchscreen.
    • Your benefit: Saves time and reduces training requirements, making it easier for technicians to operate.
  • Documentation: Automatically document your calibrations.
    • Your benefit: Streamlines compliance and reporting processes, reducing manual data entry and potential errors.
  • Durable design: Built to withstand demanding environments.
    • Your benefit: Increases longevity and reliability, providing a robust solution for field and workshop use.

 

Calibration Management Software

Combine an MC6 family calibrator with our calibration management software for a fully digitalized and paperless calibration ecosystem.



Beamex calibration software

 

Ready to take the next step?

Ready to take the next step and upgrade your calibration process? Here are some ways to get started:

  • Book a meeting: Schedule a meeting with our experts to discuss your specific calibration needs and find the best solutions.
  • Request a demo: Experience the MC6 family in action by requesting a live or online demo.
  • Contact us: Reach out to our team for any inquiries or to get a personalized quote.

 

 

 

Topics: Calibration, Calibrator

Hysteresis in pressure calibration: What you need to know

Posted by Heikki Laurila on May 23, 2024


 

Pressure calibration is crucial for ensuring the accuracy and reliability of process instruments used across various industries. One often overlooked but critical factor in this calibration process is hysteresis. Understanding hysteresis and its implications can help improve the accuracy and consistency of your pressure measurements. In this blog, I’ll dive into what hysteresis is, why it matters in pressure calibration, and how you can manage it effectively.

While hysteresis can be found in various types of measurements, such as temperature and electrical signals, this blog focuses on its impact on pressure calibration, where hysteresis is most significant.

 


 

What is hysteresis?

Hysteresis is a phenomenon where the output of a system depends not only on its current input but also on its history of past inputs. In simpler terms, it means that a pressure sensor might not return to its original state after being subjected to varying pressures. This lag or difference can affect the accuracy of the measurements.

For example, if you increase the pressure to a certain value and then decrease it back to the same value, the instrument might show a different reading compared to the initial one. This difference is hysteresis.

For a practical example, if you calibrate a 100 kPa pressure instrument at a 50 kPa point, it may show 49.95 kPa with increasing pressure. With decreasing pressure, at the same 50 kPa point, it may show 50.05 kPa. This difference between 49.95 kPa and 50.05 kPa is caused by hysteresis.
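Expressed as a percentage of span, the hysteresis in this example works out as follows. This is a minimal Python sketch that assumes the instrument's range is 0 to 100 kPa:

# Hysteresis from the example above, expressed as a percentage of span.
# Assumes a 0 to 100 kPa instrument range.
span_kpa = 100.0       # instrument span
reading_up = 49.95     # reading at the 50 kPa point with increasing pressure
reading_down = 50.05   # reading at the 50 kPa point with decreasing pressure

hysteresis_kpa = abs(reading_down - reading_up)
hysteresis_pct_of_span = 100.0 * hysteresis_kpa / span_kpa

print(f"Hysteresis: {hysteresis_kpa:.2f} kPa = {hysteresis_pct_of_span:.2f} % of span")
# -> Hysteresis: 0.10 kPa = 0.10 % of span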

 

The image below shows a simplified illustration of hysteresis. Increasing and decreasing pressure do not follow the same line - there is a clear difference, which is hysteresis.


 

Hysteresis in pressure calibration

In the world of process instruments, hysteresis can have a significant impact on calibration. Pressure instruments – such as transmitters, sensors, and gauges – are expected to provide precise and repeatable readings. However, due to hysteresis, the readings can vary based on the instrument’s past pressure exposures. This can lead to errors and inconsistencies in your pressure measurements, which can be critical in processes where precision is key.

 

Causes of hysteresis in pressure instruments

Several factors can contribute to hysteresis in pressure instruments, such as:

  • Material properties: The materials used in the construction of pressure-sensing elements can cause hysteresis due to their inherent properties.
  • Design factors: The design and construction of pressure-sensing elements, including their mechanical components, can influence the level of hysteresis. Often in pressure sensors, the pressure stretches mechanical parts that can have a mechanical hysteresis, causing pressure measurement hysteresis.
  • Contamination: Dirt or other contaminants inside the instrument can cause hysteresis by obstructing the movement of mechanical parts, leading to inaccurate readings.
  • Environmental influences: Temperature changes, humidity, and other environmental conditions can affect the hysteresis behavior of pressure instruments.

 

Identifying hysteresis

To manage hysteresis effectively, it’s essential first to identify and measure it accurately. Here are some techniques:

  • Up and down calibration: Conduct calibration by increasing and decreasing the pressure to identify any differences in the readings at the same pressure points. Please note that if you don’t wait long enough for the readings to stabilize, any delay or lag in the measurement instrument can look like hysteresis.
    If you generate pressure with a hand pump, you need to be careful not to overshoot (or undershoot) when generating calibration points, or you may lose some of the hysteresis effect. For example, you need to approach the increasing points from below, and not overshoot and come back down.
  • Calibration cycles: Perform multiple calibration cycles to observe any discrepancies or repeatability issues in the readings. If there are any repeatability issues with the instrument, it may look like hysteresis. Therefore, it is good practice to perform several calibration repeats to reveal repeatability issues. Fully automated pressure calibration obviously makes it easier and saves time when performing multiple repeats.
  • Graphical analysis: Plotting the pressure input vs. output readings can help visualize hysteresis, which may be very difficult to see in numerical results. If you have a pressure calibrator that displays the calibration results in graphical format (such as a Beamex MC6 family calibrator), it is much easier to identify hysteresis.
    Sending calibration results to calibration software also helps, as the software often offers a graphical presentation of the results (at least Beamex Calibration Management Software does). A simple numerical comparison of the up and down readings can also reveal hysteresis; see the short sketch after this list.
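As a rough illustration of the up/down comparison, the following Python sketch pairs readings taken at the same nominal points on the rising and falling runs and reports the largest difference. The calibration points and readings are invented for illustration only:

# Hypothetical up/down calibration data (nominal point in kPa -> reading in kPa),
# invented purely to illustrate how hysteresis can be picked out of the results.
readings_up = {0: 0.02, 25: 24.96, 50: 49.95, 75: 74.97, 100: 99.99}    # increasing pressure
readings_down = {0: 0.05, 25: 25.04, 50: 50.05, 75: 75.03, 100: 99.99}  # decreasing pressure

# Hysteresis at each point: difference between the falling and rising readings.
hysteresis = {
    point: round(readings_down[point] - readings_up[point], 3)
    for point in readings_up
    if point in readings_down
}

worst_point = max(hysteresis, key=lambda p: abs(hysteresis[p]))
print(f"Hysteresis per point: {hysteresis}")
print(f"Largest hysteresis: {hysteresis[worst_point]:+.2f} kPa at the {worst_point} kPa point")

Remember that a real analysis should also account for repeatability, stabilization time, and the direction from which each point was approached, as discussed in the list above.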

 

Mitigating hysteresis 

While hysteresis cannot be completely eliminated, it can be managed and minimized. Here are some best practices to help you do this:

  • Regular calibration: Calibrate regularly, with up and down cycles, to identify hysteresis.
  • Instrument selection: Choose high-quality pressure instruments with low hysteresis characteristics for critical applications.
  • Consistent procedures: Follow consistent calibration procedures to ensure the repeatability and reliability of results.
  • Instrument cleanliness: Ensure that instruments are clean and free from contaminants that could affect their performance.
  • Environmental control: Whenever possible, maintain stable environmental conditions during calibration to reduce external influences. Of course, this is not always possible when calibrating instruments in field conditions.

 

Hysteresis in pressure switches

With any switch, including pressure switches, there is a hysteresis-like feature called a “deadband”. This means that the switch has been designed so that there is some difference between the opening and closing points with increasing and decreasing pressure. This may seem a lot like hysteresis, or even be called hysteresis, but it is not actual hysteresis.

This deadband is needed and important in switches; otherwise the switch could start oscillating between open and closed when the pressure hovers around a certain value. Because switches are used to control specific operations, this is undesirable, as the sketch below illustrates. You can learn more about pressure switches in this blog post: Pressure Switch Calibration.
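To see why the deadband matters, here is a minimal, hypothetical Python sketch of a switch that trips above its setpoint and only resets once the pressure has fallen past the deadband. Without the deadband, the same switch would chatter on every small fluctuation around the setpoint. The setpoint, deadband, and pressure values are invented for illustration:

# Minimal, illustrative sketch of why a pressure switch needs a deadband.
# Setpoint, deadband, and pressure values are invented for illustration only.

class PressureSwitch:
    """Trips (opens) at the setpoint and resets only below setpoint minus deadband."""

    def __init__(self, setpoint: float, deadband: float):
        self.setpoint = setpoint
        self.deadband = deadband
        self.tripped = False

    def update(self, pressure: float) -> bool:
        if not self.tripped and pressure >= self.setpoint:
            self.tripped = True    # trips with increasing pressure
        elif self.tripped and pressure <= self.setpoint - self.deadband:
            self.tripped = False   # resets only after pressure falls past the deadband
        return self.tripped

# Pressure hovering around the 50 kPa setpoint: with a 2 kPa deadband the switch
# changes state only twice instead of chattering on every fluctuation.
noisy_pressure = [49.9, 50.1, 49.9, 50.1, 49.9, 47.5]
switch = PressureSwitch(setpoint=50.0, deadband=2.0)
print([switch.update(p) for p in noisy_pressure])
# -> [False, True, True, True, True, False]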

 

Conclusion

Hysteresis is a critical factor to consider in pressure calibration, especially in the world of process instruments, where precision is paramount. By understanding what hysteresis is, identifying its causes, and implementing best practices to manage it, you can ensure more accurate and reliable pressure measurements.

 

Beamex solutions

At Beamex, we have worked with pressure calibration for 50 years, so I am confident when I say that we know something about it.

We offer tools and services that meet the highest standards in the industry, based on our long experience and strong commitment to innovation.

Learn more about our solutions related to pressure calibration:

To discuss with our calibration experts, please contact us.

 

Free Pressure Calibration eBook

Download this free 40-page pressure calibration eBook, which includes detailed strategies and resources for calibrating your pressure instrumentation.

Read more and download free pressure calibration eBook >>

 

 

 

Topics: Pressure calibration

With empathy to excellence - The secret to great customer service

Posted by Pekka Videnoja on Apr 03, 2024

The secret to great customer service

Empathy is an essential element of great customer service. Along with understanding, it’s the most powerful asset for any customer service organization. Putting time and effort into understanding your customer’s business and processes and empathizing with their problems and pain points will put you in a far better position to deliver great customer service.

In this blog post, we discuss the vital role of empathy, understanding, and a human touch in delivering exceptional customer service - before diving into more details about how Beamex Calibration Solutions Group can help you find a better way to calibrate.

 

Listen to what the customer is saying

Customer service can sometimes feel like a game of table tennis. The customer throws out questions, the customer support contact throws back answers, and on and on (and on) we go. The person in contact with the customer might have the knowledge and the skills, but are they really listening? If the support contact is just interested in shifting the ticket on to someone else or marking it as done, they’re not making the customer feel like they were listened to or understood. This game of table tennis is heading for a frustrating draw where neither side is satisfied and the problem is still there.

 

Understand and appreciate their needs

When the customer puts their trust in your solution, they need to feel appreciated when you are serving them, right from day one. Even if day one is a Friday. We’ve all been there – it’s Friday and you just want to start the weekend, and in comes a phone call or email with a complicated problem to solve. But it’s Friday for the customer too, and they’re not contacting you because they’re bored; they’re doing it because they genuinely need your help – and they have a right to be heard.

Now the real hard work begins. When you take the time to listen and properly understand what the customer’s situation is, you can work out how critical it is and weigh up the best way forward. Is an immediate solution needed? Or would it be better to take the time to go over the information internally and organize a call on Monday to walk through the problem and ask the right questions, instead of rushing in with poorly planned quick fixes now? This kind of empathy and willingness to cooperate to find a solution can reassure the customer that help is on the way.

Of course, there are no guarantees that you’ll understand everything right away, but you will certainly have a more cooperative and less angry customer on the other end of the line if you give the impression that you are making a genuine effort to understand their situation.

 

Look for a viable solution, not a quick fix

Customer-facing experts are often dealing with customers working in a busy production environment or process where things can’t simply grind to a halt. When they understand this and empathize with the customer’s problem they can provide useful answers and workable solutions. It makes no sense to look for a quick fix – a solution that might solve the problem but is completely unworkable in regular operation. There are probably many viable paths forward, and a touch of empathy and a healthy helping of technical knowledge can help to identify the optimal solution in both the short and the long term.

 

Give a human touch in the digital age 

In pre-COVID times the world was a very different place. Remote working and remote meetings were far less commonplace, certainly in our line of business. The technical meetings and workshops I was involved with were almost always face to face. Things have moved on since then, and customers are far more willing to jump online and meet with the help of digital tools. The trick is to get past the email tennis barrier once again. An email from a customer is them reaching out to us, maybe even a cry for help. In non-urgent cases, replying to acknowledge their issue and proposing an online meeting in a few days’ time gives you the chance to gather the information you need and prepare a viable solution, which can then be discussed face to face.

The word “prepare” carries a lot of weight here. Without putting the work in to prepare between acknowledging the customer’s problem and meeting them to look for a solution, the next contact you have with them could be a frustrating waste of time. Maybe they don’t have certain admin permissions they need to show you what’s going on with their devices or processes. Perhaps you need someone from IT in the meeting with you to facilitate the discussion. When you’re prepared and know what to ask for, you’re also being empathetic by showing the customer that you understand their need to feel secure, looked after, and cared for.  

 

Assumptions are not your friend

For me, two of the biggest barriers to delivering great customer service are assumption and pre-judgement. If you go into a situation with the assumption that A or B has happened on the customer side and therefore the solution is C, you’re already limiting your options and not demonstrating a willingness to understand the customer’s situation. Instead, you are looking to reinforce your own preconceptions and deliver a cookie-cutter solution.

In customer service, one size does not fit all. At Beamex, in our experience, when a customer contacts us with an issue they are often just describing a symptom of a problem rather than the problem itself. If you go on to assume things based on a narrow view of a wider problem, you're never going to get to the root cause.

With empathy and an open mind, problems can be solved faster, in a way that is more satisfying to everyone involved.

 

Don’t discount the importance of soft skills

Given what we do at Beamex – making the world a safer and less uncertain place by helping customers find a better way to calibrate – technology and technical skills are very important. But in customer service, it’s easy to forget that we are still humans dealing with other humans. Soft skills like empathy and cultural understanding are critical to delivering great customer service.

While the world and its industries are becoming increasingly digitalized, humans will always be analog. As analog beings in a digital world, empathy is a way to set ourselves apart from artificial intelligence. Great customer service is built on what I call the pyramid of strength formed by appreciation, understanding, and solution – and that’s one strength that AI can’t offer.

 

Technology alone is not enough

At Beamex we believe that technology alone does not provide a better way to execute and manage your calibrations unless it is adapted to the customer’s specific needs. Through empathy and understanding, we aim to deliver a solution that is adapted to your specific needs. Our approach is a holistic one, where we aim to be your partner for calibration excellence throughout the calibration solution lifecycle.

When we advise, it is based on an evaluation of your current calibration process to identify room for improvement. The next step is to define which calibration technology and implementation services best fit your needs and then use them to deliver a better way to calibrate. We guide you throughout the adoption process to ensure that your new calibration solution becomes an integral part of your daily operations.

Beamex technology


 

The three areas of Beamex Calibration Solutions Group

Expert services focus on understanding the customer’s problem and advising them on the best way to overcome their current challenges. The next step is working with the customer to define what their new calibration solution will look like and how to map their processes to the Beamex solution before delivering it. Delivery includes introducing the solution and performing instrument data migration. This step can be provided as a service, or the customer can perform it by themselves. A full-scale solution with integration, validation, and SOP creation services is typically required by larger customers.

Training services are available to train the customer’s technicians and engineers on how to use their tailored Beamex calibration solution and how to get the best from it throughout its lifetime. These can be delivered both remotely and on site, and are always tailored according to the solution in question.

Support services are there to make sure the customer is never on their own, with a Beamex advisor always on the end of the phone or available via email to provide helpdesk-type support. When a customer first starts using their Beamex solution, they have the extra peace of mind provided by a ‘hypercare’ period. This elevated level of support is crucial to help them get comfortable with the new solution. 

 


 

Learn more about our expert, training and support services by talking to a Beamex calibration expert

 

Beamex case stories

Many companies have found a better way with Beamex – read their success stories:

Find all case stories here.

 


 

Beamex - Your partner for calibration excellence

 

Topics: Calibration process, Calibration management

Calibrating for a Cleaner Future - Unlocking the Potential of Waste to Energy

Posted by Monica Kruger on Feb 13, 2024

Waste-to-Energy (WtE) has been around since the first waste incinerator was built in 1874, but the sustainability challenges of today – combined with innovative new technologies – are revolutionizing the industry. A World Energy Council report valued the global WtE market at 9.1 billion USD in 2016 and it is projected to increase to over 25 billion USD in 2025, driven by an increase in waste production, growing populations, and urbanization.

Our recent white paper "Waste to Watts – Unlocking the Potential of Waste to Energy" takes a deep dive into WtE and shows some real-life examples of how innovative technologies, calibration, and accurate measurements are driving the industry.

 

Waste to Watts – Unlocking the potential of WtE

In our white paper, we provide an in-depth examination of WtE and showcase examples of how new technologies, calibration and accurate measurements are propelling the industry forward. With plenty of clear infographics, facts, and figures, the white paper covers:

 

  • Waste-to-Energy: The Encyclis story – a modern WtE success story
  • A growth sector in the making – showing a snapshot of the industry
  • Waste-to-Energy around the world – how different countries are embracing WtE
  • The technology of the future – the technologies moving WtE beyond incineration
  • Challenges to growth – from public perception, to sustainability and efficiency
  • The role of calibration – the importance of accurate measurements
  • Accelerating Waste-to-Energy – and why it’s important
  • Unlocking future potential – a clear overview of the path ahead

This blog post gives you a small taste of what you can find in the white paper – download it now to read the full story. 

 

WtE has many benefits over landfill

Modern WtE plants allow hazardous organics to be safely managed within the waste stream while facilitating the recovery of both ferrous and non-ferrous metals, including valuable metals. They also make hydrochloric acid and sulfur recovery feasible – raw materials that can be used in gypsum board production. Even the ash residue from the process has many applications in the construction industry and can be used instead of concrete. All this is on top of the energy WtE plants can generate for homes and businesses.

The fact that organic pollutants are destroyed in the process and inorganic pollutants, especially heavy metals, are extracted and transformed into insoluble substances is a key advantage of WtE over landfilling. In this way the process contributes to a circular economy where waste materials are recycled, reused, or made inert to minimize their environmental impact. A common misconception of incinerators is that they are dirty and polluting – in fact, the exhaust from the stacks of a modern plant is extremely clean, often cleaner than the air surrounding the plant, as only cleaned gases and water vapor are released into the atmosphere.

 

The benefits of modern technology

Digitalization is revolutionizing the WtE industry, driving process improvements and enabling transparency and third-party oversight. “Certain players in the industry manage up to ten plants and are streamlining their calibration and maintenance programs for consistency,” shares Christophe Boubay, Sales Director and Country Manager for Beamex France. “By doing so, they can assess and replicate successful practices through cloud-based solutions from one plant to another.” This approach ensures on-site technicians and remote management have a comprehensive overview of operations and can control the plants to maximize overall efficiency, minimize waste, and ensure end products are suitable for various applications.

Calibration in Waste to Energy

 

Accurate data for continuous improvement

The WtE process is highly regulated with many rules, frameworks, and standards that operators must follow, both when it comes to the waste that is fueling the process as well as factors such as wastewater disposal and the proper handling of scrap metal and ash by-products. In addition, WtE plants must demonstrate that they are recovering waste and not just disposing of it. These regulations make precise measurements essential in order to be able to monitor and improve environmental performance. To ensure measurements are accurate, calibration is vital.

Properly calibrated tools help ensure that a WtE plant’s pressure and temperature instruments for the incinerator and boiler control process are working at high levels of accuracy, for example. This ensures optimum instrument performance, resulting in higher efficiency and reduced levels of CO2 entering the atmosphere. Compliance with emission restrictions is crucial – exceeding them can result in heavy financial penalties and even plant closure.  Modern calibration software helps manage the calibration of stack emission instrumentation, thus ensuring continuing compliance with local and national regulations. The use of digital calibration certificates also makes it easier for WtE facilities to share calibration data for emissions monitoring, auditing, and regulatory compliance purposes.

 

From waste to watts: a WtE success story

Every year Encyclis’s Rookery South Energy Recovery Facility (ERF) in Bedfordshire, England, takes 550,000 tonnes of waste that would otherwise end up in landfill and turns it into 60 MW of sustainable energy for around 112,500 homes. Thousands of tonnes of ash are also produced for the construction industry. The site has been operating since January 2022 and is one of three WtE plants operated by Encyclis, with two more under construction. Encyclis collaborates closely with waste management companies, recovering valuable resources and by-products for reuse and contributing to the circular economy by turning household and commercial waste into a valuable resource. All Encyclis plants use continuous real-time monitoring to adhere to strict Environment Agency emission limits. 

Encyclis’s Rookery South Energy Recovery Facility

Encyclis chose Beamex to provide a comprehensive calibration ecosystem for the Rookery South ERF, including CMX Calibration Management Software, the bMobile Calibration Application, MC6 Advanced Field Calibrator and Communicators, MC6-T Multifunction Temperature Calibrator and Communicators, pumps, and expert services. The company has also collaborated with Beamex to equip its WtE facility in Newhurst, UK. For both plants, Beamex has been involved from the early stages, before the commissioning phase. This approach makes it possible to apply best practices based on decades of experience and ensure that the resulting calibration solution is user-friendly and enables seamless data exchange.

A detailed calibration procedure was also integrated in the plants, specifying calibration schedules and test parameters. This information was then synchronized with handheld devices used by technicians and engineers. All workers have to do is connect to the instrument being calibrated and perform the calibration. The Beamex software does the rest, calculating the pass or fail result, updating the digital certificate, and resetting the recalibration date for future reference.

Nick Folbigg, Electrical, Control and Instrumentation Team Leader at Encyclis, likens managing an ERF to assembling a complex jigsaw puzzle: “Numerous parts need to align seamlessly for effective, efficient, and compliant operation. Using the complete Beamex calibration solution is a key piece of this puzzle.”

 

The future of WtE

Regulation has a role to play in helping the WtE industry reach its full potential and claim its place in the circular economy. The US, for example, faces significant financial hurdles in transitioning away from landfill-based waste management systems, but regulations preventing the disposal of untreated organic waste would be a practical approach to accelerating the change. After all, as Phillipp Schmidt-Pathmann of the Institute for Energy & Resource Management points out, the average person in an integrated waste management-based system usually pays less than in a landfill-based system. Governments should also introduce a market mechanism for Carbon Capture, Usage and Storage, which will encourage more investment in WtE.

WtE plants can then focus on what they do best: reclaiming precious metals, boosting revenues with local secondary raw material streams, supporting construction, reducing resource demand, supplementing grid power, and making drinking water production more sustainable for local populations. WtE’s ability to convert electricity into hydrogen should also be exploited to allow for zero-emission buses and waste transportation. Together, we can help transform waste into a more sustainable future for us all.

To read more about the Rookery South ERF and WtE, download our white paper: Waste to Watts – Unlocking the Potential of Waste to Energy. 

 

Related content

Topics: Calibration management, Digitalisation, sustainability

How an accurate, reliable calibration solution could supercharge your business [Podcast]

Posted by Monica Kruger on Jan 09, 2024

Whether you work in manufacturing, pharmaceuticals, healthcare, or any other field that relies on precise measurements, an accurate and reliable calibration solution is crucial. Accurate calibration is a critical component of quality control, compliance, safety, and cost efficiency. So, what is the best way to guarantee accurate measurements, reliable data, and traceability?

Two Beamex experts recently guested on the Process Industry Informer podcast to discuss this fascinating topic. The episode includes a real-world example of one major Beamex customer that has cut costs while boosting efficiency by adopting a centralized, standardized process for recording, storing, and analyzing calibration data.

Beamex Calibration Consultant Michael Frackowiak and Director of Sales for the UK & Ireland John Healy sat down with host Dave Howell for a fascinating talk about the key role that calibration plays in industrial processes and how the Beamex calibration ecosystem helps customers to simplify and enhance their calibration processes.

 

Listen to the podcast: 

  

 

You can also find this podcast on the Process Industry Informer website

 

Table of contents

 

  • 0:00 - 1:20: General introduction
  • 1:20 - 4:10: Mike's background and calibration expertise, and John's role and experience at Beamex
  • 4:10 - 6:50: The role and importance of calibration
  • 6:50 - 13:20: Common challenges and industry needs in calibration
  • 13:20 - 17:40: The importance of data in calibration
  • 17:40 - 28:30: Detailed insight into National Gas's (formerly National Grid) calibration journey
  • 28:30 - 35:00: The value of partnership and trust in your calibration technology provider
  • 35:00 - 38:00: Beamex's comprehensive calibration solution and educational resources

 

Getting calibration right is fundamental to a successful, safe business 

Discussing the role that calibration plays in the industry, Frackowiak highlights that getting calibration right is as important as getting your product quality and on-site safety right – and that products alone are not enough. “In the end, calibrators are just boxes; to get the most from them, customers need support, expertise, and knowledge.”

“Beamex’s purpose is to provide the customer with a better way to calibrate,” Healy explains. “We are working across many different industries, some highly regulated like pharma, where tolerances and calibration requirements are strict. Many conversations we have revolve around how to move away from error-prone manual recording of calibration data towards a more automated approach. These conversations are the starting point to find out what they need from their calibration process.”

 

Evolving regulations are an opportunity to identify areas for improvement

As standards and regulations evolve across different industries, Beamex takes the opportunity to meet face to face with customers, for example at its annual Pharmaceutical User Forum, to discuss what these changes mean in practice. “The insights we gain from these kinds of forums are invaluable in terms of learning how we can better support customers moving forward,” Healy says.

 

Making sense of the flood of calibration data

Data generation and analysis in process industries has exploded in the last decade. Operators are gathering more data than ever before about their processes as they seek improvement opportunities. How does Beamex help customers make sense of this flood of data? “Working out what to do with the massive amounts of calibration data being generated is a huge challenge for many industries,” Frackowiak points out. “As calibration experts we want to help take the load off customers’ minds. They shouldn’t need to think about calibration data. We give them the calibrators, the software, and the back end – everything they need to make sense of the data and make good decisions based on it, faster,” he continues. “The calibrators we provide take care of the accurate measurement, but where we add real value is with the ecosystem around calibration as a process, as a decision-making support tool.”

 

A centralized asset data resource for National Grid

As part of the podcast the panel discussed Beamex’s collaboration with National Grid, which uses Beamex CMX Calibration Management Software to centralize asset data in a single system. “In a nutshell, this case was about helping National Grid work out the best way to extract, interpret, and make the best use of the data they had been gathering,” Frackowiak says. “This was a really exciting journey on both sides,” Healy says. “The end goal was a centralized, standardized solution for gathering, storing, and analyzing data on asset performance at their gas compressor sites. The data islands they had made it very difficult to accurately assess asset performance, and there was no standardized calibration procedure across the sites.”

“Our solution for National Grid has three main components,” Frackowiak says. “There are the calibrators themselves, the software, and then our expertise and training to guide the customer through the implementation process. This third element is what helped us map out and design a system that would meet the customer’s needs precisely.”

“The operational team at National Grid saw the value of doing things the Beamex way – the time and hassle doing things this way would save them,” says Healy. “When management could see the cumulative impact of this across multiple sites and the huge benefits of having true visibility over their asset data, they were quickly onboard too.”

Using the Beamex system, National Grid has seen a saving of 4,000 hours per year performing calibrations, resulting in millions of pounds of financial savings.

 

 

Beamex is there every step of the way

Discussing the complexities of these kinds of customer cases, Frackowiak continues by emphasizing how Beamex’s approach sets them apart in the market. “These kinds of projects take time, but we are there to be the partner for calibration excellence, supporting the customer at every step of the transformation process. This is what makes Beamex far more than just another technology provider. We are a trusted partner, a trusted advisor – we are the calibration specialists who are looking 5, 10, even 15 years ahead together with the customer through the Beamex calibration ecosystem.”

 

Listen to the podcast: 

  

 

You can also find this podcast on the Process Industry Informer website

 

Discuss with our experts how a calibration solution could supercharge your business

 

 

Topics: Calibration management

Industrial Temperature Calibration Course [eLearning]

Posted by Heikki Laurila on Dec 12, 2023

Industrial Temperature Calibration Course, eLearning

In this blog post, we want to share a new way to learn more about industrial temperature calibration without attending in-person calibration classes. Our new industrial temperature calibration eLearning course will help you level up your calibration knowledge with six in-depth modules. And best of all, it’s completely free!

The calibration course offers a wide range of resources, including executive summaries, in-depth articles, how-to videos, quizzes, and a comprehensive final test. If you pass the final test, you’ll receive a certificate.

Read more and start the temperature calibration course now!


What you’ll learn in the course

  • The calibration basics: what, why, and how often?
  • Temperature sensors (RTDs and thermocouples) and temperature units
  • How to calibrate Pt100, duplex, and sanitary sensors
  • How to calibrate temperature switches and transmitters
  • Calibration uncertainty: what it is and why it matters
  • How to calibrate temperature instruments


Enroll and start now!

 

Course overview

Want to know more about what you can expect from your calibration classes? Here’s a short overview of the key areas in the calibration course.

 

 

1. Calibration basics

However much you already know – or think you know – about calibration, it’s always good to go over the basics. In this section you will learn all about the fundamental concepts of calibration, the reasons to calibrate, why calibration is important, and the critical issue of traceability. We’ll also explore how frequently instruments should be calibrated to maintain accuracy. Once you finish this section of the course you’ll have a solid foundation of the essentials of calibration.

 

What is calibration? - Beamex blog

 

Extract from the eLearning course and calibration basics section.

 

2. Temperature units and sensors

In the second section of the course we’ll dive into the world of temperature measurement, gaining insight into different temperature units and temperature unit conversions. You’ll also learn more about Pt100 sensors and thermocouples and how and where they’re used. Once you’ve covered this section of the calibration training you’ll be ready to combine your knowledge of calibration and temperature to find out about calibrating temperature sensors and sanitary sensors.
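As a small taster of the unit-conversion topic, here is a minimal Python sketch of two common conversions, Celsius to Fahrenheit and Celsius to kelvin. It is only an illustration, not material taken from the course itself:

# Two common temperature unit conversions.
def c_to_f(celsius: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9.0 / 5.0 + 32.0

def c_to_k(celsius: float) -> float:
    """Convert degrees Celsius to kelvin."""
    return celsius + 273.15

print(c_to_f(100.0))  # -> 212.0
print(c_to_k(25.0))   # -> 298.15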

Simplified illustration of thermocouple cold junction.

 

3. Calibration of temperature sensors and sanitary sensors

Are you looking for calibration training for temperature sensors and sanitary sensors? This section of the course will help you to master the techniques and procedures for calibrating these types of sensors. There will be a particular focus on the unique challenges presented by sanitary sensors – and how to overcome them.

 

Sanitary temperature sensor calibration

 

 

4. Calibration of temperature switches and transmitters

In the fourth section of the calibration course we’ll examine methods for calibrating temperature switches and procedures for calibrating temperature transmitters. You’ll also find out how to ensure precise temperature control.

Temperature slope in temperature switch calibration.

 

 

5. Calibration uncertainty

Calibration uncertainty is an essential factor to understand when performing industrial temperature calibration. In this section of the course you’ll learn to navigate calibration uncertainty and evaluate it effectively. You’ll also discover how to manage uncertainty in temperature calibration. Finally, we’ll explore the concept of temperature dry block uncertainty, a crucial aspect of temperature calibration.

 

Calibration uncertainty

 

6. How to calibrate temperature instruments

In the last section of the course we’ve gathered some goodies for you, with exclusive access to webinars that provide hands-on calibration training for temperature instruments. These webinars will enhance your practical skills and give you added confidence in your calibration knowledge.

Webinar: How to calibrate temperature instruments.

 

Start learning today!

The full calibration course will take you around eight hours – but if you have existing knowledge you might complete it more quickly. Just remember to sign in to our eLearning service so you can save your progress and split your calibration training over multiple days. Once you have successfully completed the final test, you’ll receive a certificate via email.

 

Temperature Calibration eLearning

 

Master temperature calibration with this free comprehensive course. Deepen your knowledge, pass the quiz, and earn your certificate!

Enroll and start now!


 

Beamex's offerings for temperature calibration

At Beamex, we have a lot to offer for temperature calibration.

For example, our most versatile temperature calibrator, the Beamex MC6-T Multifunction Temperature Calibrator and Communicator; the easy-to-use Beamex MC6 Advanced Field Calibrator and Communicator; Beamex RPRT reference sensors; and several Beamex temperature sensors.

Don't forget our calibration software offerings and the entire calibration ecosystem.

We also offer expert services and training services for temperature calibration.    

You can also download a free temperature calibration eBook, and visit the handy temperature unit converter on our website. 

Please feel free to contact us to discuss your temperature calibration challenges and how we can be your partner for calibration excellence.

Please scroll through the carousel below for more interesting articles related to temperature calibration!

 

 

Topics: Temperature calibration

Is it a leak? - Understanding the adiabatic process in pressure calibration

Posted by Heikki Laurila on Nov 29, 2023

Is it a leak? - Understanding the adiabatic process in pressure calibration

The adiabatic process is something we have all encountered if we have been working with pressure calibration. Often, we just don’t realize it, and we think there is a leak in the system.

In short, the adiabatic process is a physical phenomenon that causes the pressure media’s temperature to increase when we increase the pressure in a closed system. When we stop pumping, the media cools down, which causes the pressure to drop – so it does indeed look like a leak in the system.

You can find many in-depth, complicated physical or mathematical explanations of the adiabatic process on the internet. But hey, we are not physicists or mathematicians, we are calibration professionals! Lucky you, you’ve got me to simplify this for you :-)

In this article I take a closer look at the adiabatic process and how to recognize and avoid it. We’ll start with a little compulsory theory and then dive into the practical things.

If you are working with pressure calibration, you cannot miss this one!

 

Table of contents

 

What is the adiabatic process?

An adiabatic process is a thermodynamic change whereby no heat is exchanged between a system and its surroundings.

For an ideal gas undergoing an adiabatic process, the first law of thermodynamics applies. This is the law of the conservation of energy, which states that, although energy can change form, it can't be created or destroyed.

We remember from our school physics (well, some of us may remember!) the formula with pressure, volume and temperature, and how they depend on each other. Remember? 

The combined gas law says that the relationship between pressure (P), volume (V) and absolute temperature (T) is constant. As a formula it looks like this:

P × V / T = k

Where:

  • P = pressure
  • V = volume
  • T = absolute temperature
  • k = constant

OK, that did not yet take us too far, but please bear with me…

When using the above formula and comparing the same pressure system under two different conditions (different pressures), the law can be written as the following formula:

(P1 × V1) / T1 = (P2 × V2) / T2

We can think of this formula as representing our normal pressure calibration system, which has a closed, fixed volume. The two sides of the formula represent two different stages in our system – one with a lower pressure and the second one with a higher pressure. For example, the left side (1) can be our system with no pressure, and the right side (2) the same system with high pressure applied.

Looking at the formula, we can conclude that, as the volume of a pressure calibration system remains the same, if the pressure changes then the temperature must also change. Or the other way around: if the temperature changes, then the pressure will also change.
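To make this concrete, here is a minimal Python sketch with made-up numbers: if rapid pumping has warmed the air in a closed, fixed-volume system and the air then cools back to ambient temperature, the absolute pressure drops in proportion to the absolute temperature – with no leak anywhere.

```python
# Fixed-volume cooling: P1/T1 = P2/T2 (absolute pressure, absolute temperature).
# Illustrative numbers only.
P1 = 21.0    # bar absolute, right after pumping
T1 = 313.15  # K (40 degC) - air warmed by rapid pumping
T2 = 293.15  # K (20 degC) - ambient temperature

P2 = P1 * T2 / T1
print(f"Pressure after cooling: {P2:.2f} bar absolute "
      f"(a drop of {P1 - P2:.2f} bar with no leak at all)")
```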

The image below shows a typical pressure calibration system, where we have a pressure pump, pressure T-hose, pressure instrument to be calibrated (1) and pressure calibrator (2).

 

Pressure calibration connection diagram

 

Typically, the volume of our pressure calibration system remains the same, and we change the pressure going through the calibration points. When we change the pressure (and the volume remains the same) the temperature of the medium will change. That’s physics, deal with it :-)

We can most commonly see the adiabatic process when we raise the pressure quickly with our calibration hand pump, causing the media (air) to get warmer. Once we stop pumping, the medium starts to cool down, causing the pressure to drop – at first quickly, then more slowly, until it finally stabilizes. This pressure drop looks like a leak in the system.

The same also happens with decreasing pressure – if we decrease the pressure quickly, the media gets colder. When we stop decreasing, the media will start to warm up, causing the pressure to rise. This may seem odd at first – how can the pressure rise by itself? Of course, the pressure does not increase a lot, but enough for you to see it and wonder what’s going on.

So, the adiabatic process works in both ways, with increasing and decreasing pressure.

The faster you change the pressure, the more the medium temperature will change, and the bigger effect you can see.

If you wait a while, the pressure media temperature will stabilize to the surrounding temperature and the effects of the adiabatic process will no longer be visible.

This is the essential learning from the adiabatic effect.

 

How do you know when it’s the adiabatic process and when it’s a leak?

The main difference between the adiabatic process and a leak is that the pressure drop caused by the adiabatic process is bigger in the beginning, then slows down and disappears (stabilizes).

The pressure drop caused by a leak is linear and continues at the same rate.

The below image demonstrates the difference:

Adiabatic process vs. leak – graphic

In the image above you can see how the pressure drop caused by the adiabatic process is fast at first but then slows down and eventually stabilizes (red line), while the pressure drop caused by a leak is linear (blue line).
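If you log readings at fixed intervals after you stop pumping, you can even tell the two apart numerically. The sketch below is a simplified Python illustration with made-up readings (not a feature of any calibrator): it checks whether the drop rate between readings dies out (adiabatic-like) or stays roughly constant (leak-like).

```python
# Readings taken at equal time intervals after pumping stops (bar). Made-up data.
adiabatic_like = [20.00, 19.70, 19.55, 19.48, 19.45, 19.44, 19.44]
leak_like      = [20.00, 19.90, 19.80, 19.70, 19.60, 19.50, 19.40]

def looks_like_leak(readings, ratio_threshold=0.8):
    """Compare the last drop rate to the first: a leak keeps dropping
    at roughly the same rate, while the adiabatic effect dies out."""
    drops = [a - b for a, b in zip(readings, readings[1:])]
    first, last = drops[0], drops[-1]
    return first > 0 and last / first > ratio_threshold

print("adiabatic-like data -> leak?", looks_like_leak(adiabatic_like))  # False
print("leak-like data      -> leak?", looks_like_leak(leak_like))       # True
```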

 

How to avoid the adiabatic process?

Pressurize slowly:

One of the easiest ways to minimize the adiabatic effects is to change the pressure slowly. By doing so, you allow the media more time to reach the same temperature as its surroundings, minimizing any temporary temperature changes. In practice, if you increase the pressure with a hand pump, and you step through several increasing calibration points, this may already be slow enough to avoid seeing the adiabatic process.

If you pump as quickly as you can up to 300 psi (20 bar), then you will most certainly see the effect of the adiabatic process. 

Wait:

After adjusting the pressure, give it some time to stabilize. A minute or two should do the trick. This allows any temperature changes in the medium to reach equilibrium with the ambient conditions, and the pressure will stabilize accordingly.

 

Pressure media

You can also affect the adiabatic process with your choice of pressure media. In practice it is of course not always possible to change the media. Your normal hand pump uses air as the medium. For higher pressures, you may use a hydraulic pump with water or oil as the medium.

The effects of the adiabatic process are generally more prominent in air or gas-operated calibration pumps than in hydraulic (water or oil) ones.

This is mainly because gas is much more compressible: increasing the pressure pushes the gas molecules closer together, and the work done on the gas is converted into heat. In addition, gas/air has lower thermal conductivity than liquids, so less heat is conducted away from the gas.

 

Conclusion

In our service department, we regularly get questions about pressure pumps having leaks, but in most cases it turns out to be the adiabatic process that has made the customer think there is a leak.

Understanding the adiabatic process and its impact on calibration pressure pumps is crucial for users to avoid misdiagnosing issues. By changing pressure at a moderate pace and allowing adequate time for stabilization, you can achieve more accurate and consistent results.

 

Beamex's offering for pressure generation and calibration

At Beamex, we have a lot to offer for pressure calibration.

For example, our PG range of calibration pumps for pressure generation, the ePG electric pressure pump, the POC8 automatic pressure controller, and a number of pressure calibrators.

Don't forget our calibration software offerings, and the entire calibration ecosystem.

We also offer expert services and training services for pressure calibration.    

You can also download a free pressure calibration eBook, and visit the handy pressure unit converter on our website. 

Please feel free to contact us to discuss your pressure calibration challenges and how we can be your partner for calibration excellence.

Please scroll through the carousel below for more interesting articles related to pressure calibration!

 

Topics: Pressure calibration

How Douglas Calibration Services reduces technicians’ workload and stress [Case Story]

Posted by Rita Patel on Aug 08, 2023

Douglas calibration LOGiCAL

Beamex LOGiCAL Calibration Management Software has helped Douglas Calibration Services improve quality of life for their technicians. Freed from mountains of paperwork, technicians can now provide faster, more efficient service for the company’s diverse customer base. Let’s explore how.       

Douglas Calibration Services employs 70 technicians and serves more than 130 clients across the pharma, energy, and food and beverage industries. The company uses the Beamex MC2, MC5, and MC6 documenting calibrators on a daily basis and had been looking for a cloud-based solution that would allow them to go 100% paperless.

Read the full case story >

 

Rescuing technicians from a mountain of paperwork

"We implemented LOGiCAL for all our clients in 2022. As it’s a cloud-based solution, there’s also no wastage or big initial outlay on infrastructure or implementation, and no hardware upgrades,” says Richard O’Meara, Contracts Manager at Douglas Calibration Services.

For Douglas Calibration Services, taking care of their technicians’ well-being was a strong driver behind their decision to adopt LOGiCAL. “We wanted to take the pressure off our technicians, who were drowning in paperwork,” Richard says. He estimates that the technicians’ workload has been reduced by 30–40% thanks to LOGiCAL.

Benefits of LOGiCAL calibration software in numbers

 

Not only has this improved life for the company’s existing workforce, but it also acts as a way to attract new employees in what is an extremely competitive market.

Instead of laptops and paper printouts, technicians use a tablet with the Beamex bMobile Calibration Application to perform all their calibration work. Results are captured by the technician in the field and synchronized with LOGiCAL, and admin staff can download and review them before emailing the calibration certificates directly to the client. In 90% of cases, clients now receive their certificates on the same day or the day after the calibration work was done.

Benefits of using LOGiCAL calibration software

 

No more missed calibrations

Prior to LOGiCAL, employees had to manually track recalibration due dates on a spreadsheet; LOGiCAL now does all this work with its ‘instruments due’ feature and also tracks reference standards so they never miss recertification. 
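The logic behind an ‘instruments due’ view is simple in principle. Here is a minimal, hypothetical Python sketch – not LOGiCAL’s actual implementation – that flags instruments whose last calibration date plus calibration interval has already passed.

```python
from datetime import date, timedelta

# Hypothetical instrument records: (tag, last calibration date, interval in days)
instruments = [
    ("TT-101", date(2023, 1, 15), 365),
    ("PT-202", date(2022, 6, 1), 180),
    ("FT-303", date(2023, 7, 20), 90),
]

def instruments_due(records, today=None):
    """Return tags whose next calibration date is on or before today."""
    today = today or date.today()
    return [tag for tag, last_cal, interval in records
            if last_cal + timedelta(days=interval) <= today]

print(instruments_due(instruments, today=date(2023, 8, 8)))  # ['PT-202']
```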

Quote by Richard O'Meara

 

During the transition to LOGiCAL, Beamex convened monthly meetings to discuss any issues or queries, and regular feedback from the team at Douglas Calibration Services has led to the implementation of a host of new features and updates. 

Richard sees plenty more potential in the solution: “We’re excited to work with Beamex to develop new features, like the ability for clients to log in to their database and view instruments and calibration results, for example. This is just the beginning of our shared journey!”  

 

Download the full customer success story

Schedule a free LOGiCAL demo


 

Here's a very short video summary of the story:

 

Related content:

 

 

Download your copy of the Calibration Essentials Software eBook to learn more about calibration management and software

Calibration Essentials- Software eBook

 

 

Topics: Calibration process, Calibration software, Calibration management, Case Story

Partnering in calibration excellence at the Beamex Pharma User Forum

Posted by Monica Kruger on Jul 04, 2023

Beamex Pharma User Forum 2023

The biggest issues facing industry leaders from some of the largest pharma companies worldwide, including AstraZeneca, BioNTech, Novartis and GSK, include preventative maintenance, ensuring data integrity, complying with regulations and embracing digitalisation. This was on display at the 2023 Beamex Pharmaceutical User Forum.

Facilitating collaboration, knowledge sharing, and a customer-centric approach was at the heart of the conference, with one participant expressing the reassurance that comes from knowing Beamex hears its customers and takes feedback seriously. Alex Maxfield, the firm’s VP for Sales & Marketing, states, “We wanted customers talking among themselves, giving advice and exploring our applications.”

Marc Mane Ramirez from Novartis agrees, emphasising the invaluable role of the conference in gathering genuine feedback from seasoned users. This direct engagement enables Beamex to embrace and integrate valuable improvements and proposals into their forthcoming product releases.

 

Safeguarding data integrity and regulatory compliance

A critical insight from the conference was the significance of predictive maintenance for the future. Effectively maintaining the quality, safety and efficacy of pharmaceutical products, including equipment and instruments, is crucial to ensure compliance with stringent guidelines and regulations, such as those from the FDA and MHRA, upholding the highest quality and patient safety standards.

Calibration is crucial in maintaining consistent and reliable manufacturing processes that comply with these industry standards. Developed through decades of collaboration with leading pharmaceutical companies, Beamex’s ecosystem assists customers in achieving calibration-related goals while adhering to regulatory requirements, including the ALCOA+ principles that guarantee data integrity throughout the lifecycle of pharma products.

Data integrity is a top priority in the pharmaceutical industry, as breaches of regulatory requirements can have severe consequences. As such, the forum also showcased the impact of Beamex’s Calibration Management Software (CMX), which all participants have and use.

Shari Thread from pharma giant AstraZeneca said, “It’s been good to hear from other pharma companies using CMX and their experiences using CMX.” Carlos Da Silva from Boehringer-Ingelheim echoed the sentiment, stressing that through collaborative effort they can drive continuous improvement in CMX and work towards achieving better outcomes in the future.

By transitioning to a paperless calibration management system, companies can streamline processes, reduce manual errors and enhance data integrity. CMX also ensures compliance with relevant GxP regulations while providing a robust calibration database with comprehensive history functions. 


Shaping the Beamex roadmap

The 2023 Beamex Pharmaceutical User Forum created an environment for attendees to learn from the experiences and proposals of their peers in the industry.

Through sharing successes and challenges, participants gained invaluable knowledge about effective practices and areas that require improvement within the pharmaceutical calibration and maintenance landscape. Mateusz Dunko from GSK expressed that it was gratifying to witness how pharma firms can influence the Beamex roadmap by comparing requirements across different companies.

This collaborative learning approach allows companies to explore diverse perspectives and discover innovative strategies to embrace digitalisation. In conclusion, Jan-Henrik Svensson, CEO of Beamex, underscored the transformative changes his company perceives in digitalisation and its profound impact. He noted that while all the companies were contemplating this shift, they were each doing so in distinct and remarkable ways, showcasing the industry’s collective drive for progress and adaptation.

 

Would you like to know more about the Pharmaceutical User Group, or are you interested in joining the next forum? Contact us.

 

View the below video for more insights and interviews from the event:

 
   
 

 

Many of the world’s leading pharmaceutical and life sciences companies depend upon Beamex calibration solutions. Book a free consultation with our pharma calibration experts to find the best calibration solution for you.

Book a free consultation

 

Related blogs


Customer success stories

Beamex customer success

 

Digital Calibration Certificate (DCC) – What is it and why should you care?

Posted by Heikki Laurila on Jun 15, 2023

Digital Calibration Certificate DCC - Beamex Blog

 

The digitalization of metrology has been slower than in many other fields, and calibration processes in many industries are still mostly paper based.

But that's about to change!

Enter the Digital Calibration Certificate (DCC), “the MP3 of metrology”. Just as the MP3 revolutionized the music industry, the DCC has the potential to revolutionize the process industry by enabling electronic storage and sharing of calibration results in a standardized, consistent, authenticated, and encrypted manner.

No more struggling with manual interpretation of paper certificates! With the DCC, calibration data is machine-readable and easily imported into your system. A DCC is created using digital signatures and encryption methods to ensure its authenticity and integrity, and it's compatible with international standards, making it easy to share with calibration laboratories, manufacturers, and users of measuring instruments.

But that's not all! The DCC has a ton of benefits, like increased transparency, efficiency, and traceability in the calibration process, as well as reduced costs and time. And the best part? A team of key players, including Beamex, is working on creating a global DCC standard so you won't have to worry about compatibility issues.

If you thought that a PDF is a digital calibration certificate, think again!

So, if you're in the process industry, keep calm and get ready to adopt the DCC! It could be the game-changer you've been waiting for. 

 

Download the full article in pdf format >>

 

Table of contents

 

Background

Metrology is a crucial aspect of modern industrial activity as it involves measuring and ensuring the accuracy of physical quantities.

However, the digitalization of metrology has been slower than that of other industries, with calibration processes still being mostly paper based. This means that processes relying on metrological data are often manually executed by humans, which can be slower and more prone to errors compared to machine-to-machine communication.

The growing gap between the digitalization of the process industry and the calibration industry is creating a significant discrepancy in terms of efficiency, productivity, and quality. While the process industry is using advanced technologies such as automation, artificial intelligence, and data analytics to optimize its operations and achieve higher levels of productivity and quality, the calibration industry is lagging behind in terms of digitalization.

To address this issue, a digital calibration certificate (DCC) is being developed to enable electronic storage and sharing of calibration results in an authenticated and encrypted manner. 

The DCC even enables machine-to-machine communication so that calibration results can be transferred directly from the calibration equipment to the relevant systems without the need for manual intervention. 

This may sound futuristic, but even the current Beamex paperless calibration ecosystem works so that a documenting calibrator (such as a Beamex MC6) automatically saves the calibration results digitally in its memory after calibration. From the calibrator’s memory that digital file is then transferred to calibration management software (Beamex LOGiCAL or CMX) for storing and analysis. That calibration results file is still in Beamex's proprietary format.   

The DCC also facilitates sharing calibration data among different stakeholders - for example, external calibration service providers (calibration labs, producers of calibration data) and industrial end-customers (consumers of calibration data). This digitalization and automation reduces the likelihood of errors, improves efficiency and enables almost real-time data integration for improved decision-making.

This would result in more consistent interpretation of the results and improved traceability, as well as enable proper data analytics in process industries and the creation of digital twins for testing and improving processes. This could ultimately lead to increased efficiency, improved safety, cost savings, and new business models.

Beamex has actively participated from the beginning - working alongside Physikalisch-Technische Bundesanstalt (PTB), the national metrology institute of Germany - in creating a globally recognized Digital Calibration Certificate (DCC) format. Our expertise has been instrumental in shaping the DCC standard to meet the specific needs of the process industry. We are preparing to incorporate the DCC into our products to ensure they are future-proofed.

Being entrusted with this significant responsibility by key stakeholders, including PTB, is a true honor. With our extensive experience in delivering digital calibration solutions, we have established ourselves as a crucial player in this field. We take great pride in leading the development of the DCC and remain dedicated to making it applicable and beneficial for the process industry. The recognition and trust from other stakeholders involved in the DCC initiative further reinforces our commitment to this important endeavor.

 

Processes with paper certificates

When a company sends their calibrator or reference standard to an accredited calibration laboratory, they typically receive the equipment back with a paper calibration certificate. This certificate is then stored somewhere or scanned and saved as a file.

If the company wants to run analytics on the certificate, or make a history analysis across several certificates, they need to manually enter the data for each calibration point into software. That is because the paper certificate is neither standardized nor machine-readable.

In the near future, Digital Calibration Certificates will be delivered as standardized, machine-readable, authenticated, and encrypted files that can be imported into the company’s system.

 

Digital Calibration Certificate (DCC)

Basically, the DCC is intended to become a globally standardized format for calibration data defined in the form of an XML (Extensible Markup Language) schema.

When a calibration laboratory performs a calibration, it creates the DCC file and adds all calibration-relevant data to the file. This file is then delivered to the customer. When receiving the file, the customer can have it automatically imported into their own system, thanks to the standardized format of the DCC file.

The DCC contains all relevant calibration data, including the date of the calibration, the calibration method used, the measurement uncertainty, and the results of the calibration.

The DCC is created using digital signatures and encryption methods to ensure its authenticity and integrity. It can be accessed and shared online, making it easily accessible for calibration laboratories, manufacturers, and users of measuring instruments.

The main benefits of the DCC include increased transparency, efficiency, and traceability in the calibration process, as well as reduced costs and time.

The DCC is also compatible with international standards and can be used for both national and international calibration requirements.

 

The XML structure of DCC:

The XML structure of a Digital Calibration Certificate DCC

Image copyright by Physikalisch-Technische Bundesanstalt (PTB). All rights reserved.
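To make the idea of machine-readability concrete, here is a minimal Python sketch that pulls calibration results out of a DCC-style XML file. The element names and structure below are simplified assumptions for illustration only – the real DCC schema defined by PTB is far richer.

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical DCC-style XML - not the real PTB schema.
dcc_xml = """
<calibrationCertificate>
  <administrativeData>
    <instrument>Pressure transmitter PT-101</instrument>
    <calibrationDate>2023-05-12</calibrationDate>
  </administrativeData>
  <measurementResults>
    <point reference="0.0" reading="0.002" uncertainty="0.005"/>
    <point reference="5.0" reading="5.004" uncertainty="0.005"/>
  </measurementResults>
</calibrationCertificate>
"""

root = ET.fromstring(dcc_xml)
print("Calibrated:", root.findtext("administrativeData/calibrationDate"))
for p in root.findall("measurementResults/point"):
    print(f"ref {p.get('reference')} -> reading {p.get('reading')} "
          f"(+/- {p.get('uncertainty')})")
```

Because the format is standardized, this kind of import can run automatically in the receiving system instead of someone retyping the certificate.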

 

 

The main benefits of the Digital Calibration Certificate (DCC)

 

Here is a short summary of the benefits of the DCC. For the full list, please download the White Paper.

  1. DCC makes it easier to analyze calibration data and create digital twins that help improve efficiency and safety in the process industry.
  2. DCC supports digital transformation by allowing contractors and labs to easily connect to a digitized calibration system with centralized management.
  3. DCC uses a standardized approach to data entry, making it easier to compare and harmonize data from different sources.
  4. DCC makes it easy to manage and search for calibration data and instrument devices, even for large-scale operations.
  5. DCC enables preventive maintenance by alerting when instruments need checking instead of relying on fixed intervals, leading to better risk-based approaches to maintenance and calibration.
  6. DCC increases traceability by replacing inefficient paper-based processes with easy digital search capabilities.
  7. DCC is flexible, allowing customers to use their preferred calibration processes and still generate easily shareable and searchable digital certificates.
  8. DCC is secure, with cryptographic protection to ensure authenticity and data integrity.

Another source discussing the DCC benefits is the paper “Benefits of network effects and interoperability for the digital calibration certificate management” from the 2021 IEEE International Workshop on Metrology for Industry 4.0 & IoT.

 

An emerging global standard

Efforts are happening right now to create a global DCC standard. A team of key players, including Beamex, is working together to define requirements, create guidelines and software, and promote awareness and training.

The DCC meets the requirements of DIN EN ISO/IEC 17025:2018-03. The Gemimeg II project is currently leading the way in DCC development, thanks to investments from the German government and the companies involved.

Another project, SmartCom, was focused on improving the traceability and handling of metrological data through the creation of DCC with set standards. 

In addition, other projects and initiatives have also been taking place to enable the uptake of DCC to improve the traceability and handling of metrological data.

Such projects include, for example, EMPIR 17IND02 SmartCom, EURAMET TC-IM 1448, and Digital NIST.

Together, these initiatives are building a DCC standard that is already being tested in industrial applications.

Due to their key role in the metrology infrastructure, the National Metrology Institutes (NMIs) will also play an important role in ensuring widespread adoption of the DCC standard.

 

Keep calm and adopt the DCC!

The DCC has the potential to transform the process industry. Instead of relying on error-prone and labor-intensive paper-based processes, digital calibration data could be easily searched, shared, and analyzed. This would not only make audits more efficient, but it could also allow data to be used to create digital twins of processes to find efficiency and safety improvements.

At Beamex, we have been digitalizing calibration processes for over 40 years and we see the DCC as a natural extension of these efforts. We believe that cooperation between standards-setting institutions, labs, vendors, and major players in the industry will be needed to make the DCC happen.

That's why we are keen to encourage other industry players to join us in these initiatives and contribute to supporting the implementation of the DCC across industries.

At Beamex we have run several successful proof of concept projects with the DCC and have seen that the DCC is really working in practice.

When you choose Beamex, you are choosing a future-proof solution that is ready to support digitalization efforts and make processes safe, secure, and efficient. Our products are designed to be compatible with whatever DCC standard evolves.

 

Interested in learning more?

If you want to learn more about the DCC or discuss with our experts, please feel free to book a discussion with them:

Discuss with our experts >>

 

On LinkedIn, feel free to connect and discuss with me or with my colleagues who have expertise in the DCC:

 

Download the full article here:

Digital Calibration Certificate DCC - Beamex white paper

 

 

Relevant material & links

 

Documents describing the basic concept and overall structure of the DCC:

  • S. Hackel, F. Härtig, J. Hornig, and T. Wiedenhöfer. The Digital Calibration Certificate. PTB-Mitteilungen, 127(4):75–81, 2017. DOI: 10.7795/310.20170403.
  • S. Hackel, F. Härtig, T. Schrader, A. Scheibner, J. Loewe, L. Doering, B. Gloger, J. Jagieniak, D. Hutzschenreuter, and G. Söylev-Öktem. The fundamental architecture of the DCC. Measurement: Sensors, 18:100354, December 2021. DOI: 10.1016/j.measen.2021.100354.

Additional information on the technical aspects of the DCC can also be found on the PTB’s Digital Calibration Certificate Wiki (ptb.de).

Further reading on the potential and benefits of the DCC in a calibration ecosystem:

  • J. Nummiluikki, T. Mustapää, K. Hietala, and R. Viitala. Benefits of network effects and interoperability for the digital calibration certificate management. 2021 IEEE International Workshop on Metrology for Industry 4.0 & IoT. DOI: 10.1109/MetroInd4.0IoT51437.2021.9488562.
  • J. Nummiluikki, S. Saxholm, A. Kärkkäinen, and S. Koskinen. Digital Calibration Certificate in an Industrial Application. Acta IMEKO, 12(1), 2023. DOI: 10.21014/actaimeko.v12i1.1402.

 

 

Related blogs

If you liked this article, you might like these ones too:

 

Topics: Calibration process, Digitalization

CMMS calibration module or dedicated calibration software?

Posted by Heikki Laurila on Apr 26, 2023
CMMS-and-calibration-software

When your computerized maintenance management system (CMMS) already has a calibration module, why would you buy dedicated calibration software?

It’s a fair question and one that we frequently get asked! The reasons can vary, depending on the application. Are you maybe comparing apples to oranges?

There are different kinds of dedicated calibration software products out there, each with somewhat different functionalities. Although they have the same name, they are all different in one way or another.

Does integrating dedicated calibration software with your CMMS bring you the best of both worlds, or just a big mess?

In this article we look at the various setups and compare these different scenarios.

If this sounds interesting, please keep on reading.

 

Table of contents

 

CMMS and calibration

CMMS, asset management systems, and enterprise resource planning (ERP) systems include a variety of different functionalities and are implemented for a certain purpose. They are not designed specifically for calibration management. Although they have some calibration functionality, this can be quite limited.

Sure, there can be an add-on calibration module with basic functionality for calibration management, but these kinds of systems do not have the same level of sophistication as dedicated calibration software designed specifically for the purpose of calibration.

Sometimes these add-ons still require manual data entry methods such as a pen and paper to document calibrations! C’mon, this is the 21st century!

 

The problem with pen and paper

With digitalization becoming the norm in industry, you could be forgiven for thinking that calibration is already taken care of by the calibration module of your CMMS. But, as mentioned above, calibration results may still need to be documented manually using pen and paper. The papers are then archived, or the calibration data is subjected to another error-prone manual step – entering it into the calibration module using a computer keyboard.

In the worst-case scenario the calibration data is not stored digitally in the CMMS at all and may simply be scanned. This brings further limitations as you can’t analyze any data from a scanned document.

This is also the case if the data is stored in a paper archive. For example, you can’t easily check the detailed results of the previous calibrations performed. Also, it’s very difficult to find data for regulatory audits. This process also brings with it all the data quality and integrity issues related to manual data entry. The errors within manually completed files don’t disappear if you scan them or manually transcribe the results from the paper to the CMMS, which, as mentioned above, can introduce further errors.

Learn more about the consequences of manual data entry in this blog: Manual Data Entry Errors

 

Also, we need to consider reverse traceability. This means that if a reference standard (calibrator) is found to be out of specifications during a calibration, you need to investigate where that reference standard has been used. It may have been used for tens or even hundreds of calibrations, and as a result these may all be considered suspect. If all your calibration certificates are in paper format or scanned, it is extremely laborious and time-consuming to go through them to perform reverse traceability. Advanced calibration software would allow you to generate a reverse traceability report at the touch of a button.
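To see why digital records make reverse traceability so fast, consider this minimal Python sketch. The data structure is hypothetical and not any particular product’s database, but it shows how a single query lists every calibration performed with a suspect reference standard.

```python
# Hypothetical calibration history: (instrument tag, calibration date, reference used)
history = [
    ("TT-101", "2023-01-10", "MC6-SN1001"),
    ("PT-202", "2023-02-03", "MC6-SN1001"),
    ("FT-303", "2023-02-15", "MC6-T-SN2002"),
    ("TT-104", "2023-03-01", "MC6-SN1001"),
]

def reverse_traceability(records, reference_id):
    """All calibrations performed with the given reference standard."""
    return [(tag, cal_date) for tag, cal_date, ref in records if ref == reference_id]

# If reference MC6-SN1001 is found out of tolerance, these calibrations are suspect:
for tag, cal_date in reverse_traceability(history, "MC6-SN1001"):
    print(tag, cal_date)
```

With paper or scanned certificates, the same question means leafing through every certificate by hand.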

Beyond data analysis – or the lack of it if you’re using paper files or scanned documents – there are other, often overlooked, ways in which dedicated calibration management software can help your business shine.

  1. Sustainability – You might have invested significant time and money in initiatives to create more sustainable working practices, but have you thought about how calibration can make your business more sustainable? A robust calibration process using dedicated software improves efficiency, eliminates paper, and can even extend the lifespan of your equipment.
  2. Employee wellbeing – Making calibration tasks simpler and less stressful for your technicians can make a huge difference to their wellbeing and can even mark you out as an employer of choice in what is an extremely competitive labor market.
  3. Product quality – The downstream impact of data integrity or other issues within the calibration processes can compromise the quality of your products. Dedicated calibration software helps to avoid this problem by maintaining data integrity.
  4. Safety – If you’re making a product that is consumed, for example food or medicine, dedicated calibration management software can give you greater confidence that your product is safe because you can rely on the fact that your in-process measurements are accurate. This is particularly important in cases where a product cannot be destructively tested to confirm it is safe.

 

CMMS vs. dedicated calibration management software

Let’s take a more detailed look at how the calibration module in a CMMS stacks up against dedicated calibration management software such as  Beamex CMX.

  1. Functionality: Compared to a CMMS module, dedicated calibration management software typically offers more advanced functionality for managing calibration procedures, such as automated calibration scheduling, calibration task management, guided instructions, reference management, calibration uncertainty calculations, reporting, and more.
  2. Customization: Dedicated calibration management software is typically highly customizable, meaning you can configure it to your specific calibration needs. This can include creating custom calibration procedures, configuring workflows, and integrating the software with other systems. A calibration module in a CMMS is typically more limited in terms of customization options. If you do want to customize your CMMS module with additional calibration functionality, it will be costly to implement and maintain. What’s more, you might not even know what kind of functionality needs to be added. Dedicated software from a reputable provider will take into account the current and future requirements of a large customer base and leverage emerging technologies, adding new features and functionalities as part of regular updates.
  3. Integration: While both types of software can integrate with other systems, dedicated calibration management software may offer more seamless integration with other laboratory or process control systems, such as electronic documentation management systems, laboratory information management systems (LIMS), or ERP systems. A CMMS calibration module may only offer limited integration options.
  4. User interface: Dedicated calibration management software typically offers a user-friendly interface specifically designed for managing calibration processes, which can help to streamline workflows and improve user productivity. A calibration module in a CMMS, on the other hand, may have a more general user interface that is designed to support a range of maintenance management tasks.
  5. Cost: Dedicated calibration management software may be more expensive than a calibration module in a CMMS as it offers more advanced functionality and customization options. However, you should find that the additional cost is justified by the improved functionality and productivity gains that dedicated software offers.

 

 

Calibration software - manual or automatic?

Not all products that are called calibration management software solutions are the same or offer the same functionalities.

The two main categories are calibration software where data is entered into the system manually and software that communicates with documenting calibration tools. Let’s look at these two categories in more detail.

 

1. Calibration software with manual data entry

With these types of systems, you input the data manually with a keyboard. If you don’t carry a laptop with you in the field, then you need to manually document data during the calibration and then input it into the system when you’re back in the office – meaning there are two manual steps in your calibration process!

While this kind of calibration software may offer a lot of functionality once you have the results stored digitally in the system database, including data analysis, the original source data may have gone through multiple manual entry steps before ending up in the system. So, the data may have accidental (or even intentional) errors, and the data quality and integrity could be questionable.

Analyzing non-reliable data is a waste of time and may even be misleading, leading to wrong decisions. “Crap in, crap out”, as they say.

So, in the end using this kind of calibration software is not much better than using a CMMS calibration module.

Learn more about the consequences of manual data entry in this blog: Manual Data Entry Errors

 

2. Calibration software that communicates with documenting calibrators

With this kind of software there is no need for any manual data entry during the calibration process. Your calibration tools automatically store the calibration data digitally during the calibration. This eliminates the risk of manual error and means that the data cannot be tampered with. The calibrator can even be configured to require an electronic signature from the person who performed the calibration. This is important in highly regulated industries such as the pharmaceutical industry, where data integrity is vital.

Learn more about: Data Integrity in calibration processes, or about Common Data Integrity pitfalls in calibration processes.

 

A documenting calibrator may even be able to perform the calibration fully automatically, saving time and ensuring a repeatable calibration process every time. Once the calibration is complete and the data is stored in the calibrator, the results can be transferred from the calibrator’s memory to the calibration software, again fully digitally.

Advanced documenting calibrators can also make an automatic pass or fail decision straight after the calibration. This may sound like a small thing, but what if you have a square-rooting pressure transmitter ranging from -0.1 to 0.6 bar and you get an output of 12.55 mA at 0.1 bar input pressure, while your error limit is 0.5 % of span – does that sound like a pass or a fail?

It’s not always easy to calculate in your head – or even with a calculator. Sure, if you have a 0 to 100 °C temperature transmitter and the error limit is ± 0.5 °C, it is very easy. A smart documenting calibrator, like a Beamex documenting calibrator, will automatically tell you whether each point is a pass or a fail.
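For the curious, here is the arithmetic behind that square-rooting example as a minimal Python sketch, assuming the standard 4–20 mA square-root transfer function. Spoiler: 12.55 mA at 0.1 bar is a comfortable pass against a 0.5 % of span limit.

```python
# Square-rooting pressure transmitter, -0.1 ... 0.6 bar -> 4 ... 20 mA.
LRV, URV = -0.1, 0.6        # bar
OUT_LO, OUT_HI = 4.0, 20.0  # mA
span_mA = OUT_HI - OUT_LO

def ideal_output(p_bar: float) -> float:
    """Ideal output of a square-rooting transmitter at input pressure p_bar."""
    x = (p_bar - LRV) / (URV - LRV)     # normalized input, 0..1
    return OUT_LO + span_mA * x ** 0.5  # square-root transfer function

measured = 12.55  # mA measured at 0.1 bar input
error_pct_of_span = (measured - ideal_output(0.1)) / span_mA * 100

print(f"Ideal output: {ideal_output(0.1):.3f} mA")
print(f"Error:        {error_pct_of_span:+.3f} % of span")
print("PASS" if abs(error_pct_of_span) <= 0.5 else "FAIL")
```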

Using this kind of calibration management software together with documenting calibrators offers significant advantages over basic calibration software or your CMMS’s calibration module. It will save you a lot of time in calibration, and ensures you have high-quality data available for analysis.

But even with the most advanced calibration software, it’s important to remember that there is still some manual work to do in the process, starting with generating the work order in your CMMS and finally closing it.

But don’t worry, there is a better way to do that too. You can also digitalize and automate this step in the process if you integrate your CMMS and your calibration software. More on that next.

 

Integration – the best of both worlds!

Creating an end-to-end digital flow of calibration data throughout your business is easily achievable by integrating your CMMS with advanced calibration management software, such as Beamex CMX, that can communicate with documenting calibrators.

Many of our customers have found a better way with this kind of integration.

In practice, in the simplest case this integration works like this: work orders are generated in the CMMS and automatically sent to your calibration management software, and when the calibration is complete in the software, it automatically notifies the CMMS to close the work order.
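Conceptually, that round trip can be as simple as the sketch below. The function names are hypothetical placeholders, not a real Beamex or CMMS API – an actual integration would use the systems’ own interfaces.

```python
# Hypothetical, simplified round trip between a CMMS and calibration software.
# All functions below are placeholders, not a real API.

def fetch_open_work_orders_from_cmms():
    return [{"id": "WO-1001", "tag": "PT-202", "task": "calibration"}]

def send_to_calibration_software(work_order):
    print(f"Work order {work_order['id']} sent for {work_order['tag']}")

def calibration_completed(work_order):
    # In practice this would be reported by the calibration software
    # once results arrive from the documenting calibrator.
    return True

def close_work_order_in_cmms(work_order):
    print(f"Work order {work_order['id']} closed in the CMMS")

for wo in fetch_open_work_orders_from_cmms():
    send_to_calibration_software(wo)
    if calibration_completed(wo):
        close_work_order_in_cmms(wo)
```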

Read more about why to integrate and how integration can automate your work order and calibration results handling on our website >>

 

What our customers say on integration

"With this software integration project, we were able to realize a significant return on investment during the first unit overhaul. It’s unusual, since ROI on software projects is usually nonexistent at first."

Jody Damron, Business Analyst, Salt River Project, US

Read the full Salt River Project case story >>

 

Beamex calibration ecosystem

beamex calibration ecosystem

The Beamex calibration ecosystem is a comprehensive solution for calibration management that includes various hardware and software tools designed to help industries achieve better quality and reliability in their production processes. It consists of three main components: calibration software, calibration equipment, and calibration services.

The calibration software provides a user-friendly interface for managing calibration procedures, storing calibration data, and generating reports. It allows for customizable workflows, automated documentation, and integration with other systems.

The calibration hardware includes portable calibrators, bench calibrators, and pressure controllers that are designed to perform accurate and reliable calibrations in the field or laboratory. These devices are easy to use and feature advanced functions such as automated calibration, data logging, and wireless communication.

Beamex also offers calibration services, including on-site calibration, instrument maintenance, and training. The services are provided by qualified technicians who are experts in their field and can provide tailored solutions to meet the specific needs of each customer.

Overall, the Beamex calibration ecosystem provides a complete solution for calibration management that can help industries improve their processes, reduce downtime, and comply with regulatory requirements.

Learn more about the Beamex calibration ecosystem on our website >>

Talk with Beamex experts to find the best solution for you >>

 

Related blog posts

If you liked this post, you might also like these:

 

Finally, I want to thank my colleague Aidan Farrelly for his great LinkedIn post that sparked the idea for this post. Also, thanks Aidan for your comments while I was writing this.

Thanks,

 

Topics: Calibration process, Calibration software, Calibration management, Digitalization

The most common calibration-related FDA warnings to pharma companies

Posted by Heikki Laurila on Feb 13, 2023

fda-warning-pharma

 

As consumers of the products from pharmaceutical companies, we all want to be sure that we can trust the products they produce. Fortunately, pharmaceutical companies are heavily regulated, and their operations are being constantly monitored. The US Food and Drug Administration (FDA) is the main authority that regularly audits pharma companies globally to ensure they are compliant. 

Every year, the FDA gives hundreds of warnings to pharma companies that fail to comply with regulations. So even they are not perfect! Most of these warnings are not related to calibration, but some are. I analyzed the warnings related to calibration and in this article, I list the most common calibration-related warnings issued. If that sounds interesting, please continue reading.

Table of contents

I recommend browsing through the whole article, but you may also jump straight to the underlined main topic:

 

The FDA in brief

The FDA (https://www.fda.gov/) is a US government agency protecting public health by ensuring the safety, effectiveness, and security of drugs, biological products, and medical devices.  

The FDA audits pharmaceutical companies and ensures that they operate according to the relevant regulations. Although the FDA is a US authority, any pharma company that wants to sell products in the US needs to comply with FDA regulations and will be subject to their audit processes. Therefore, in practice the FDA requirements are global.

If the FDA finds evidence of non-compliance during the audits, they will issue a Form 483, which may lead to a warning letter. The FDA even has the power to shut down a plant if they find serious non-compliance. 

So, it’s pretty obvious that the pharmaceutical companies are on their toes when an FDA auditor arrives.

In addition to the FDA, there are other authorities auditing pharma companies, including the European Medicines Agency (EMA) and the UK’s Medicines & Healthcare Products Regulatory Agency (MHRA), plus national agencies in other countries. So, pharmaceutical companies are  audited regularly by other authorities in addition to the FDA. 

Hopefully, the Multilateral Recognition Agreements (MRA) between the authorities keep developing so that pharma companies are not subject to multiple audits by different authorities.

 

Warning letter

As mentioned, when an FDA auditor investigates a pharmaceutical company, the observations of non-compliance will be written into a Form 483. Depending on the criticality and/or number of the observations, this can lead to a warning letter. The pharmaceutical company then has to respond to the letter and perform sufficient corrective actions to address the observations.

These warnings are publicly available and they can be found on the FDA website from this link: https://www.fda.gov/inspections-compliance-enforcement-and-criminal-investigations/compliance-actions-and-activities/warning-letters 

There are currently almost 3,000 warning letters listed on the FDA site, dating from 2019 until now. 

For the last three years, there have been around 600 warning letters issued annually, distributed as shown in the below image:

Warning letters per year

 

The most common general warnings 

As there are so many letters, it gets complicated to analyze them all, but generally available information lists the most common reasons for warnings as:

  • Absence of written procedure, or not following written procedures
  • Data records and data integrity issues 
  • Manufacturing process validation – Lack of manufacturing controls
  • False or misleading advertising
  • Issues with environmental monitoring

Many are generally referred to as “Failure to meet Good Manufacturing Practices”.


The most common calibration-related warning letters

This article is not about all the warning letters, only the ones that are somehow related to calibration. Needless to say, I am mostly interested in calibration-related topics 😊 

So, I investigated all the warnings for the last three years: 2020, 2021, and 2022. There have been almost 2,000 warning letters issued during those three years!

If we look at how many warning letters are related to calibration, we can see that it is actually quite a modest share of the warnings. It seems that about 2% of the warning letters include comments on calibration.

While most of these companies are located in the US there are also some from South America, Europe, and Asia.

Obviously, some of the generic warnings can also have calibration included, although not separately mentioned. These include maintenance-related issues, written procedures missing, and issues with data records and manufacturing controls, to mention a few. 

I analyzed all the warning letters that have calibration mentioned in them. Let’s look at what kind of calibration-related topics seem to be the most common.

I grouped the calibration-related warnings into the following categories and their share is mentioned in the below list:

  • Inadequate calibration program: 33%
  • Failed to routinely calibrate instruments: 19%
  • Lack of calibration records: 16%
  • Use of non-calibrated calibration instruments: 11%
  • Insufficient calibration to prove required accuracy: 5%
  • Test equipment not calibrated for the required range: 5%
  • All others: 11%

 

The image below illustrates the most common calibration-related warnings.

Calibration related warning letters

 

The top three reasons for calibration-related warnings – and how to avoid them

Let’s look at the top three reasons and how to avoid them.

1. Inadequate calibration program

As we can see, the most common reason is “Inadequate calibration program”, which accounts for a third (33%) of the cases. In these cases, the company has not had an adequate calibration program that would document how and when each instrument should be calibrated. 
Creating and documenting a calibration program is a fundamental job for calibration-responsible staff. It naturally gets a bit more demanding to make sure it is “adequate” for the FDA auditor and fulfils all the relevant FDA regulations.


2. Failed to routinely calibrate instruments

The second most common reason is “Failed to routinely calibrate instruments”, with 19% of the cases. In these cases, the company has simply not calibrated all instruments as they should have done.
The best cure for this one is to make sure you have an automated system that alerts you when instruments should be calibrated and ensures that all calibrations are done.


3. Lack of calibration records

The third most common reason is “Lack of calibration records”, with 16% of the cases. This means the company has no evidence that calibration is being done. This one is quite similar to the previous type of case, but in these cases the company has somehow been able to convince the auditor that they have done the calibration but they don’t have records to prove it, such as calibration certificates.
The cure for this one is to make sure your calibration system stores all calibrations digitally so you can pull out the records easily any time an auditor wants to see them.

 

How Beamex can help

We have worked and partnered with many of the top pharmaceutical companies for a long time. Many of the world’s leading pharmaceutical and life sciences companies rely on the Beamex calibration ecosystem, which is designed to help customers achieve their calibration-related goals in line with regulations, like those from the FDA.

A lot of functionality developed for our calibration ecosystem has been to meet the requirements of pharmaceutical companies. 

For pharmaceutical companies we offer, for example:

  • Calibration management software with many features developed especially for pharmaceutical companies 
  • Calibration equipment for field and workshop calibration
  • Various expert services tailored for pharmaceutical companies

If you are in the pharma business, contact us to learn more about how we can help you to meet the calibration-related FDA requirements: 

Contact Beamex pharma calibration experts!

 

Related reading

If you found this article interesting, you may also like these:

View Beamex pharmaceutical industry page >>

 

Download your copy of the Calibration Essentials Software eBook to learn more about calibration management and software:

Calibration Essentials- Software eBook

 

 

Topics: Calibration in pharmaceutical industry

Working workshop wonders with Endress+Hauser [Case Story]

Posted by Tiffany Rankin on Oct 19, 2022

 

Equipping a full-service calibration workshop for Endress+Hauser in Canada

A customized Beamex solution including hardware and software sits at the heart of Endress+Hauser’s state-of-the-art Customer Experience Centre in Ontario. In this short blog we explore the critical role Beamex solutions are playing as part of the center’s full-service calibration laboratory.

Endress+Hauser is a global leader in measurement instrumentation, services, and solutions for industrial process engineering customers from a wide variety of industries including life sciences, mining, chemicals, food and beverage, and water and wastewater treatment.

Read the story in full >>

 

The best of the best in calibration technologies

In 2021 Endress+Hauser Canada opened its state-of-the-art Customer Experience Centre in Burlington, Ontario. The center is home to a full-service calibration laboratory that brings together the best of the best in calibration technologies, reflecting Endress+Hauser’s own exacting standards.

The company has been collaborating with Beamex since 2015, so when it came to equipping the new laboratory, Beamex was the natural choice to provide the necessary high-performance calibration equipment.

The Beamex Canada team worked with colleagues at the Beamex HQ in Finland to design a custom solution for Endress+Hauser Canada comprising:

 

Here's a very short video summary of the story:

 

Speed and efficiency get a boost

Endress+Hauser Canada’s calibration needs typically involve pressure and temperature calibrations. With the Beamex solution enabling fully automated calibration, pressure calibrations take just 30 minutes instead of 45 minutes or even an hour. This means more calibrations can be done in the same amount of time and frees up technicians to work on other tasks while calibrations are being performed.

Martin Bedard, Calibration and Program Supervisor for Endress+Hauser Canada: “The quality of Beamex equipment is higher than that of the competition, and the customer service is also very good. The Care Plans give us a great customer experience, with a turnaround time to Finland of just five to seven days, which is often faster than using local laboratories.”

 

Download the full customer success story

 

220429-Endress_Hauser-405

 

Related content

 

Topics: Workshop calibration, Calibration, Calibration process, Case Story

Ensuring sustainable management of process safety for chemicals

Posted by Monica Kruger on Sep 22, 2022

Ensuring sustainable management of process safety for chemicals

In the chemicals industry, safety is priority number one. But how do you ensure safety in a sustainable way? When it comes to calibration, the answer is a modern, digitalized, and automated solution.

There’s a reason safety is so important in the chemicals industry. If something goes wrong, it’s not just an issue for the plant and its employees – it can also impact people living in the surrounding area. This is one of the reasons that chemicals are so strictly regulated. 

Chemical plants need to maintain strict quality management and hold detailed product information. Chemical process companies must be able to capture data from operational processes accurately in order to be prepared for product recalls. In case of audits, all of this data must be easy to find.

Learn more by downloading the full article here:

Improving efficiency in the chemical industry

 

How automation helps

This is where automated and digitalized calibration solutions come into play. All of the instruments that are part of this safety process need to be accurately calibrated to ensure they’re working properly. However, in many plants this process still relies on paper certificates. While paper may feel reliable and tangible, there is a substantial risk of human error in the process. Each calibration typically has 20 data points or more, so even if the error rate for writing down results is only 1%, this means one in every five certificates is likely to contain faulty data. 
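
To see where that one-in-five figure comes from, here is a quick back-of-the-envelope check (a minimal sketch in Python, assuming 20 data points per certificate and an independent 1% error chance per hand-written point):

    # Probability that a hand-written certificate contains at least one error,
    # assuming each data point has an independent 1% chance of being recorded wrong.
    points_per_certificate = 20
    error_rate_per_point = 0.01

    p_at_least_one_error = 1 - (1 - error_rate_per_point) ** points_per_certificate
    print(f"{p_at_least_one_error:.1%}")  # ~18%, i.e. roughly one certificate in five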

With automated calibration, results are captured automatically in a digital format and sent securely to the calibration management system. This gives 100% accurate, tamper-proof results. Even better, finding certificates is as simple as performing an online search instead of manually looking through mountains of binders full of paper.

 

A repeatable process brings sustainability

Another advantage of automated calibration is repeatability, which improves business sustainability. One challenge chemical plants face is the changing skill sets of the technicians who perform the calibrations. When this is combined with out-of-date test and measurement equipment, there is a genuine risk of instruments drifting out of tolerance.

Automated calibration helps solve this problem. Instead of varying in quality, every calibration is performed to the same high standard because the calibrators can offer step-by-step guidance to technicians. The process is also faster – by cutting manual steps such as filling in paper certificates or entering results into a database back at the office, technicians can save 50% of the time needed for calibrations.

 

Ensuring safety

The repeatability, reliability, and accuracy of automated calibration also means better safety. This is because chemical plants can be sure that instruments critical for process safety are reliably calibrated and within tolerance. Well-designed calibration systems also enable technicians to include checklists as part of the calibration procedure – which is critical in ATEX environments and other scenarios where very clear procedures need to be followed to ensure safety.

 

Beamex calibration ecosystem

Beamex offers an automated calibration ecosystem that is composed of a unique mix of expertise and technology to help chemical companies sustainably improve process safety. Beamex solutions provide accurate measurements, reliable data, and traceability, which helps to reduce uncertainty and errors in calibration data. The end result is consistent calibration quality at your chemical plants.

Learn more about Beamex products and services on our website or contact your local Beamex representative:

 

Visit Beamex website

Contact Us

Beamex Worldwide Contacts

 

Related chemical articles

 

Related blog posts

Download your copy of the Calibration Essentials Software eBook to learn more about calibration management and software.

Calibration Essentials- Software eBook

Topics: Calibration software, Calibration management, Calibration in chemical industry

What is operational excellence and how can calibration help achieve it?

Posted by Heikki Laurila on Aug 31, 2022

What is operational excellence?

In this article, we’re going to take a closer look at a topic that’s talked about a lot but isn’t always that clear: operational excellence. We’ll briefly discuss the history of the concept, what it means in practice, and how it applies to process industries – including the many benefits it can unlock. We’ll also set out how calibration can play a role in enabling operational excellence in your process industry plants. Read on to find out more.

 

Table of contents

 

What is operational excellence?

As defined by the Institute for Operational Excellence, operational excellence is achieved when “each and every employee can see the flow of value to the customer and fix that flow before it breaks down.” What this means in practice is that a company with operational excellence at its core is able to provide the best possible value to their customers. To do this, the focus is on the quality of the product or service and the process for creating and delivering it to the customer.

Operational excellence applies to every level of an organization and empowers people at all levels to make changes to ensure the proper process flow is continuously improved and does not break down. This leads to better execution of a company’s strategy, unlocking the benefits of operational excellence – including improved quality, efficiency, and revenue.

 

A brief history of operational excellence

The model of operational excellence was created by Dr. Joseph M. Juran in the 1970s when teaching Japanese business leaders about how to improve quality. His methods were further expanded in the 1980s in the US in response to the “quality crisis” where US companies were losing market share to Japanese companies.

The concept has continued to develop and now operational excellence encompasses methodologies like lean manufacturing and Six Sigma to help bring about the desired state of value flow with a focus on quality.

 

Operational excellence principles

There are several approaches that can be used to achieve operational excellence. We will look at two of the main ones, the Juran model and the Shingo model, as they both offer useful insights. The Juran model (named after the creator of operational excellence) has five components that help build operational excellence in an organization. These are:

  • Understanding the guiding principles that lay the foundation for excellence, which includes embracing quality
  • Improving the customer experience
  • Creating an infrastructure that engages employees to make improvements by using the right methods and tools
  • Creating process improvement teams to drive process efficiency
  • Ensuring leadership and workforce engagement

 

The Shingo model, created by Dr. Shigeo Shingo (a Japanese engineer who is considered one of the foremost experts on manufacturing practices), is based on ten core principles. These are:

  • Principle #1: Respect every individual – employees at all levels of an organization should feel empowered to make changes and improvements.
  • Principle #2: Lead with humility – leaders should inspire employees to undertake and execute critical tasks.
  • Principle #3: Seek perfection – even though it’s not possible to be perfect, the organization should always strive to improve in order to avoid complacency.
  • Principle #4: Embrace scientific thinking – the scientific method should be used to examine and improve operational processes.
  • Principle #5: Focus on process – process is the key to creating value flow from company to customer; by focusing on the process you can see if it is performing as it should be.
  • Principle #6: Assure quality at the source – quality is the key focus, and should be an integral part of all activities.
  • Principle #7: Improve flow and pull – companies can maximize the flow of value through efficient processes that minimize waste.
  • Principle #8: Think systematically – the entire system should be seen as one flow where all departments are working together to create customer value.
  • Principle #9: Create constancy of purpose – through clear company goals and a vision with a clear target.
  • Principle #10: Create value for the customer – this is the key takeaway. The business exists to bring value to the customer.

 

These ten principles underlie the four key areas needed to enable operational excellence in an organization: cultural enablers, continuous improvement, enterprise alignment, and results.

 

Operational excellence methodologies

The methodologies for achieving operational excellence include lean manufacturing, Six Sigma, and kaizen.

The core idea behind lean manufacturing is cutting waste, resulting in more efficient processes. Doing so requires the following steps to be taken:

  • Specifying the value wanted by the customer
  • Identifying the value stream for each product and finding wasted steps
  • Creating continuous flow for the product along all steps
  • Introducing pull between all the steps to enable flow
  • Striving for perfection to reduce the number of steps needed and how long each step takes

 (Source: Lean Thinking, Womack and Jones, 2003)

 

Six Sigma, which was first introduced at Motorola, aims to improve quality by identifying areas of a process where defects may occur and removing them. This is done through systematic quality management. Lean manufacturing and Six Sigma have also been combined into Lean Six Sigma, which brings the focus on process, flow, and waste together in one system.

Kaizen, often translated as “continuous improvement”, is a Japanese methodology focused on making continual incremental changes with the goal of improving processes. Improvements can be suggested and implemented by any employee across the organization. The basic idea is that no process is ever perfect and thus can always be improved by making gradual changes.

All of these methodologies have a focus on quality and process while eliminating waste, helping to create operational excellence in an organization.

 

The benefits of operational excellence 

The benefits of achieving operational excellence are many. The number one benefit is that it enables an organization to achieve concrete business results more quickly. This is because employees at all levels of an organization are able to make decisions and execute changes that result in better value flow to the customer – speeding up improvements and ensuring constant creation of value. An operational excellence mindset, with its focus on flow and value, can also lead to better quality, efficiency, on-time delivery, and overall profitability.

 

Best practices and how to achieve operational excellence

Achieving operational excellence is a multistep process that requires effort from all levels of an organization. 

  •  Having and communicating a clear strategy that is based on goals and key performance indicators is critical. 
  •  Choosing and implementing the right methodology for your goals – such as lean manufacturing, Six Sigma, or Lean Six Sigma – helps to ensure your focus is on quality and reducing waste. 
  •  Training and education is needed to help employees understand their role in achieving operational excellence. 

 

Working with an expert who understands operational excellence and how to roll it out can also be helpful, as is looking at successful industry case studies. Some major companies using operational excellence are: 

 

What does operational excellence mean for process industries?

The focus on quality and flow that operational excellence unlocks is absolutely critical for process industries. After all, process industries have to manufacture products for customers to exacting quality standards. Efficiency of operations, along with safety, is also key. By helping to ensure quality and efficiency with a smooth flow of value across all processes, operational excellence helps production plants to be more profitable and resilient.

The benefits of operational excellence for process industries include:

  • Better process efficiency from fewer steps and less waste
  • Better profitability through lower expenses
  • More consistent production through a focus on quality
  • More resilient plants from optimized processes
  • A decreased risk of shutdown from optimized processes

 

 

How calibration can help enable operational excellence

In process industries, calibration plays an important role in operational excellence. A good calibration process ensures processes work as designed and plays an important role in ensuring the quality of the end product. The efficiency of the calibration process is an important element of overall operational efficiency and greatly depends on the type of calibration process.

 

What is calibration?

Before discussing how calibration can contribute to improved operational excellence, let’s very briefly summarize what calibration is. Calibration is a documented comparison of the device to be calibrated against an accurate traceable reference device (often referred to as a calibrator). The documentation is commonly in the form of a calibration certificate.

Unbroken and valid traceability to the appropriate national standards is important to ensure that the calibration is valid. As each calibration is only valid for a limited period, regular recalibration of all the standards in the traceability chain is required.

It is vital to know the uncertainty of the calibration process in order to judge whether the calibration result was within the set tolerance limits – in other words, whether it was a pass or a fail. Learn more about what calibration is.
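
As a simple illustration of that pass/fail judgment, here is a minimal sketch (the readings, tolerance, and the conservative decision rule of adding the uncertainty to the error are illustrative assumptions, not a requirement of any particular standard):

    # Hypothetical pass/fail check for a single calibration point.
    reference_value = 100.00     # kPa, from the traceable reference (calibrator)
    instrument_reading = 100.08  # kPa, from the device under calibration
    tolerance = 0.20             # kPa, allowed error for this instrument
    uncertainty = 0.05           # kPa, total uncertainty of the calibration process

    # A conservative approach: add the uncertainty to the measured error
    # before comparing against the tolerance limit.
    error = abs(instrument_reading - reference_value)
    passed = (error + uncertainty) <= tolerance
    print(f"error = {error:.2f} kPa, pass = {passed}")  # -> error = 0.08 kPa, pass = True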

 

Reasons for calibrating

Aside from enabling operational excellence, there are various reasons to perform calibration. All measurement instruments drift over time, meaning their accuracy deteriorates and regular calibrations are required. In the process industry, this fact is directly linked to the quality of the end product. In many industries, such as the pharmaceutical industry, regulatory requirements set tight rules for the calibration of critical process instruments. Likewise, quality systems set requirements for calibration.

As with many other things, money is also an important reason. In many cases money transfer depends on measurements, so the accuracy of the measurements directly affects how much money changes hands. In some processes, the safety of both the factory and its employees, as well as that of customers or patients who use the end product, can be the main driver for calibration.

 

Calibration interval

To maintain the traceability of all your process measurements, a valid unbroken traceability chain needs to be maintained. This means regular recalibrations at all levels of the traceability chain –  not only all the process measurement instruments, but also the working standards and reference standards (or calibrators).

Finding the proper calibration interval is important. If you calibrate too often, you end up wasting resources. But if you calibrate too infrequently, you face the risk that instruments will drift outside of set tolerances – and in many cases that can have serious consequences.

This means companies are constantly balancing risk against wasted resources. A proper analysis of calibration history and calibration intervals is key, and finding the right sweet spot contributes directly to operational efficiency.
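
As a purely illustrative sketch of how calibration history can guide that balancing act (the rule of thumb, thresholds, and data below are hypothetical examples, not a Beamex algorithm):

    # Hypothetical rule of thumb: shorten the interval after failures, cautiously
    # extend it when every recent calibration was comfortably within tolerance.
    def suggest_interval(current_months, history):
        """history: list of (passed, drift_as_fraction_of_tolerance) tuples."""
        if any(not passed for passed, _ in history):
            return current_months * 0.5           # failures -> calibrate more often
        worst_drift = max(drift for _, drift in history)
        if worst_drift < 0.3:                     # well within tolerance every time
            return min(current_months * 1.5, 36)  # extend, capped at 36 months
        return current_months                     # otherwise keep the interval

    print(suggest_interval(12, [(True, 0.15), (True, 0.20), (True, 0.10)]))  # -> 18.0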

 

Digitalizing, streamlining and automating the calibration process – finding a better way

When we realize calibration’s role in operational excellence, we understand the importance of making calibration processes more efficient – how can we produce less waste and do more with less?

At many industrial sites, thousands upon thousands of calibrations are carried out annually. Streamlining those processes and saving time on every calibration can save a huge amount of money and have a big impact on the bottom line.

One of the main opportunities for time saving is to ditch manual calibration processes – typing or using pen and paper to document things – and instead move to a modern digitalized world where the calibrator automatically stores the calibration results in its memory, from where they can be digitally uploaded to calibration management software. Not only does this digitalized and paperless calibration process save a lot of time, it also eliminates all the errors related to manual data entry. Digitalization also dramatically improves the quality of calibration data. And given that analysis and decisions are based on data, it’s clear that data should be of high quality.
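
To make the paperless flow a little more concrete, here is a minimal sketch of what a digitally captured calibration result could look like when it is read from the calibrator’s memory and passed on to calibration management software (the field names and JSON format are illustrative assumptions, not the actual Beamex data model):

    import json
    from datetime import datetime, timezone

    # Hypothetical calibration record captured automatically by the calibrator
    # instead of being written down by hand.
    record = {
        "instrument_tag": "TT-101",   # illustrative tag name
        "calibrated_at": datetime.now(timezone.utc).isoformat(),
        "points": [
            {"reference": 0.0, "reading": 0.02, "unit": "°C"},
            {"reference": 50.0, "reading": 50.05, "unit": "°C"},
            {"reference": 100.0, "reading": 100.03, "unit": "°C"},
        ],
    }

    # "Uploading" is shown here simply as serialization; in practice the calibrator
    # and the calibration management software handle the transfer between them.
    print(json.dumps(record, indent=2, ensure_ascii=False))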

Streamlining calibration processes with the help of digitalization is one major contributor to operational excellence. As with any process, improving operational excellence means a constant quest to find better ways of doing things. If the calibration processes are very outdated – relying on manual documentation and lacking automation – it’s possible to make a major leap by moving to digitalized and automated processes. After that is done, the next step is to keep finding small improvements.

Finding best practices and consistency in calibration processes and methods is important, and you should constantly work to evolve and improve these methods over time. This matters even more, and has a bigger impact, in large multi-plant organizations where processes need to be kept uniform.

Make sure you leverage automation in calibration whenever possible – it is a great way to improve efficiency. Consistent, automated processes also improve data quality by eliminating the risk of human error, and they make it quicker and easier for new employees to get up to speed while maintaining a high quality of work.

 

Conclusion 

In summary, operational excellence is an organizational mindset based on set principles and methodologies that aims to improve the flow of value to a customer. A focus on quality and eliminating waste results in greater efficiency and profitability in process industries. Calibration can help unlock operational excellence by moving to a modern digitalized process that reduces the time needed for calibrations and improves data quality. Better data can be analyzed to find further efficiency improvements, not just for the calibration process but also for production plant processes. The end result is improved operational excellence for process industries.

 

 

Experience a better way for your business with Beamex

Beamex offers a calibration ecosystem – a unique combination of calibration technology and expertise – that helps you improve efficiency, ensure compliance, increase operational safety, and improve operational excellence.

Please contact us and let our experts help you find a calibration system that improves your operational excellence:

Contact Us

 

 

You might also like

 

Download your copy of the Calibration Essentials Software eBook to learn more about calibration management and software.

Calibration Essentials- Software eBook

 

 

Topics: Calibration, Calibration process, Digitalization, Operational Excellence

How automated calibration can help service companies be more competitive

Posted by Monica Kruger on Jul 06, 2022

How automated calibration can help service companies be more competitive

Service companies that perform calibrations for the process industries operate in a challenging environment. Not only is there a lot of competition, but contracts for customers are based on estimates, meaning that every additional hour of work directly affects the bottom line. Finding and retaining skilled calibration technicians is also a challenge.

So, what can service companies do to make their quotations more accurate and ensure work is carried out as consistently and efficiently as possible? The answer is to automate the calibration process.

 

The problems with pen-and-paper calibration

To see why automation helps, first we need to look at the way calibrations are often still conducted using pen and paper. Paper-based calibrations are time consuming, with 40–50 data points needing to be filled in by hand for each calibration. Because it relies on manual data entry, paper-based calibration is also prone to errors. It’s commonly accepted that the typical error rate in manual data entry is around 1%. While this might not sound like a lot, it can have major implications for the accuracy of the calibration process: with 40–50 hand-written data points per certificate, roughly every second certificate could contain an error. 

Paper certificates also negatively affect transparency – when using them, it’s hard to share calibration results with end customers in a timely fashion. Paper certificates also require warehousing and are not easy to find when an audit is required – let alone if the client wants to use the calibration data to improve their process efficiency through trend analysis.

 

You can go paperless today

Beamex’s automated calibration solution combines software, hardware, and calibration expertise to deliver an automated, paperless flow of calibration data with a minimal requirement for manual data entry. The major benefit is that an automated process cuts the number of steps involved, potentially saving up to 50% of the time a calibration takes. Even shaving just 15 minutes off the time needed to perform a calibration, plus another 15 minutes from not having to manually enter results into a database, adds up to huge time savings.

In addition to saving time and enabling a more efficient process, automated calibration helps to avoid mistakes typically associated with manual data entry – thus improving the quality and integrity of the calibration data and making sure your customers are happy with the work you’re doing for them.

Modern multifunction calibrators from Beamex also provide user guidance so that even less experienced technicians can carry out calibrations quickly and reliably. Because the process is highly repeatable, making quotations becomes easier as you will know how much time is needed for each calibration.

Finally, with automated calibration you can offer your customers new services based on data analysis. Because all the calibration data is in a digital format and easily searchable, you can analyze your customers’ calibration processes and data to provide improvement recommendations – differentiating your service company offering.

 

Example of ROI calculation

The average cost to a service company for an instrument technician is around €50 per hour, including salary, benefits, overheads, and so forth.

If a technician carries out 2,000 calibrations a year and it takes them on average 15 minutes to write up a calibration certificate for each calibration, then writing certificates costs a service company €25,000 per year per technician.

Assuming it takes another 15 minutes to manually enter that data into the database, then entering data costs another €25,000 per year per technician.

Automating this process would save 1000 hours of work per year per technician and result in significant cost savings.
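
The same arithmetic written out as a short script (a sketch using the example figures above – €50 per hour, 2,000 calibrations, and 15 + 15 minutes of manual work per calibration; your own numbers will differ):

    # Rough estimate of what manual certificate writing and data entry cost per year.
    hourly_cost = 50              # EUR per technician hour (example figure)
    calibrations_per_year = 2000
    minutes_writing = 15          # writing the paper certificate
    minutes_data_entry = 15       # typing the results into a database

    hours_saved = calibrations_per_year * (minutes_writing + minutes_data_entry) / 60
    savings = hours_saved * hourly_cost
    print(f"{hours_saved:.0f} hours and {savings:,.0f} EUR saved per technician per year")
    # -> 1000 hours and 50,000 EUR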

 

automated calibration solution for service companies

 

Customer success testimonials

"Beamex's calibration solution is an ideal match with our needs as a service company. We now have a paperless process that increases our technicians’ productivity without sacrificing accuracy, so we can provide a leaner, more efficient service and our clients can expect certificates as soon as the work is completed. The support from Beamex means we can rely on everything to work as expected and provide our customers with the best possible service."  

Richard O’Meara, Contracts Manager, Douglas Calibration Services, Jones Engineering Group

 

"The Beamex Integrated Calibration Solution has allowed us to save up to 30% of the time spent on calibrations and the production of verification reports, while also giving us the option of editing standardized and personalized verification report frames to meet the specific requirements of our customers. By automating calibration routines, our technicians can focus on other tasks while still following procedures and ensuring the integrity of calibration results. Therefore, the technicians are more relaxed and our customers are more confident, especially since the process is fully digitalized and there is no risk of errors during data collection."

Laurent Flachard, Lifecycle Field Leader, Emerson, France

 

 

Read more about the benefits of automated calibration for service companies in our white paper. Download your PDF copy here:

How service companies can create a competitive advantage - Beamex White Paper

 

Beamex calibration solutions

Learn more about Beamex products and services on our website or contact your local Beamex representative:

 

Visit Beamex website

Contact Us

Beamex Worldwide Contacts

 

Related articles

If you found this post interesting, you might also like these service company-related articles:

 

Related blog posts

Other blog posts we would like to suggest to you: 

 

Download your copy of the Calibration Essentials Software eBook to learn more about calibration management and software

Calibration Essentials- Software eBook

 

Topics: Calibration, Calibration process, Calibration software, Calibration management

Stepping on the gas with the UK’s National Grid [Case Story]

Posted by Rita Patel on May 30, 2022

Stepping on the gas with the UK’s National Grid - Beamex blog post

 

Beamex’s automated, paperless calibration solution has helped National Grid streamline its calibration processes and save a huge amount of time and money in the process. In this blog we take a look at what can be achieved with a combination of the right hardware, software, and expertise.

Read the story in full >>

 

National Grid owns and operates Great Britain’s gas national transmission system (NTS). A critical part of this network is its 25 gas compressor stations, from where gas is fed to the eight distribution networks that supply domestic and industrial consumers across Britain. The volume of calibration data these stations generate is huge, with everything from pressure and temperature switches to flow transmitters and vibration sensors requiring regular calibration to ensure accuracy and reliability. 

But National Grid was facing some challenges:

  • islands of data siloed across disparate systems, making it difficult to monitor and accurately assess asset performance
  • no established, standardized process for recording and storing calibration data
  • no commonality in terms of the calibration hardware and software being used across the different stations.

“It was very challenging for us to build a true picture of how our assets were operating, optimize our ways of working, and build business cases for investment,” says Andy Barnwell, Asset Data & Systems Specialist, National Grid. “We were dealing with individual databases and time-consuming calibration processes with multiple steps.”

 

Here's a very short video summary of the story:

 

Automated, paperless calibration to the rescue 

Based on our knowledge of National Grid’s assets and operational process for calibration, Beamex was able to deliver a fully automated and integrated calibration solution that would improve both access to and visibility over asset data through a centralized database.

This package comprised the Beamex MC6-Ex Intrinsically Safe Field Calibrator and Communicator and Beamex CMX Calibration Management Software.

 

Sometimes it’s good to put all your eggs in one basket 

Centralizing all their asset data in a single system would provide National Grid with the ability to thoroughly interrogate their assets and make informed decisions about maintenance procedures and schedules. With the Beamex solution in place, National Grid have been able to cut the number of steps needed to perform a calibration, saving 15 minutes per device. This adds up to a time saving of over 4,000 hours per year – and a financial saving that runs into millions of pounds.  

National Grid plan to further expand their use of the CMX system to include the execution of maintenance inspection tasks using the Beamex bMobile application. “We need to make sure technicians’ work is compliant with our policy and procedures. When everyone is following an established, standardized process and all the information is kept in one place, their work is faster, the data it generates is far more reliable, and we can make better decisions about how to manage our assets,” explains James Jepson, Control & Instrumentation Systems Officer at National Grid.

 

A bright future ahead for a constantly evolving partnership

The Beamex solution has not been a hard sell at the compression station sites. “Beamex was very proactive in organizing online training for our teams, but the uptake was less than we expected because they are so easy to use that instead of asking basic questions during the training, our technicians were teaching themselves and quizzing the Beamex team on some fairly in-depth issues instead,” James Jepson says.

There is plenty more to come from this ever-evolving relationship, as Andy Barnwell explains: “The great thing about the Beamex solution from a development perspective is that it’s flexible and offers us a lot of options. We’ll be looking at how to further integrate Beamex solutions into our systems landscape and take advantage of even greater asset management functionalities as and when they are developed in collaboration with Beamex.”

“Automation is the future, and I can see a not-too-distant future when we will have a Beamex solution that will allow us to do everything remotely while still performing periodic on-site spot checks with highly accurate portable devices. The sky really is the limit,” James Jepson concludes.

 

Download the full customer success story

 

Read more Case Stories

To read more case stories like this, click the link below:

Read more case stories >>

 

Products highlighted

Learn more about the products highlighted in this story:

Beamex CMX Calibration Management Software >>

Beamex MC6-Ex intrinsically safe advanced field calibrator and communicator >>

Beamex bMobile calibration application >>

View all Beamex products >>

 

 

 

Topics: Calibration process, Case Story

Calibration Management and Software [eBook]

Posted by Heikki Laurila on Apr 25, 2022

Calibration software ebook Beamex

 

In this blog post, we want to share with you an educational eBook focusing on calibration management and calibration software.

This eBook is a handy collection of several software-related articles, some of which have been posted earlier on the Beamex blog.

Just give me the eBook now! >>

  

What you'll learn in this eBook

  • Why use software for calibration management
  • How calibration documentation has evolved
  • How software solves the problem of manual data entry errors
  • Why data integrity matters in calibration processes
  • The benefits of connected calibration maintenance management systems
  • How to automate your calibration management ecosystem
  • How an integrated calibration solution helps you do more with less

 

View a sample of the eBook before downloading >>

 

Download the eBook in pdf format by clicking the below button:

Download the software eBook here!

 

More calibration eBooks from Beamex

If you like to learn by reading eBooks, here are a few other recent calibration-related eBooks:

View all our eBooks and white papers >>

 

Beamex - your partner in calibration excellence!

Please take a look at our offering for calibration management software on our calibration software page.

Contact us to discuss how we can be your partner in calibration excellence!

 

 

Topics: Calibration software, Calibration management

Understanding Pressure Calibrator Accuracy Specifications

Posted by Heikki Laurila on Mar 21, 2022

Understanding pressure calibrator accuracy specifications.

 

Comparing the accuracy specifications of pressure calibrators can be a challenging task because different manufacturers specify accuracy in different ways. This means that you can’t simply compare the numbers given in the specification – you need to understand how these numbers are calculated and what they mean in practice.

In this blog post, I look at the different ways pressure calibrator accuracy specifications are presented, explain what they mean, and compare them, as well as take a brief look at what else you should consider when choosing a pressure calibrator.

 

Table of contents

Accuracy specifications

1. Percent of full scale

2. Percent of span

3. Percent of reading

4. A combined accuracy

5. A split range accuracy

Other things to consider

Long-term stability

Uncertainty vs. accuracy

TAR & TUR vs. calibration uncertainty

Environmental specifications

Additional components

Finally, it's not only about accuracy

Beamex solutions for pressure calibrations

Related blog posts

 

Download a pdf white paper of this article >>

 

Accuracy specifications

First, let’s look at the different ways accuracy specifications are provided by manufacturers and how to interpret them.

 

1. Percent of full scale

Percent of full scale (sometimes written also "% of full scale", or "% FS") is one of the most common ways to specify pressure measurement accuracy, and many process instruments use this kind of accuracy specification.

As the name suggests, you calculate the given percentage value from the full scale of the pressure range, with full scale being the maximum pressure the module can measure.

With percent of full scale, measurements have the same (absolute) accuracy (or error) throughout the whole range. This specification is obviously an easy one to calculate and understand.

It is best suited to technologies where the zero and full scale have a similar likelihood for error or drift, and where it is not possible for the user to easily make a zero correction during normal usage.

With most modern electrical pressure measurement devices, the user can zero the pressure measurement by opening it to atmospheric (ambient) pressure and performing a zeroing function. This makes it easy to correct any zero errors before and after a measurement is taken. Therefore, % FS is not the most suitable accuracy specification for modern electrical pressure measurement equipment.
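
As a simple illustration of that zero correction (a minimal sketch – the offset and readings are made-up numbers, not from any particular device):

    # With the pressure port open to atmosphere, a gauge-pressure reading should be
    # zero, so whatever the device indicates there is treated as the zero offset.
    zero_offset = 0.03            # kPa indicated while open to atmosphere

    def corrected(reading_kpa):
        return reading_kpa - zero_offset

    print(f"{corrected(150.03):.2f} kPa")   # -> 150.00 kPa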

 

Example

For clarification, let’s look at some examples with graphs for all the different specification methods, starting with the "percent of full scale" method.

  • Pressure range: 0 to 200 kPa
  • Accuracy specification: ±0.05 percent of full scale (%FS)

As we can see in the first image below, the accuracy specification is a flat line and remains the same in engineering units (0.1 kPa) throughout the pressure range whether we use %FS or kPa on our Y-axis.

But if we look at the accuracy specification as the accuracy of the measured pressure point (or accuracy as a "percent of reading" value), then the situation is different as the second graph below shows.

 

Percent of full scale accuracy specification.

 

Percent full scale with percent reading accuracy

 

The graph above shows the accuracy as a percent of reading on the Y-axis. This illustrates what happens in practice when you measure a certain pressure with this kind of module – how accurate that measurement is compared to the pressure being measured.

We can see that the error relative to the actual measured pressure increases quickly when measuring pressures below full scale. A %FS-specified pressure measurement should therefore mainly be used at pressures close to the upper end of the module’s range, as it loses relative accuracy quickly at lower pressures. If you measure very low pressures, the relative error of the measurement can be huge.

For example, when measuring a pressure in the middle of the range (at the 50% point), the error relative to the reading is already double that at the full scale point. When measuring at the 25% point of the range, it is quadrupled!

If your pressure modules have a %FS accuracy specification, you end up needing several modules with different ranges, as the accuracy deteriorates quickly when measuring lower pressures.
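
The numbers behind this can be reproduced in a few lines (a sketch using the example’s 0 to 200 kPa range and ±0.05 %FS specification):

    # Percent-of-full-scale error is constant in kPa but grows as a share of the reading.
    full_scale = 200.0            # kPa
    spec_fs = 0.05 / 100          # +/-0.05 % of full scale

    abs_error = spec_fs * full_scale                # constant +/-0.1 kPa everywhere
    for pressure in (200.0, 100.0, 50.0, 20.0):
        pct_of_reading = 100 * abs_error / pressure
        print(f"{pressure:6.1f} kPa -> +/-{abs_error:.2f} kPa = +/-{pct_of_reading:.2f} % of reading")
    # At 100 kPa the relative error has doubled to 0.10 %, at 50 kPa it has quadrupled to 0.20 %.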

 

Accuracy expressed in ppm or engineering units

These two methods are very close to the percent of full scale method.
Sometimes the accuracy can be expressed in ppm (parts per million) of the full scale. Obviously, as the percentage is 1/100 and ppm is 1/1 000 000, there is a multiplier of 10 000 between the two.

For example, 0.05 %FS equals 500 ppm FS, so it is very similar to the %FS way of expressing accuracy. Of course, ppm can also be used for reading error, but more on that later.

Sometimes accuracy is also expressed in engineering units. For example, in the above example, the accuracy could have also been expressed as ±0.1 kPa, instead of ±0.05 %FS.

 

2. Percent of span

Percent of span is similar to the percent of full-scale method, but instead of calculating the percentage from the maximum range value (full scale), it is calculated from the whole range.

Naturally, if the range starts from zero, there is no difference between %FS and percentage of span accuracy.

However, a pressure measurement range is often a “compound” range, i.e. it starts from the vacuum side and continues to the positive side. For example, the measurement range could be from -100 kPa to +200 kPa. In this case, the percentage is calculated from the whole span (300 kPa, the difference between the minimum and maximum values) instead of the full scale (200 kPa).

For a fully symmetric pressure range (e.g. -1 bar to +1 bar, or -15 to +15 psi), an accuracy specification of “±0.05 % of span” has twice the error of a “±0.05 % of full-scale” specification.

Example

  • Pressure range: -100 kPa to +200 kPa
  • Accuracy specification: ±0.05 % of span
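
Worked out numerically (a quick sketch using the figures above):

    # Percent of span for a compound range: the percentage applies to the whole span.
    range_min, range_max = -100.0, 200.0   # kPa
    spec_span = 0.05 / 100                 # +/-0.05 % of span

    span = range_max - range_min           # 300 kPa
    print(f"+/-{spec_span * span:.2f} kPa")   # -> +/-0.15 kPa across the whole range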

Graphically, the above example looks like the image below:

Percent of span pressure accuracy specification

In practice, a compound range is most often not fully symmetrical, with the positive side of the range typically larger than the vacuum side. Of course, the vacuum side can never exceed a full vacuum, but the positive side can be any size.

With a compound range, the positive side does not typically measure to very high pressure, because if a high-pressure sensor is used it will not be accurate on the vacuum range.

 

3. Percent of reading

With percent of reading accuracy specification (sometimes written "% of reading", "% of rdg", or "% rdg"), accuracy is always calculated from the measured pressure value.

With this kind of specification, the absolute size of the error (accuracy) changes as the measured pressure changes.

Obviously, this also means that at zero the accuracy specification is zero, and at very close to zero it is very small or negligible. So in practice, it is very unlikely that you will see a percent of reading specification used on its own.

Traditional dead weight testers commonly have their accuracy specified as a percent of reading. In practice, the lowest pressure that can be generated with a dead weight tester is limited by the smallest available weight, or the lowest pressure at which the accuracy specification remains valid is stated separately.

A pure percent of reading accuracy specification is not well suited to electronic pressure measurement devices or calibrators because the accuracy gets very small close to zero, and the accuracy is zero at zero pressure.

That is not practical, as there is always some noise or zero drift, so it is not realistic to provide only a percent of reading accuracy specification for electronic calibrators. If this is the only specification provided, then the range minimum, i.e. the limit below which the accuracy specification is no longer valid, should also be specified.

Percent of reading may also be given as a ppm specification. This is more practical with high-precision instruments (e.g. dead weight testers) as a percentage figure would soon start to have many zeros. As explained earlier, converting a percentage figure to ppm means multiplying the percentage by 10 000.

 

Example

  • Range: 0 to 200 kPa
  • Accuracy specifications: ±0.05 percent of reading

The graphic below has the Y-axis as "% of reading", which is obviously a straight line.

Percent of reading accuracy specification.

 

The below graphic shows a "% of reading" accuracy, with the absolute accuracy (engineering units, kPa in this case) on the Y-axis. We can see that when the pressure value is small, the absolute error is small. As the pressure increases, the absolute error increases.

percent of reading kPa on Y

 

 

4. A combined accuracy (percent of full scale and percent of reading)

This means that the accuracy specification is a combination of percent of full scale and percent of reading.

The percent values of each may be different. For example, the accuracy specification can be expressed as ±(0.01% of full scale + 0.05% of reading).

In practice this means that the "% of full scale" part ensures that there is a valid accuracy specification at zero and close to zero, while the "% of reading" part means that the absolute accuracy specification grows as the pressure grows.

This kind of specification is pretty common for electrical pressure measurement devices.

The example and graphic below illustrate this type of specification.

Example

  • Pressure range: 0 to 200 kPa
  • Accuracy specification: ± (0.01 % of full scale + 0.04 % of reading)

 

Combined accuracy specification.

 

In the above example, the combined accuracy at the full scale value is ±0.1 kPa, which is the same as for a ±0.05% of full scale specification, so the accuracy at the full scale point is identical.

However, because part of this specification is given as percent of reading, the module is considerably more accurate at lower pressures than a 0.05% of full scale module.

So this kind of pressure calibrator can perform calibrations at lower pressures without sacrificing accuracy, unlike a calibrator with only a percent of full scale accuracy specification. With this kind of combined specification you also end up needing fewer pressure modules with different ranges, as each module is accurate over a wider pressure range.
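
The combined figure can be computed with a small helper (a sketch using the ±(0.01 % of full scale + 0.04 % of reading) example above):

    # Combined accuracy: a percent-of-full-scale part plus a percent-of-reading part.
    def combined_accuracy(pressure_kpa, full_scale=200.0, pct_fs=0.01, pct_rdg=0.04):
        return (pct_fs / 100) * full_scale + (pct_rdg / 100) * pressure_kpa

    for p in (200.0, 100.0, 50.0, 0.0):
        print(f"{p:6.1f} kPa -> +/-{combined_accuracy(p):.3f} kPa")
    # 200 kPa -> +/-0.100 kPa (same as a 0.05 %FS module at full scale)
    # 50 kPa  -> +/-0.040 kPa (much better than the constant 0.1 kPa of a pure %FS spec)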

 

5. A split range accuracy 

This means that the lower part of the pressure range has a fixed accuracy (% of full scale, % of span, or engineering unit) and the upper part has a percent of reading specification.

This is another way for manufacturers of electrical calibrators to ensure that they can provide credible accuracy specifications at and close to zero, and also that the absolute accuracy specification increases as the pressure increases.

The lower part of the range may be specified as a percent of full scale, a percent of span, or a percent of a fixed reading. It may also be given in engineering units.

In practice this means that the lower part is “flat” and the upper part is growing. The example and graph below illustrate this concept.

Example:

  • Pressure range: 0 to 200 kPa
  • Accuracy specification: "0.01% of full scale" for the first third of the range plus "0.05% of reading" for the rest of the range

 

Split range accuracy specification.

 

Download a pdf white paper of this article >>

 

Other things to consider


Long-term stability

Often, the given accuracy specification is not valid for longer periods of time and does not include any long-term drift specification. Be sure to read the small print in the calibrator’s documentation to find out if this is the case.

If you calibrate the pressure calibrator once a year, for example, it is important to know what kind of accuracy you can expect from the calibrator just before the next calibration, i.e. 11.5 months after the previous one.

For electrical pressure calibrators, where the user can make a zero correction, zero drift is not a problem over time because it can simply be zeroed away.

However, the user can’t correct drift at higher pressures (span drift). Drift typically changes the accuracy at higher pressures, adding a “percent of reading” type error over time, so the full-scale span typically drifts the most.

When choosing a pressure calibrator, be sure to check its long-term drift specification.

 

Uncertainty vs. accuracy

Another vital consideration is what components the accuracy specification includes.

Some calibrators offer an uncertainty specification instead of an accuracy specification. Typically, this means that the uncertainty specification also includes the uncertainty of the reference standards used in the calibration laboratory when manufacturing and calibrating the calibrator. Also, it often specifies the validity period for the specification, for example one year.

Generally speaking, uncertainty is a more comprehensive concept than accuracy. I could write a lot about uncertainty, but for now it’s enough to mention that you should make sure that you know all the uncertainty components relevant to the whole calibration event because the total uncertainty of the calibration process is often much more than just the calibrator’s specification.

If interested, you can read more about uncertainty in the Calibration uncertainty for dummies blog.

 

TAR & TUR vs. calibration uncertainty

A commonly used criterion for calibrator (reference standard) accuracy is the test accuracy ratio or test uncertainty ratio (TAR or TUR). This is the ratio of accuracy or uncertainty between the calibrator and the instruments to be calibrated with it, and it is used to determine the level of accuracy you require from your calibrator. The bigger the ratio, the better. Common traditional industry practice is to use a 4-to-1 ratio.

Often the process instruments to be calibrated will have a percentage of full scale accuracy specification, while the calibrator may have (partly) a percentage of reading specification.

In practice, this means that the calibrator’s accuracy advantage over the process instrument grows as the measured pressure drops below full scale.

So even if the test accuracy ratio (TAR) is not big enough at full scale pressure, it gets bigger (better) as you measure lower pressures. The example below illustrates this.

The graphic below needs some explanation:

  • The blue line is the accuracy of the process instrument to be calibrated: 0.2 % of full scale (= 0.4 kPa) [left Y-axis].
  • The green line is the calibrator accuracy: 0.05 % of reading [left Y-axis].
  • The red line is the TAR (test accuracy ratio), i.e. the ratio between the two accuracies above [right Y-axis]. We can see that the TAR is 4 at the full scale value, but as the pressure gets smaller the ratio increases considerably, because the calibrator has a "% of reading" specification while the process instrument has a "% of full scale" specification.

 

TAR accuracy ratio

 

The main takeaway from the above graphic is that the TAR should be calculated at different pressure values. Even if it looks insufficient at full scale, it may be more than sufficient at lower pressures, assuming the calibrator's accuracy has at least a partial "% of reading" component.
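
The TAR figures in the graphic can be reproduced with the example numbers (a sketch: 0 to 200 kPa range, process instrument at 0.2 % of full scale, calibrator at 0.05 % of reading):

    # Test accuracy ratio (TAR) at different pressures for the example above.
    full_scale = 200.0                            # kPa
    instrument_error = 0.2 / 100 * full_scale     # 0.2 %FS -> a constant 0.4 kPa

    for pressure in (200.0, 100.0, 50.0):
        calibrator_error = 0.05 / 100 * pressure  # 0.05 % of reading
        print(f"{pressure:6.1f} kPa -> TAR = {instrument_error / calibrator_error:.0f}:1")
    # -> 4:1 at full scale, 8:1 at 100 kPa, 16:1 at 50 kPa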

Please note that a TAR only includes an accuracy comparison, which means it is missing all the uncertainty considerations. In practice, calibration processes can include other larger uncertainty sources than the calibrator, so it is important to determine the total uncertainty of the calibration process.

 

Environmental specifications

It is important to read the specifications carefully to understand which environmental conditions the given specifications are valid for. If you are performing calibrations in the field rather than in a controlled environment like a laboratory or workshop, then the conditions will vary a great deal.

Sometimes the given specifications are valid only at a specific temperature or within a limited temperature range. There can be a separate specification outside that range, or a temperature coefficient.

Other environmental conditions to consider include humidity, altitude, ingress protection, orientation effect, warm-up time, and shock/vibration. 
In summary, be sure to check the environmental specifications that are relevant for you when comparing pressure calibrators.

 

Additional components

Does the specification include all relevant components – like hysteresis, nonlinearity, and repeatability – or are these specified separately and need to be added to the total?

 

Finally, it’s not only about accuracy

Although accuracy and uncertainty are vital considerations when choosing a pressure calibrator, there are also other things to consider when selecting one, such as:

  • Does the calibrator include overpressure protection? It is difficult to avoid over-pressurizing a pressure measurement device every now and then. Some pressure calibrators have sensors that can be damaged by even the slightest amount of overpressure, while others can withstand a certain level of overpressure without damaging the sensors or affecting the accuracy. For example, the Beamex MC6 Advanced Field Calibrator and Communicator includes a built-in relief valve to release overpressure and prevent damage.
  • Does the calibrator come with an accredited calibration certificate ensuring formal metrological traceability? If not, you may need to have it calibrated separately.
  • How conservative or aggressive are the specifications? Although this can be difficult to judge, it is worth trying to find out whether the manufacturer normally gives reliable, conservative specifications or more aggressive figures.
  • The brand of the company. Who manufactures the device? Is the manufacturer reliable?
  • What are the warranty terms and conditions, and is there the option to extend the warranty and purchase a service plan to cover items that don’t fall within its scope?
  • Are there training services available? How is support arranged, and how readily is it available? How is recalibration arranged?
  • What functions other than pressure measurement does the calibrator provide that are useful for you? 
    • For example, a 24 V loop supply and accurate mA measurement are handy if you plan to calibrate pressure transmitters.
    • Most transmitters are HART enabled, so if the calibrator includes a HART communicator, you don’t need to carry a separate communicator with you.
  • Calibrations need to be documented, so how do you plan to do that? A documenting pressure calibrator will automatically document calibration results and can communicate with your calibration software, saving you time and effort and eliminating the risk of manual data entry errors.

 

Download a pdf version of this article by clicking the image below:

Understanding pressure calibrator accuracy specifications - Beamex White Paper

 

Beamex solutions for pressure calibration

Beamex offers a range of high-accuracy solutions for pressure calibration, such as:

  • Portable pressure calibrators: MC6, MC2, MC4.
  • Intrinsically safe pressure calibrators: MC6-Ex.
  • Pressure calibrators for workshop solutions: MC6-WS.
  • Automatic pressure controllers: POC8.
  • Portable electrical pressure pump: ePG.
  • Calibration hand pumps: PG series.
  • Calibration management software solutions: CMX, LOGiCAL.

Check out all Beamex pressure calibrators.

 

Related blog posts

The Beamex blog includes several posts related to pressure calibration, but if pressure is your thing, please check out at least these:



 

Topics: Pressure calibration

How to avoid safety and compliance issues in fine chemicals

Posted by Monica Kruger on Feb 08, 2022

How to avoid safety and compliance issues in fine chemicals

Safety and compliance are non-negotiable in the fine chemicals industry, which produces complex, pure chemical substances such as active pharmaceutical ingredients. Fine chemicals are batch driven, with complicated, multistage processes where accuracy and efficiency are critical. 

In this industry, one of the keys to ensuring both safety and compliance is that measurements taken throughout the production process are accurate – which can be challenging, to say the least, with paper-based calibration. Paper-based calibration is time consuming and, because it relies on manual data entry, prone to errors.

It’s commonly accepted that the typical error rate in manual data entry is about 1%, which, while it might not sound like a lot, can have huge implications for your calibration process. (Read more about this in our blog post "Manual data entry errors".)

How can automated calibration help to address these challenges?

 

Read more about the benefits of automated calibration for the fine chemicals industry in our white paper (pdf format).

How to avoid safety and compliance issues in fine chemicals


A paperless process to the rescue


Automated calibration solutions combine software, hardware, and calibration expertise to deliver an automated, paperless flow of calibration data with minimal need for manual data entry. This not only saves time and makes the process far more efficient, but it also helps to avoid mistakes typically associated with manual data entry – thus improving the quality and integrity of the calibration data. 

Furthermore, calibration results are safely stored, tamper proof, and easily accessible in the calibration software for review, for example for audit or analysis purposes.


Safety, compliance, and continuous improvement


Removing human error reduces the chance that a production batch will be rejected due to out-of-tolerance calibrators and helps to ensure compliance with regulations like GMP and 21 CFR part 11.

Automating calibration processes also brings significant financial benefits. For example, if an instrument is found to be out of tolerance, at minimum it requires that the product is quarantined and subject to risk analysis and investigation. In the worst case, the entire batch will have to be discarded, increasing waste and leading to large financial losses. 

What’s more, with calibration data in digital form rather than sitting in siloed paper archives it can be integrated with ERP systems, helping management to understand what’s going on and supporting better decision-making. And with everything in one easily accessible system, data across factories can be easily compared to spot trends and identify areas for improvement. 

It’s important to remember that any automated calibration solution should be based on a thorough analysis of your specific needs to ensure the process is well designed and error-free. This is where working with a trusted advisor who can help analyze the process and find areas for improvement really pays off.

 

Read more about the benefits of automated calibration for the fine chemicals industry in our white paper.

Download the free pdf by clicking the picture below

How to avoid safety and compliance issues in fine chemicals

 

Beamex calibration solutions

Learn more about Beamex products and services on our website or contact your local Beamex representative:

 

Visit Beamex website

Contact Us

Beamex Worldwide Contacts

 

Related fine chemicals articles

 

Related blog posts

Download your copy of the Calibration Essentials Software eBook to learn more about calibration management and software.

Calibration Essentials- Software eBook

 

Topics: Calibration software, Calibration management, Calibration in fine chemicals, Calibration in chemical industry

Calibration management - transition from paper-based to digital

Posted by Tiffany Rankin on Jan 11, 2022

Calibration management - transition from paper-based to digital - Beamex blog article

 

A Tale of Three Steves

Stephen Jerge, Calibration Supervisor for Lonza Biologics, a multinational chemical and biotechnology company, recently walked attendees of the Beamex Annual Calibration Exchange (ACE) through the project he headed to transition from a paper-based calibration management system (CMS) to an integrated, digital, paperless solution using Beamex CMX. Steve has over 30 years of calibration experience in the telecommunication and pharmaceutical industries.

Over the last 3 years, his primary focus has been contributing to Lonza’s global paperless SAP/CMX integrated solution through implementation, training, and supporting calibration operations and expansion projects.

Watch the presentation video recording now!

In this blog post, we’ll share The Tale of Three Steves as we follow his journey from 2017 Steve, a stressed-out, overworked supervisor; to 2019 Steve as he underwent the rollout of a new, automated system; to 2021 Steve, whose focus is on continuous improvement.

 

2017 Steve - Managing a paper-based process

Steve starts by bringing us back to 2017, when the calibration process was paper-based and manual. “It couldn’t be more inefficient,” notes Steve. “Everything from standards selection to approvals and calculations was done manually. All historical data was ‘safely locked up in a filing cabinet’ and not doing any good at all.”

Below is a visual representation of the paper-based process. As Steve notes, “Each arrow in this image is time and money and each person is an error trap.”

The seven steps of the paper-based calibration process

 

Does any of this sound and look familiar to you? Keep reading to find out how Steve got out of this heavily manual and error-prone process.

Clearly, something needed to change. Steve and the management team at Lonza outlined their main priorities: Quality, Consistency, Efficiency, and Cost.

Quality

As a pharmaceutical manufacturer, quality is their first priority. With the existing methods, only about 20% of calibrations received a technical review as a quality component; everything else got a GMP (Good Manufacturing Practice) review at an administrative level. They wanted to leverage a CMS so that a 100% review could be done as the work was completed, as opposed to days or weeks later when the manual review was performed. They wanted to be able to reduce human errors before they happened.

Consistency

With a paper system, it’s easy to make mistakes such as calculation errors.

Efficiency

If we look at the image above outlining the paper-based steps, automation doesn’t remove the arrows, but it makes them easier, more streamlined, and ultimately takes them out of the hands of the technicians. This allows technicians more time to focus on their roles.

As Steve states, “If you’re running a NASCAR race and the technician must get out of the car, change the tires, clean the windshield and add the fuel, you’re going to come in last place every time.” An automated process gives the technician the time and resources to do the work that needs to be done.

Likewise, an automated process means you’re able to take the knowledge a tenured technician has gathered over the years and include it as part of the process. This way, the next person that must do the work, be they a contractor or new technician, has all the information needed to be successful.

Managing by exception* – the CMX process reviews the work order and flags any issues that need review.

*Learn more about this by listening to the roundtable discussion from day one of the Annual Calibration Exchange.

Cost

Clearly, this is a huge issue. Rework means they need to reclean, reschedule, stop production, etc. All of this costs money. By leveraging CMX, they have made rework less necessary and improved their processes.

 


 

2019 Steve – Implementing automation

After outlining all their priorities and deciding to implement Beamex CMX as their CMS of choice, Lonza was ready to go from a paper-based process to a process fully integrated with their SAP computerized maintenance management system (CMMS).

The first benefit of the new system is the new automated review process. Because CMX includes 7 built-in checks, only those items with red flags need to be reviewed. Technicians can also flag work orders for a second review if any questions arise during the calibration process. All other work orders can be closed in as little as one hour.

In 2019, Lonza also moved to the Beamex MC6 as their primary calibrator. The MC6 replaced five or more types of standards. Because the MC6 covers the functions of each of those standards, the technicians can now use the technology to be more efficient.

Prior to automating the process, Lonza had trouble keeping up with items that were approaching their calibration due dates. By leveraging the integration between CMX and SAP, the calibration team at Lonza was able to make huge improvements in scheduling and tracking. Utilizing SAP also allows them to manage external vendor orders. Now, internal and external items can be efficiently tracked.


Let’s look at how the SAP to CMX integration works:

The SAP to CMX integration

 

In short, SAP cares about the when, where, and who of the process; CMX cares about the how. CMX provides preselected calibration standards with standard warnings, test point warnings, passed/failed/adjust warnings, and more. CMX can also perform calculations and conduct an automated review.
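To make that division of responsibility concrete, here is a minimal, hypothetical Python sketch of how the data could be split: the work order carries the when, where, and who, and the procedure carries the how. All class names and fields are illustrative assumptions, not the actual SAP or CMX data models.

```python
# Hypothetical sketch only - these are NOT the real SAP or CMX data models.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class WorkOrder:
    """The 'when, where, and who' - the kind of data a CMMS such as SAP owns."""
    order_id: str
    instrument_tag: str
    due_date: str        # when
    location: str        # where
    assigned_to: str     # who

@dataclass
class CalibrationProcedure:
    """The 'how' - the kind of data the calibration software owns."""
    instrument_tag: str
    reference_standards: list[str]
    test_points_pct: list[int] = field(default_factory=lambda: [0, 25, 50, 75, 100])
    error_limit_pct_of_span: float = 0.5

def pair_order_with_procedure(order: WorkOrder,
                              procedures: dict[str, CalibrationProcedure]) -> tuple:
    """Match an incoming work order to the procedure that tells the technician how to calibrate."""
    return order, procedures[order.instrument_tag]

procedures = {"TT-101": CalibrationProcedure("TT-101", ["Beamex MC6"])}
order = WorkOrder("WO-0001", "TT-101", "2021-06-30", "Building 2", "J. Smith")
print(pair_order_with_procedure(order, procedures))
```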

 

2021 Steve – Continuous improvement

With the integrated process fully implemented, Steve can now focus on ways to continue to make life easier for the technicians. They have added pre-work instructions for each work order. This allows them to take the knowledge from the seasoned technician and load that information into CMX. Now, the location, special tools/fittings information, post-work instructions, etc. are readily available.

Checklist/inspection functions were also added. These combine simple instructions with critical tasks that may not be calibration-related but are essential to the function of the equipment. This reinforces procedure, is paperless, provides another level of quality, and reduces time spent on deviations.

From a technician’s perspective, life is pretty straightforward now. You check out the calibration, execute the calibration (following along with all the defined steps), then check in and close. Technicians can provide feedback through the second approval process or by adding notes in CMX. This information is brought to management for review and can be triaged based on how critical the notes are.

The three-step process: check out, execute, check in and close

Final Words

Steve summarizes his journey by saying, "When you implement a major change – and take people out of their comfort zone – it can turn lives upside down. Management needs to support their technicians and their team through these changes. If they can do this successfully, the technician’s job will be easier in the long run and supervisors will have the ability to manage by exception, focus on continuous improvement, and work with a happier and more productive team."

 

To learn more about how Steve has moved to managing by exception with CMX, take a look at the round table discussion from ACE 2021.

 

Check out Steve Jerge's video presentation, plus other insights from industry experts, on the Beamex 2021 Annual Calibration Exchange Video Library

Watch Now


Want more information on CMMS and Calibration System Integrations?

Check out these blog posts:

Manual Data Entry Errors

Bridging the Gap

Topics: Beamex MC6, Calibration process, Calibration software, CMX, Data Integrity, Calibration management

Automating the calibration management ecosystem

Posted by Heikki Laurila on Dec 01, 2021


It’s time to say goodbye to error-prone paper-based calibration!

While the pen might be mightier than the sword, in process industries pen-and-paper based calibration systems are a weak spot that is desperately in need of attention. Automating your calibration ecosystem saves time, reduces the risk of errors, and enables your organization to use calibration data for business benefits.

But what are the steps involved in implementing an automated process, and what support is on offer to help you succeed? Read on to find out.

In process industries, despite the fact that calibration is a business-critical process, engineers and technicians are still drowning in time-consuming, unreliable, and error-prone paper trails.

Automating and streamlining the calibration process is key to improving efficiency and avoiding errors, but for many going digital can feel like a daunting step.

 

What is the calibration ecosystem?

The calibration ecosystem is made up of everything that’s connected to the calibration process: calibration data, calibration management software, services, and expertise.

Expertise on the part of the calibration provider ensures that the system being delivered is compliant and meets the client’s unique requirements, that the roll out of that system goes smoothly, and that the personnel who will use the system are properly trained to do so.

In terms of hardware, it’s no surprise that calibrators are on the front line. In a modern, automated system this means documenting multifunction units that can provide a traceable reference, calculate measurement error, and even perform calibrations automatically and then document and store the results ready for uploading to the calibration software.

The software then handles tasks like calibration planning and scheduling, analysis and optimization of calibration frequency, and reporting – and can be integrated with maintenance management systems.

 

So, why do I need to automate my calibration infrastructure?

Automation can help process industry operators to thrive with the help of streamlined, accurate, and efficient calibration processes that ensure compliance and ultimately improve business performance. The headline benefits include:

  • Better planning and decision making
  • Ensured compliance
  • Time and cost savings
  • Full traceability with everything documented in one system
  • Improved analysis capabilities
  • More efficient and effective maintenance processes
  • Robust quality assurance

 

OK, I’m sold. What’s next, and how can Beamex help me?

Whether you’re taking an instrument-based or a process-based approach to calibration, the first step is to classify all your devices as critical or non-critical, decide on their calibration intervals, and choose the right tools and methods for calibration.

After that comes staff training – for maintenance technicians, service engineers, process and quality engineers, and managers – to ensure you can get the best possible return on your investment. Finally, there’s execution and analysis, where staff carry out calibrations according to a carefully defined set of instructions and safety procedures and the results are analyzed to identify where there’s room for improvement.
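As a rough illustration of the planning step, here is a small Python sketch that flags which instruments fall due within a planning horizon, with critical devices listed first. The instrument records, field names, intervals, and dates are purely illustrative assumptions.

```python
# Illustrative sketch - instrument records, field names, and intervals are made up.
from datetime import date, timedelta

instruments = [
    {"tag": "PT-101", "critical": True,  "interval_days": 180, "last_cal": date(2021, 6, 1)},
    {"tag": "TT-204", "critical": False, "interval_days": 365, "last_cal": date(2021, 1, 15)},
]

def due_for_calibration(instruments, horizon_days=30, today=None):
    """Return (tag, next due date, critical) for instruments due within the planning horizon."""
    today = today or date.today()
    due = []
    for inst in instruments:
        next_cal = inst["last_cal"] + timedelta(days=inst["interval_days"])
        if next_cal <= today + timedelta(days=horizon_days):
            due.append((inst["tag"], next_cal, inst["critical"]))
    # Critical devices first, then earliest due date
    return sorted(due, key=lambda item: (not item[2], item[1]))

print(due_for_calibration(instruments, today=date(2021, 12, 1)))
```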

Beamex can act as a trusted advisor throughout the entire process of creating an automated calibration ecosystem, helping you to evaluate your current processes, identify areas for improvement, and ensure a smooth transition to a new and optimized process.

Beyond expertise, our solution offering covers on-premises calibration software, cloud-based calibration software, and a mobile application for paperless calibration, documentation, and inspection in the field.

In addition, we offer a range of calibration tools and services.

 

To find out more about what’s involved in automating your calibration management ecosystem and how we can help, download our white paper.

Automating the calibration management ecosystem - Beamex blog

 

Other related content

Download your copy of the Calibration Essentials Software eBook to learn more about calibration management and software

Calibration Essentials- Software eBook

 

Topics: Calibration software

Improving efficiency and ensuring compliance in the pharmaceutical industry

Posted by Monica Kruger on Oct 05, 2021


Today, it seems like everyone is talking about digitalization – and for good reason. Done properly, digitalization can unlock a whole host of benefits for companies including greater efficiency, cost savings, and the ability to do data analysis.

But in order to harness these rewards, digitalization needs to be carried out in a smart way – especially in the pharmaceutical industry where compliance and patient safety are the key drivers.

Digitalizing calibration 

One area ripe for digitalization is calibration. Calibration is still largely paper based in the pharma industry, which means there is room for human error across the many steps required.

Process instrument calibration is just one of many maintenance-related activities in a manufacturing plant, and it doesn’t make sense for companies to use their limited resources and time performing unnecessary calibrations or following time-consuming, ineffective calibration procedures.

The use of paper for calibration also means that a huge potential resource – data from calibrations – is being wasted as it’s sitting in binders in a storage room rather than being easily available for analysis. 

How automated calibration works

An integrated calibration solution is a smart way to digitalize calibrations.

Such a solution combines the actual calibrators, centralized calibration software, and industry knowledge to deliver an automated and paperless flow of calibration data.

This means moving away from resource-intensive manual data entry towards an automated system where everything is validated automatically by a single technician using a multifunctional device – in real time and with no room for human error.

 

The benefits

The benefits of digitalizing and automating calibration are numerous and include:

  • Ensuring patient safety and compliance by ensuring that instruments are operating within tolerances
  • Each calibration takes less time, improving operational efficiency
  • Smart calibrators can provide guidance to technicians to decrease errors during calibrations
  • Management can make more informed decisions based on current data
  • The integrity of calibration data is kept safe in a tamper-proof central repository
  • Data can be found quickly and easily for audit purposes

 

How to ensure successful digitalization

In order to make sure digitalization serves a useful purpose and fulfills its potential, several things are needed.

Firstly, you need the proper expertise to ensure that systems comply with the Food and Drug Administration’s Good Manufacturing Practice (GMP) and other regulatory requirements. The 21 CFR Part 11 regulation, which governs how calibration certificates are documented and signed electronically, must also be followed in order to create a compliant process.

Secondly, the actual calibration solution software and hardware need to be designed in a way that minimizes or removes the need for human input. This reduces the chance of error and removes the need for the “four eyes” principle – where a second set of eyes is needed to confirm calibration data is recorded correctly.

Finally, software tools need to be available to quickly access data, for example for audit purposes, as well as to carry out trend or other analysis on calibration data. This data can also be used to predict when a device is drifting out of tolerance to optimize maintenance, or for comparing performance between factories to optimize efficiency. 
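As an illustration of the kind of trend analysis this enables, the following Python sketch fits a simple linear trend to the maximum error found in past calibrations and estimates when a device might drift past its tolerance. It is a simplified example with made-up figures, not a description of any particular software's algorithm.

```python
# Simplified illustration with made-up figures - not any vendor's actual algorithm.
from datetime import date, timedelta

def predict_out_of_tolerance(history, tolerance):
    """
    history: list of (calibration_date, max_error_found) pairs.
    Fits a least-squares line to the error and estimates when it crosses the tolerance.
    Returns the estimated date, or None if the error is not trending upward.
    """
    if len(history) < 2:
        return None
    t0 = history[0][0]
    t = [(d - t0).days for d, _ in history]
    e = [err for _, err in history]
    n = len(t)
    t_mean, e_mean = sum(t) / n, sum(e) / n
    slope = sum((ti - t_mean) * (ei - e_mean) for ti, ei in zip(t, e)) / \
            sum((ti - t_mean) ** 2 for ti in t)
    intercept = e_mean - slope * t_mean
    if slope <= 0:
        return None
    return t0 + timedelta(days=(tolerance - intercept) / slope)

history = [(date(2019, 1, 10), 0.05), (date(2020, 1, 8), 0.11), (date(2021, 1, 12), 0.16)]
print(predict_out_of_tolerance(history, tolerance=0.25))  # estimated crossing date for these numbers
```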

 

To find out more about what digitalizing calibration means in practice, read our white paper.



Related articles

Related blog posts



Download your copy of the Calibration Essentials Software eBook to learn more about calibration management and software.

Calibration Essentials- Software eBook

 

Topics: Process automation, Calibration in pharmaceutical industry

Pressure Calibration [eBook]

Posted by Heikki Laurila on Jun 22, 2021

Beamex pressure ebook cover image

 

In this blog post, we want to share a free educational eBook on Pressure Calibration and other pressure-related topics.

The eBook is titled "Calibration Essential: Pressure" and we have developed it together with Automation.com, a media brand of the International Society of Automation (ISA). 

Some of these articles have been previously posted in the Beamex blog, but now we have collected several pressure-related articles into one handy eBook.

Just give me the free Pressure eBook pdf now! >>

 

Pressure Calibration eBook - contents

The eBook starts with a few general educational pressure-related articles and includes several technical “How to calibrate” articles.

The eBook has 40 pages and contains the seven (7) following articles:

Pressure Calibration Basics: Pressure Types (Page 5)

    • Different pressure types or modes are available, including gauge, absolute, and differential.

What is Barometric Pressure? (Page 8)

    • This article takes an in-depth look at barometric, or atmospheric, pressure.

Pressure Units and Pressure Unit Conversion (Page 13)

    • It is important to understand the basics of different pressure units and pressure unit families to avoid potentially dangerous misunderstandings.

Calibrating a Square Rooting Pressure Transmitter (Page 18)

    • There are many questions about the calibration of a square rooting pressure transmitter, with the frequent concern that the calibration fails too easily at the zero point.

Pressure Transmitter Accuracy Specifications: The Small Print (Page 21)

    • Pressure transmitters’ accuracy specifications have many different components that go beyond the specification listed in the advertising brochure, which might tell only part of the truth.

How to Calibrate Pressure Gauges: 20 Things You Should Consider (Page 27)

    • Pressure gauges need to be calibrated at regular intervals to ensure they are accurate.

Pressure Switch Calibration (Page 35)

    • Pressure switches are more difficult to calibrate than transmitters, but proper calibration is important for accuracy and reliability.

 

Download the free Pressure eBook here!

 

More Beamex eBooks

Here are links to some of our other popular calibration-related eBooks:

You can find all our eBooks and White Papers here: White Papers and eBooks.

Here's a webinar you may enjoy:  Differential pressure flowmeter calibration - Best practices in the field.  Watch now!

Find a better way for your pressure calibrations!

If you want to find a better way for your pressure calibrations, please get in touch with our pressure calibration specialists:

Contact Beamex pressure calibration experts! 

If you work with pressure calibration, please check out the latest Beamex pressure calibrators:

Beamex Pressure Calibrators

 

 

Topics: Pressure calibration

The Evolution of Calibration Documentation

Posted by Tiffany Rankin on May 20, 2021


 

Our modern history is defined by the advent of writing. Writing is humankind’s principal technology for collecting, manipulating, storing, retrieving, communicating, and disseminating information. Before we learned to write, we lived in an era referred to as pre-history, or prehistoric times. As humans evolved, began cultivating land, and started living a less nomadic existence, the documentation of events became more sophisticated. Cave drawings gave way to hieroglyphics; stone tablets evolved into scrolls and then into bound books; the invention of typeset documents gave more and more people access to the written word. Today, we can send emails, text messages, and a variety of other digital communication around the world in a matter of seconds. Humans have evolved and documentation has evolved, and with it the way in which we manage calibration.

In the beginning, there was no way to document calibration findings other than with a pen and paper. This information was brought back from the field, entered into a form, and filed away. Just as in the Library of Alexandria (one of the largest and most significant libraries in the ancient world) with its thousands of papyrus scrolls, managing hundreds or even thousands of paper calibration documents comes with the inherent risk of misplaced, lost, or damaged documents – in the case of the Alexandria library, caused by a fire allegedly started by Julius Caesar. Additionally, a paper and pen system is labor-intensive, time-consuming, prone to errors, and provides little to no opportunity to analyze historical trends.

 

Download a pdf version of this article!

 

Digital systems enter the scene

Databases

As we progress through time, more digitalized systems of calibration management have emerged including the use of spreadsheets and databases. While certainly a step in the right direction, this method of documentation still has its drawbacks. Similar to the pen and paper method, this form of recording calibration data is still time-consuming and error-prone. It also lacks automation in that reminders and tasks cannot be set up on instruments that are due for calibration.

Read the blog post: Manual Data Entry Errors (March 2021)

Software systems

The use of software to manage calibration reports was the next giant leap. The calibration module within some maintenance management software allows instrument data to be stored and managed efficiently in a plant’s database. But again, this method falls short due to lack of automation, limited functionality, and often non-compliance with regulatory requirements (for example, FDA or EPA requirements) for managing calibration records.

 

Dedicated calibration solutions

Advances in technology seem to come faster and faster. Today, dedicated calibration software is the most advanced solution available to support and guide calibration management activities. With calibration software, users are provided with an easy-to-use Windows Explorer-like interface. The software manages and stores all instrument and calibration data. This includes the planning and scheduling of calibration work; analysis and optimization of calibration frequency; production of reports, certificates, and labels; communication with smart calibrators; and easy integration with maintenance management systems such as SAP and Maximo. The result is a streamlined, automated calibration process that improves quality, plant productivity, safety, and efficiency.

In order to understand how this type of software can help better manage process plant instrument calibrations, it is important to consider the typical calibration management tasks that companies undertake. There are five main areas here: planning and decision-making, organization, execution, documentation, and analysis.

 

Planning and decision-making

Instruments and measurement devices should be listed and classified into ‘critical’ and ‘non-critical’ devices, with calibration ranges and required tolerances identified for each individual device. The calibration interval, the creation and approval of standard operating procedures (SOPs), and the selection of suitable calibration methods and tools should also be defined. Finally, the current calibration status of every instrument should be identified.

Organization

Organization involves training the company’s calibration staff in using the chosen tools and how to follow the approved SOPs. Resources should be made available and assigned to carry out the scheduled calibration tasks.

Execution

The execution stage involves staff carrying out assigned calibration activities and following the appropriate instructions before calibrating a device, including any associated safety procedures.

Documentation

Unlike many of the more archaic methods, calibration software generates reports automatically, and all calibration data is stored in one database rather than multiple disparate systems. Calibration certificates, reports, and labels can all be printed out on paper or sent in electronic format.

The documentation and storage of calibration results typically involve electronically signing or approving all calibration records generated.

Analysis

Improvements in documentation lead to improvements in analysis. Using specialized calibration management software enables faster, easier, and more accurate analysis of calibration records and identification of historical trends. Also, when a plant is being audited, calibration software can facilitate both the preparation process and the audit itself. Locating records and verifying that the system works is effortless when compared to traditional calibration record keeping. Regulatory organizations and standards such as FDA and EPA place demanding requirements on the recording of calibration data. Calibration software has many functions that help in meeting these requirements, such as change management, audit trail, and electronic signature functions.

Based on the results, analysis should be performed to determine if any corrective action needs to be taken. The effectiveness of calibration needs to be reviewed and calibration intervals checked. These intervals may need to be adjusted based on archived calibration history. If, for example, a sensor drifts out of its specification range, the consequences could be disastrous for the plant, resulting in problems such as costly production downtime, safety issues, or batches of inferior quality goods being produced which may then have to be scrapped.

 

Just as advancements in tools and the proliferation of the written word have helped shape the evolution of humans, advancements in calibration documentation shape the efficiency and productivity of plants using these technologies. By replacing manual procedures with automated, validated processes, efficiencies should improve. Reducing labor-intensive calibration activities will lessen costly production downtime, while the ability to analyze calibration results will optimize calibration intervals, saving time and increasing productivity.

Every type of process plant, regardless of industry sector, can benefit from using calibration management software. Compared to traditional, paper-based systems, in-house legacy calibration systems, or calibration modules of maintenance management systems, using dedicated calibration management software results in improved quality and increased productivity, and reduces the cost of the entire calibration process.

 

Calibration software also gives users access to data and historical trends, and these insights help plant personnel to make better decisions. For example, when a piece of equipment needs to be upgraded it can be difficult to get approval based on speculation. Being able to show data of the inconsistencies and malfunctions makes the approval process much easier. In addition, as the volume of work for calibration technicians increases, having insights into the process can facilitate a more streamlined and efficient work schedule. This will in turn improve reliability, make it easier for technicians to manage their workflow, and contribute to a safer and more well-organized process.

 

As we become a more advanced society our need to share information progresses, as do our methods of collecting, manipulating, storing, retrieving, communicating, and disseminating information. While simply writing calibration data down with a pen and paper is still an effective way of collecting information, it lacks efficiency and hinders the ability of people further down the line to retrieve and process the information. While databases and maintenance management software are certainly steps in the right direction, they still miss the mark when it comes to disseminating data in a useful and streamlined way. Implementing calibration software makes it easier to collect, store, analyze, retrieve, and share information. Until the next technological leap forward, calibration software remains the most advanced solution available to support and guide calibration management activities.

 

Evolution of Beamex calibration software in brief

Here's a brief list of Beamex's main software products.

Beamex historical CALDB calibration software

Beamex PDOC (1985)

The very first calibration software Beamex released was the PDOC software back in 1985. It was stored on a small cassette and used with a portable Epson computer.

The software automated the documentation of pressure calibration by communicating with a bench-mounted pressure calibrator and printed the calibration certificate on narrow paper using the thermal printer integrated in the Epson computer.

Later, a corresponding TDOC program was released for documenting temperature calibrations.

 

CALDB1 / CALDB3 (Late 80's)

CALDB – Calibration Database – was a DOS-based calibration database software and our first software for personal computers.

Later, an add-on called HISDB was introduced for reviewing the history of calibration results.

 

Beamex QM6 Quality Manager - Calibration Management Software (1996)

The Beamex QM6 was our first calibration management software that ran on the Windows operating system. It had a database for instruments, references, and calibration results, and it communicated with documenting calibrators, so you could send a calibration procedure (work order) to a documenting calibrator and receive the results back in QM6 after the calibration was completed.

 

Beamex QD3 Quality Documenter (1996)

QD3 was software for documenting calibration results. It did not have the same functionality as QM6 but was a simpler version. It could, however, still communicate with documenting calibrators.

 

Beamex CMX Calibration Management Software (2003)

The very first version of the Beamex CMX calibration management software was launched in 2003. The first versions were fairly limited in functionality compared to what CMX is today.

Over the years, the CMX technology and functionality have been developed continuously, and CMX is still very much under active development. Today, CMX includes a huge amount of functionality, including seamless integration with many maintenance management systems, and suits smaller customers as well as large enterprise installations.

Much of this functionality has been developed together with leading pharmaceutical customers to meet the requirements of the regulated pharmaceutical industry.

 

Beamex bMobile calibration application (2016)

Beamex bMobile is a calibration application that can be installed on Android or Windows mobile devices. bMobile can be used to document calibration results with a mobile device.

bMobile communicates with the Beamex CMX and LOGiCAL calibration software, so calibration work can be sent to bMobile and the results received back in the software.

 

Beamex LOGiCAL 1.x (2018)

The first version of the LOGiCAL cloud-based calibration software was a simple documenting software that could read calibration results from a documenting calibrator and convert the results into a pdf calibration certificate.

LOGiCAL 1.x has since been replaced with LOGiCAL 2.x.

 

Beamex LOGiCAL 2.x (2020)

The current LOGiCAL 2.x is a subscription-based, cloud-hosted calibration software as a service. It has a database to store instruments, references, and calibration results. It can synchronize procedures to Beamex documenting calibrators and Beamex bMobile, and also synchronize calibration results from the mobile devices back to LOGiCAL.

Beamex LOGiCAL calibration software

 

Keep up with Beamex advancements by subscribing to Product News.

Subscribe today

Ready to get your calibration process out of the Stone Age? Contact Beamex today.

Contact Us

 

Download your copy of the Calibration Essentials Software eBook, to learn more about calibration management and software.

Calibration Essentials- Software eBook

 

 

Topics: Process automation, Calibration software, CMX, Data Integrity, Digitalisation

Manual Data Entry Errors

Posted by Heikki Laurila on Mar 25, 2021


 

Many businesses still use a lot of manual entry in their industrial processes.

This is despite it being commonly known and accepted that manual data entry is a slow, labor-intensive process and that it always involves human errors – human errors are natural.

It is commonly accepted that the typical error rate in manual data entry is about 1 %.

What does this 1 % mean in practice in calibration processes, and how can you make it smaller, or even get rid of it?

This article mainly focuses on industrial calibration processes and the manual data entry related to these processes.

 

Table of contents

  • Common manual data entry steps in calibration processes
  • What about the 1 % typical error rate?
  • Calibration processes
  • Significant or insignificant error?
  • Unintentional or intentional error?
  • Would this error rate be accepted in other situations?
  • There has to be a better way!
  • There is a better way – the Beamex way!

 

Download a pdf version of this article!

Common manual data entry steps in calibration processes

To start with, let’s take a look at the common ways in which data is handled in industrial calibration processes:

1. Pen & paper

It is still very common that calibration data is captured in the field by writing it on a paper form during the calibration process. Later on, back in the workshop, the calibration data from the paper is manually typed into a computerized system, in some cases by another person.

So with this very common process the calibration data is entered manually twice: first with pen and paper and later when it is typed into the system.

 

2. Manual entry into a calibration system

Another common way is to document the calibration data by typing it into a computer system, using spreadsheet software like Microsoft Excel or dedicated calibration software. If you want to type straight into a software program you need to carry a laptop in the field and you need to be connected to a network, which is not always possible in industrial environments.

If it is not possible to enter the data straight into the calibration application using a computer, it may in some cases be entered on a mobile device with a relevant application and then later electronically transferred into the calibration software.

In this process the data is still entered manually, although only once, not twice like in the previous process.

 

3. Electronic storing of data

The most modern way is to use calibration equipment that can store the calibration data in its memory fully electronically. The calibration data can then be transferred from the calibrator’s memory into the calibration software, again fully electronically.

This kind of process does not include any manual data entry steps. This eliminates all the human error and is also faster as it does not consume the engineer’s time.

This process works only for calibrations where the calibration equipment can measure (or generate/simulate) instrument input and output. If there are any gauges, indicators, displays, or similar that need to be read visually, some form of manual data entry is needed.

But even if some of the calibration data is manually entered into the calibrator, the calibrator may have a feature to check that the data is within accepted values and may also have an informative graphical indication of the data quality for easy verification.

The calibration data is then sent electronically from the calibrator to the calibration system.

 

Manual data entry versus documenting calibrator

In the picture above, the left side shows an example where the calibration data has been entered manually on a paper form. Possibly some numbers have been entered incorrectly, some are difficult to read, manual error calculation is laborious, is that tick a pass or a fail, who signed it, and so on. 

On the right side you can see the same calibration with a Beamex MC6 documenting calibrator. All calibration data is stored automatically and electronically in the calibrator's memory, errors are calculated automatically, the pass/fail decision is made automatically, and the results are sent electronically to the calibration software for storage and certificate printing.  

Which one delivers more reliable calibration data?

(Well, that was not really a question – it is the MC6 calibrator, of course.)

 

What about the 1 % typical error rate?

It is obvious that there are errors in manual data entry. It seems to be a commonly accepted rule that in manual data entry, human errors will cause a 1 % average error rate.

This error rate is based on research published in several articles, but I must admit that I don’t know the scientific background for it. We can argue about what the real error rate is, but we can all agree that there are always errors in manual data entry.

After reading about this 1 % error rate in a few places, it got me thinking about what this means for calibration processes. So, let’s stick with that 1 % average error rate in the following considerations.

The error rate can grow quickly if the data to be entered is complicated, if the user is tired or in a hurry, and for many other reasons. For example, some people may have “personal” handwriting (I know I do), which is difficult for others to read.

To reduce errors, companies can train employees, highlight accuracy over speed, double-check the work, ensure optimal working conditions, and naturally try to automate their processes and get rid of manual data entry.

 

Calibration processes

Calibration data includes a lot of numbers, often with many decimals. The numbers also typically fluctuate up and down with the decimals changing all the time. Very rarely is calibration data an easy to enter “even” number (20 mA is more likely to be 20.012 mA). This makes it challenging to manually enter the data correctly.

When calibrating a process instrument, for example a transmitter, the input and output data should be captured at the same time, which is difficult. If the values are drifting, additional error will be introduced if the numbers are not recorded at the same time.

In a process instrument calibration, there are typically five calibration points (25 % steps with 0 %, 25 %, 50 %, 75 % and 100 % points), and both input and output are to be recorded. This already makes 10 calibration data points. Other data also needs to be entered during the calibration, such as the reference standards used, environmental data, date, time, signature, etc.
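As a small worked example, the nominal test points for a linear transmitter can be generated as below. The 0–10 bar input range and 4–20 mA output are example values only; in the field the recorded input and output will of course deviate slightly from these nominal figures.

```python
# Example values only: a 0-10 bar pressure transmitter with a 4-20 mA output.
def calibration_points(input_range, output_range=(4.0, 20.0), steps_pct=(0, 25, 50, 75, 100)):
    """Nominal input values and the ideal (linear) output for each test point."""
    in_lo, in_hi = input_range
    out_lo, out_hi = output_range
    return [(pct,
             in_lo + (in_hi - in_lo) * pct / 100,
             out_lo + (out_hi - out_lo) * pct / 100) for pct in steps_pct]

for pct, bar, ma in calibration_points((0.0, 10.0)):
    print(f"{pct:>3} %  input {bar:5.2f} bar  ideal output {ma:6.3f} mA")
```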

On average we can say that 20 data points need to be entered during the calibration process. With a 1 % error rate, this means that every fifth calibration will include faulty data.

Every fifth calibration? Why is that? Because if one calibration includes 20 data points then five calibrations include 100 data points. A 1 % error rate means that data is entered incorrectly once in every 100 data points entered. So, every fifth calibration will include a faulty data entry. Every fifth calibration means that 20 % of the calibrations performed will be faulty, each including one faulty data point on average.

The above is true if the data is entered manually only once. But as discussed earlier, often the data is entered manually twice, first on paper in the field and then when it is transferred from the paper to the system in the workshop. This means that there are double the number of data entry points, with one calibration event having 40 data points instead of 20 to be entered. This means that statistically, 40 % of the calibrations made will include a faulty data entry!

Wow, so the modest-sounding 1 % error rate in manual data entry means that often 40 % of calibrations will include faulty data in practice.

To repeat: The 1 % error rate just turned into 40 %!

So, this means almost half of these calibrations will include faulty data. Well, not quite half, but 40 %; I exaggerated a little there, you got me, but it is pretty close to half.

If you do manual calibration data entry using the two-phase system, about 40 % of your calibration records will most likely have errors. Let that sink in for a while.

... a short pause for sinking... :-)

In a typical process site that performs 10,000 calibrations annually, all manually entered using the two-phase data entry process, statistically they will have 4,000 calibrations with faulty data!

Wow, that escalated quickly!

Naturally, the calibration process may be way more complicated and may contain many more data points.

If a calibration process for an instrument includes 100 data points and the results are manually recorded, a 1 % error rate means that statistically every calibration includes one faulty data entry! So statistically, 100 % of the calibrations include a faulty data point!
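The arithmetic above can be reproduced in a few lines of Python. The figures in this article use the expected number of errors per record (20 points × 1 % means roughly every fifth record); a stricter "at least one error per record" calculation gives somewhat lower, but still alarming, shares:

```python
def expected_errors_per_record(points, error_rate=0.01):
    """The article's back-of-the-envelope figure: errors expected per calibration record."""
    return points * error_rate

def share_with_at_least_one_error(points, error_rate=0.01):
    """Stricter view: probability that a record contains at least one faulty entry."""
    return 1 - (1 - error_rate) ** points

for points, label in [(20, "single manual entry"),
                      (40, "two-phase manual entry"),
                      (100, "complex calibration")]:
    print(f"{label:22s} ~{expected_errors_per_record(points):.2f} errors/record, "
          f"{share_with_at_least_one_error(points):.0%} of records affected")
```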

 

Significant or insignificant error?

The significance of error varies according to the situation.

If the manually entered calibration data is wildly inaccurate it is likely going to be noticed at some point. For example, if the nominal 4 mA zero point of a transmitter is entered as 40.02 mA (wrong decimal point) that will most likely be noticed at some point, at the latest when the data is entered into the calibration system, assuming the system gives a warning when the error is too big.

But what to do then? Do you consider it OK to move the decimal point and assume the value is then correct, or does the calibration need to be repeated – which means going back to the field and doing the calibration again?

If the error is small enough, it may not be noticed anywhere in the process. Using the previous example, if the transmitter’s zero point is erroneously recorded as 4.02 mA when it was actually 4.20 mA, that error may not be noticed at all. Even if the transmitter’s current of 4.20 mA would be out of tolerance, which should be noticed and corrective actions taken, it will not be noticed because the erroneously entered 4.02 mA is a good enough reading and the calibration will pass without any further action. This leaves the transmitter in the process continuously measuring with a too-large error.

So, in the worst-case scenario, human error in manual data entry will lead to a situation where a calibration that should have failed is recorded as a pass!
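A short Python sketch shows how such a transposition slips through a simple tolerance check. The 0.5 % of span tolerance used here is an assumed example value:

```python
# Assumed example: zero point of a 4-20 mA transmitter with a 0.5 % of span tolerance.
def within_tolerance(measured_ma, nominal_ma=4.0, span_ma=16.0, tol_pct_of_span=0.5):
    """Pass/fail check: is the error within the tolerance given as a percentage of span?"""
    error_pct = abs(measured_ma - nominal_ma) / span_ma * 100
    return error_pct <= tol_pct_of_span

true_reading = 4.20    # what the transmitter actually output
typed_reading = 4.02   # transposed digits written on the paper form

print("true reading passes: ", within_tolerance(true_reading))    # False - should be flagged
print("typed reading passes:", within_tolerance(typed_reading))   # True - the error goes unnoticed
```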

 

Unintentional or intentional error?

Most human errors in manual data entry are naturally unintentional.

It is, however, not totally impossible that calibration data is sometimes intentionally entered incorrectly. Manual data entry gives the opportunity to falsify results, and it is almost impossible to prevent that.

If the results are on the limits of being a pass or fail, it is possible that in some cases the data is entered so that it is a pass. Maybe a fail result would cause a lot of extra work, and maybe it is already late in the afternoon and time to go home.

If you see for example a pressure transmitter calibration certificate with a pressure reading of 10.000 psi (or bar) and a current reading of 20.000 mA, it is probably too good to be true.

I apologize for bringing up this kind of possibility, but this kind of information may be found in some publicly available audit reports. This is also something the US FDA (Food and Drug Administration) pays attention to when auditing the pharmaceutical industry.

But let’s assume that the errors are unintentional human errors.

Manual data entry is still being used in surprisingly many calibration processes, even in highly regulated industries such as the pharmaceutical and food and beverage industries, nuclear power, and many others.

When entering data manually on a paper form, the paper form will not automatically alert the user if the entered data is outside of accepted tolerances – it is up to the user to notice it. The calibration system often has an alarm if the entered data is outside of accepted tolerances, but at that point the calibration has already been performed and needs to be redone.

 

Would this error rate be accepted in other situations?

If we use manual data entry in our calibration processes and accept the risk of error that comes with it, would we accept the same error rate in other applications?

Would we accept that our salaries don’t always come on time or are wrong? Or that our credit card repayments have a big error rate?

Obviously, these applications rely on electronic not manual data entry.

In most applications we would simply not accept the kind of error rate that comes with manual data entry. But like I said, many people still accept it in their calibration data entry process.

This article has about 15,000 characters, so with manual writing there would be about 150 errors (with a 1 % error rate). Well, frankly, with me writing, there would be a lot more. :-)

But luckily, we can use a computer with spellchecking, and the text is also proofread by colleagues. I am sure there are still some errors, but in this text the errors don’t have serious consequences as they do with calibration data.

At the same time, industry is moving fast towards the world of digitalization, where data is more important than ever and decisions are based on the data. We should also take a good look at the quality and integrity of the data!

 

To download this article as a free pdf file, please click the image below:

New Call-to-action

 

There has to be a better way!

What if you could avoid all human errors related to manual calibration data entry?

What if you could even avoid the intentional errors?

What if, at the same time, you could make the data entry process much faster, saving time?

What, you may ask, would be the cost for such a system? Can you afford it?

In return I would ask what are the costs of all the errors in your calibration data? What would be the value of such a system to you? Can you afford to be without it?

There has to be a better way.

 

There is a better way – the Beamex way!

So, what about the Beamex way? What is it?

With the Beamex integrated calibration solution, you can replace manually entering calibration data with the most highly automated calibration data collection on the market.

In a nutshell, the Beamex system comprises calibration software, documenting calibrators, and mobile data-entry devices communicating seamlessly. The calibration software can also be integrated with your maintenance management system (CMMS) to enable a paperless, automated flow of calibration work orders from the CMMS to the calibration software and acknowledgement of the work done from the calibration software back to the CMMS.

It all starts with planning the work in the CMMS or the calibration software. When it is time to perform the calibration, the work orders are synchronized to documenting calibrators or to mobile devices (phones or tablets).

In the field, when you perform the calibration, the calibration data is stored automatically in the documenting calibrator or entered manually on a mobile device.

If you work in a highly regulated environment, mobile devices can be provided with additional data security functions to ensure the integrity of the data. The Beamex calibration solution fulfills the requirements of 21 CFR Part 11 and other relevant regulations for electronic records, electronic signatures, and data integrity.

This lowers the risk of ALCOA (data integrity) violations by identifying those using offline mobile devices by their electronic signature and by protecting the offline data against tampering, eliminating the possibility to falsify calibration records.

From the mobile devices, the calibration data can be synchronized back to the calibration software for storage, analysis, and certificate generation.

The calibration software can also send an automatic notification to the CMMS when the work is done.

 

Here's a short video on how the Beamex integrated calibration system works:

 

Learn more about Beamex products and services on our website or contact your local Beamex representative:

Visit Beamex website

Contact Us

Beamex Worldwide Contacts

 

Download your copy of the Calibration Essentials Software eBook, to learn more about calibration management and software.

Calibration Essentials- Software eBook

 

 

Related blogs

If you found this article interesting, you might also like these articles:

 

 

Topics: Calibration process, Calibration management

How to choose a calibration laboratory - 13 things to consider

Posted by Heikki Laurila on Feb 23, 2021

Beamex Temperature Calibration Laboratory

 

So you have invested in some new, accurate calibration equipment. Great!

But as with many other things in life, accuracy fades over time.

To ensure that your calibration equipment serves you well and stays accurate throughout its lifetime, it needs to be recalibrated periodically. It also needs to be serviced and adjusted whenever necessary.

When choosing a calibration laboratory or calibration service, you need to select one that is capable of calibrating your accurate equipment with sufficient uncertainty.

We have seen the accuracy of calibrators destroyed in non-competent laboratories. I want to help you avoid that.

What do you need to consider when choosing a calibration laboratory?

In this blog post I will discuss the most important things to consider when choosing a calibration laboratory for your precious calibrator or reference standard.

 

Table of contents

Background

How to choose a calibration laboratory - 13 things to consider

  1. Manufacturer’s laboratory
  2. Laboratory accreditation
  3. Calibration uncertainty
  4. Calibration certificate
  5. Pass/Fail judgment
  6. Adjustment
  7. As Found / As Left calibration
  8. Turnaround time
  9. Brand and reputation
  10. Price
  11. Repairs, service, and maintenance
  12. Warranty
  13. Agreements and reminders

What do we do at the Beamex calibration laboratory?

Beamex Care Plan

Beamex Service Portal

 

Download this article as a free pdf file by clicking the picture below:

New Call-to-action

 

Background

To keep pace with the ever-improving accuracy of process instrumentation, calibration equipment is also getting more and more accurate. This puts more pressure on calibration laboratories, which need to improve their own accuracy to meet these requirements with a sufficient accuracy ratio.

Many modern process calibrators are multifunctional, containing several quantities and multiple ranges. This is great for users as they only need to carry one multifunctional calibrator with them in the field.

But multifunctionality makes recalibration more challenging for the calibration laboratory. Not all laboratories can calibrate multiple quantities and ranges with sufficient accuracy and uncertainty.

Even if you choose an accredited calibration laboratory it will not always offer the required uncertainty for all the ranges.

Something that we sometimes see in our calibration laboratories at the Beamex factory is that customers have bought the most accurate and multifunctional calibrator we offer (for example, the Beamex MC6) and it has been calibrated in a local calibration laboratory. The MC6 calibrator is packed with several accurate pressure, electrical, and temperature ranges, so is not the easiest to recalibrate. In some cases, the laboratories have claimed that the calibrator does not fulfill its accuracy/uncertainty specifications, but when the case is investigated it is commonly found that the laboratory’s uncertainty is worse than the calibrator’s uncertainty!

And even worse, we have also seen local labs adjusting the calibrators with the intention of making them ‘more accurate’. Next time our calibration laboratory calibrates the calibrator, it is discovered that the unit was adjusted incorrectly and it is out of specifications! In some cases the customer has been already using an out-of-spec calibrator for some time, which can have serious consequences.

I therefore wanted to discuss the topic of choosing a suitable calibration laboratory in this article.

 

How to choose a calibration laboratory - 13 things to consider

 

1. Manufacturer’s laboratory

One good way to choose a calibration laboratory is to use the equipment manufacturer’s laboratory, if that is practical. The manufacturer knows all the ins and outs of the equipment and has the capability to calibrate it. The manufacturer can also do any service or maintenance work that may be required. Also, using the manufacturer’s calibration service does not jeopardize the warranty of the equipment; they may even offer an extended warranty.

It is however not always possible or practical to use the manufacturer’s calibration laboratory, so let’s discuss some other considerations.

 

2. Laboratory accreditation

Choosing a calibration laboratory or service that has accreditation is the most important thing to start with, especially if it is not possible to use the manufacturer’s calibration laboratory.

Calibration laboratory accreditation is done by a formal third-party authority to ensure that the laboratory meets all the requirements of the relevant standards. Laboratory accreditation is so much more than “just a piece of paper”.

Formal accreditation guarantees many things that you would otherwise need to check if the laboratory didn’t have accreditation. For example, accreditation ensures, amongst other things, that the laboratory fulfills the requirements of the relevant standards, has a quality system and follows it, has appropriate operating procedures, has a training program and training records for staff, can evaluate calibration uncertainty, and maintains traceability to national standards.

Without accreditation you have to take care of all these things yourself, which is a huge task.

Calibration laboratories are commonly accredited according to the international ISO/IEC 17025 standard.

ILAC is the international organization for accreditation bodies operating in accordance with ISO/IEC 17011 and involved in the accreditation of conformity assessment bodies including calibration laboratories (using ISO/IEC 17025).

It is important to remember that accreditation does not automatically mean that the laboratory has sufficient accuracy and uncertainty to calibrate your calibration equipment!

So, even though accreditation is an important box to tick, it is not enough on its own. The burden is still on you to judge the calibration laboratory’s capabilities.

 

3. Calibration uncertainty

Even when using an accredited calibration laboratory, you need to make sure that the laboratory can calibrate your calibration equipment with sufficient and appropriate uncertainty.

There are many accredited calibration laboratories that do not offer good enough uncertainty to calibrate all the ranges of a modern multifunctional calibrator such as the Beamex MC6 family of calibrators.

If the laboratory is accredited, it will have a public “Scope of Accreditation” document listing all the uncertainties they can offer for different quantities and ranges. That should be evaluated before proceeding further.

If the laboratory is not accredited, you will need to discuss with the laboratory to find out what kind of uncertainty they can offer and if it is sufficient for your needs.

The calibration uncertainty needs to be documented on the calibration certificate. It is then up to you to decide what kind of uncertainty ratio you can accept between the laboratory’s calibration uncertainty and the uncertainty specification of the equipment. The most common uncertainty ratio is 1 to 4, i.e. the laboratory is four times more accurate than the equipment to be calibrated, or the laboratory’s uncertainty is only one quarter of the equipment’s uncertainty. In practice that is often not possible for all ranges, so you may need to accept a smaller uncertainty ratio.

The most important thing is to know the laboratory’s uncertainty, make sure it is better than the equipment’s specifications, and ensure it is documented on the calibration certificate.
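As a quick worked example, the ratio check takes only a couple of lines of Python; the uncertainty figures below are made up for illustration and should be replaced with the device specification and the value from the laboratory's scope of accreditation:

```python
# Illustrative figures only - check the real values against the lab's scope of accreditation.
def uncertainty_ratio(equipment_uncertainty, lab_uncertainty):
    """Test uncertainty ratio: how many times better the lab is than the device under calibration."""
    return equipment_uncertainty / lab_uncertainty

device_spec = 0.020   # device uncertainty specification, % of reading (example)
lab_cmc = 0.004       # lab calibration and measurement capability, % of reading (example)

tur = uncertainty_ratio(device_spec, lab_cmc)
print(f"TUR = {tur:.0f}:1 ->", "meets 4:1" if tur >= 4 else "consider another lab or accept a lower ratio")
```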

More information about calibration uncertainty can be found here:

Calibration uncertainty for dummies

 

4. Calibration certificate

The calibration certificate is the document you get from the calibration, and it should include all the relevant information on the calibration.

Again, if the laboratory is accredited, you don’t need to worry too much about the calibration certificate as an accredited laboratory will follow standards and the calibration certificate content is one of the many audited items included in the laboratory’s periodical accreditation audit.

The basic things on the calibration certificate include:

  • The title: “Calibration Certificate”
  • Identification of the equipment calibrated
  • The calibration laboratory’s contact information
  • Identification of the calibration methods used
  • Calibration data covering all the calibrated points, i.e. the laboratory’s reference standard’s “true value” and the indication of the equipment to be calibrated
  • The found error on each point, i.e. the difference between the reference standard and the calibrated device
  • The total calibration uncertainty (presented in the same unit as that of the measurand or in a term relative to the measurand, e.g. percent) including all the calibration uncertainty components, not only the reference standard, preferably calculated separately for each calibration point
  • Signature of the person(s) that performed the calibration, the calibration date, and details of the environmental conditions during the calibration process

 

5. Pass/Fail judgment

When you send your calibration equipment for calibration, you obviously want to know if the equipment fulfills its accuracy/uncertainty specifications. Although this sounds obvious, I have seen customers who have had their equipment calibrated and the calibration certificate archived without evaluating if the equipment is still as accurate as it is assumed to be.

So please make it a practice to carefully review the calibration certificate before filing it away and taking your calibrator back into use.

The Pass/Fail judgment is not all that common in calibration laboratories, accredited or not.

If the certificate does not include the Pass/Fail judgment, it is then your job to go through all the points on the calibration certificate and to compare the found error against the equipment specifications.

The calibration uncertainty also needs to be taken into account in this comparison – the equipment may appear to be within the specifications, but no longer is once the calibration uncertainty is included.

So, take a careful look at the found error and the total uncertainty for each calibration point.

There are different ways to take the calibration uncertainty into account in the Pass/Fail judgment. The ILAC G8 guideline (Guidelines on Decision Rules and Statements of Conformity) describes how accredited laboratories should take it into account.
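
As a simple illustration only (this is not the ILAC G8 decision rules, just one conservative approach), the sketch below treats a point as a clear Pass only when the found error plus the calibration uncertainty still fits inside the specification, and as a clear Fail only when the error minus the uncertainty is already outside it:

```python
def judge_point(error: float, uncertainty: float, spec_limit: float) -> str:
    """Conservative conformity decision for one calibration point.

    error       -- found error at this point (same unit as spec_limit)
    uncertainty -- total expanded calibration uncertainty at this point
    spec_limit  -- symmetric accuracy specification (± limit)
    """
    if abs(error) + uncertainty <= spec_limit:
        return "Pass"
    if abs(error) - uncertainty > spec_limit:
        return "Fail"
    return "Undecided (within spec, but not when the uncertainty is included)"

# Hypothetical point: spec ±0.05 °C, found error 0.03 °C, uncertainty 0.01 °C
print(judge_point(error=0.03, uncertainty=0.01, spec_limit=0.05))   # Pass
print(judge_point(error=0.045, uncertainty=0.01, spec_limit=0.05))  # Undecided
```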

This topic has been discussed in more detail in an earlier blog article:

Calibration uncertainty for dummies – Part 3: Is it Pass or Fail?

 

6. Adjustment 

When the calibration laboratory receives your equipment they will first calibrate all the ranges of the equipment and document the results on the calibration certificate. This is often called the As Found calibration.

But what if the calibration equipment is found to fail at some point(s), i.e. it does not meet its accuracy specifications?

Naturally, the laboratory needs to be able to judge if some calibration points are out of the specifications.

Does the laboratory have the capability, tools, and know-how to adjust the calibration equipment so that all the ranges are within the specifications?

Is the equipment adjusted only if it fails the As Found calibration, or is it also adjusted if there is drift and a risk that it would drift outside of the specifications by the time of the next recalibration?

Most calibration laboratories do not optimize the equipment by adjusting the ranges if they are still within the specifications but have some error. This can cause the equipment to drift out of the specifications and fail before the next recalibration.

Some calibration equipment can be difficult to adjust and may require special tools and knowledge.

If the laboratory is not able to do this kind of adjustment you will need to send the equipment elsewhere, possibly to the manufacturer. This will obviously result in a delay and add costs.

If the laboratory can make the required adjustment, will it mean additional costs for you?

You should find out whether the laboratory can perform the required adjustments before sending your equipment for calibration.

This goes for accredited and non-accredited calibration laboratories alike.

 

7. As Found / As Left calibration

If the adjustment mentioned in the previous section is done after the As Found calibration, the equipment needs to be calibrated again after the adjustment is done. This is called the As Left calibration.

Will the calibration laboratory perform both As Found and As Left calibrations if necessary?

Are both As Found and As Left calibration included in the calibration price, or do these cost extra?

 

8. Turnaround time

The turnaround time of the calibration laboratory is another consideration. This also includes the time for transportation both ways.

You don’t want your equipment to be out of service for too long.

 

9. Brand and reputation

The calibration laboratory’s brand and reputation are also something that will affect the choice you make, especially if you don’t have previous experience of that calibration laboratory.

 

10. Price

Price is another factor in the selection process.

Don’t just compare prices, but take into account what you will get for that price.

 

11. Repairs, service, and maintenance

Is the calibration laboratory also capable of performing repairs or other maintenance, if needed?

This also includes firmware updates and other software updates.

 

12. Warranty

Is the calibration laboratory authorized to do warranty service for your equipment, if it is still under warranty?

Most likely the manufacturer’s warranty is going to be void if some other company services the equipment.

In some cases, using authorized calibration/service centers enables you to extend the warranty of your equipment without additional costs.

Is the calibration laboratory’s work covered by some kind of warranty?

 

13. Agreements and reminders

Does the calibration laboratory offer the possibility to make a continuous agreement for future calibrations?

Will the calibration laboratory send you a reminder when it is time for the next calibration?

 

Download this article as a free pdf file by clicking the picture below:

New Call-to-action

 

What do we do at the Beamex calibration laboratory?

The Beamex factory calibration laboratory in Finland has been ISO/IEC 17025 accredited since 1993 and, with over 30 years of experience in calibrations and repairs, serves as the standard for the Beamex USA calibration laboratory.

Since our factory manufacturing facilities are in the same location as the calibration laboratory, we have a very good set of laboratory equipment and automated calibration systems that minimize the risk of human error. It would not be realistic to have that kind of equipment only for recalibration purposes.

Please note that we currently only recalibrate Beamex manufactured devices.

Here is a short list of the things that we do at the Beamex factory calibration laboratory when we get a calibrator back for recalibration:

When a unit is received, it is properly cleaned and any minor service needs are taken care of.

The unit is then calibrated (As Found) and an accredited calibration certificate is created.

If the unit fails in some ranges, these ranges are adjusted; if the unit does not fail but there is some minor drift, the unit will be adjusted to improve its accuracy. If a unit passes but is close to its specification limits, it is adjusted to help prevent it from drifting out of specifications by the time of the next calibration.

If any ranges are adjusted, a new As Left calibration will be carried out.

Most of the calibration work is automated, so we can offer fast and reliable service.

Finally, the firmware of the unit as well as any device description files are updated if needed.

 

Here's a short video on our recalibration services:

 

Here's a short video on our calibration laboratories:

 

Beamex Care Plan

We offer a Care Plan agreement for the calibrators we manufacture.

Care Plan is a contract for the recalibration and maintenance of Beamex equipment, ensuring the equipment stays accurate and operational throughout its lifetime.

A Beamex Care Plan includes the following services:

  • A fixed-term contract (one or three years) – a single purchase order reduces unnecessary admin work and associated costs
  • Annual recalibrations with an accredited calibration certificate (including As Found calibration, any necessary adjustments, and As Left calibration)
  • Free express shipments to and from the Beamex factory
  • Free repairs, even in the case of accidental damage
  • Replacement of wear parts
  • Annual email notification when a calibration is due – allows you to schedule your recalibration needs around any potential outages
  • Applicable updates of firmware, device description files, and so on, ensuring your device has the latest features
  • Priority help-desk services
  • Priority service – expedited turnaround times

Learn more about the Beamex Care Plan.

 

Here's a short video on our Care Plan agreement:

Beamex Service Portal

The Beamex Service Portal is an easy way for you to request a quote or return your Beamex equipment for service or calibration.

Learn more about the Beamex Service Portal.

 

Download this article now!

 

 

Topics: Calibration, Calibration process

How to calibrate a temperature switch

Posted by Heikki Laurila on Dec 02, 2020

Calibrating-temp-switch_1200px_v1

 

Temperature switches are commonly used in various industrial applications to control specific functions. As with any measuring instrument, they need to be calibrated regularly to ensure they are working accurately and reliably – lack of calibration, or inaccurate calibration, can have serious consequences. Calibrating a temperature switch is different from calibrating a temperature sensor or transmitter, for example, so this blog post aims to explain how to properly calibrate a temperature switch. Let’s start!

Table of contents

 

Before we go into details, here's a short video on this topic:

 

Download this article as a free pdf file by clicking the below image:

How to calibrate a temperature switch - Beamex white paper

 

How does a temperature switch work?

In short, a temperature switch is an instrument that measures temperature and provides a required function (a switch opens or closes) at a programmed temperature.

One of the most common temperature switches is the thermostat switch in an electric radiator. You can set the thermostat to the required temperature and if the room is colder than the set temperature, the thermostat will switch the radiator on; if the room temperature is higher than required, the thermostat will switch the heating off.

In practice, there is a small difference between the set and reset points so that the control does not start to oscillate when the temperature reaches the set point. This difference is called hysteresis, or deadband. In the above radiator example this means that when the thermostat is turned to 20 °C (68 °F), the radiator may start heating when the temperature is below 19 °C (66 °F) and stop heating when the temperature is 21 °C (70 °F), showing a 2 °C (4 °F) deadband.
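
A minimal software sketch of that thermostat behaviour could look like this (the 19 °C / 21 °C switching points are simply the example values from the text above):

```python
class Thermostat:
    """Simple on/off control with a deadband around a 20 °C set point."""

    def __init__(self, on_below: float = 19.0, off_above: float = 21.0):
        self.on_below = on_below    # start heating below this temperature
        self.off_above = off_above  # stop heating above this temperature
        self.heating = False

    def update(self, room_temperature: float) -> bool:
        if room_temperature < self.on_below:
            self.heating = True
        elif room_temperature > self.off_above:
            self.heating = False
        # Between the two points the previous state is kept (the deadband),
        # so the control does not oscillate around the set point.
        return self.heating

thermostat = Thermostat()
for temp in (18.5, 19.5, 20.5, 21.5, 20.0):
    print(temp, "->", "heating" if thermostat.update(temp) else "off")
```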

Naturally, there are many different applications for temperature switches in industry.

 

The main principle of temperature switch calibration

We will investigate the details of temperature switch calibration later in this article, but to start, let’s briefly summarize the main principle to remember when calibrating a temperature switch:

To calibrate a temperature switch you need to slowly ramp the temperature at the switch input (the temperature-sensing element) while simultaneously measuring the switch output to see at which temperature it changes its state. Then you need to ramp the temperature back to find the “reset” point, where the switch reverts back to its original state.

When the output changes state, you need to record the input temperature at that exact moment.

The switch output usually only has two states, e.g. open or closed.
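
In software terms, the principle is easy to sketch. The code below is purely a simulation for illustration (it is not an MC6-T or any real calibrator API): a simulated switch closes at 52 °C and opens at 48 °C, and a slow ramp is stepped up and then down while watching for the change of state:

```python
def simulate_switch(temperature: float, closed: bool,
                    set_point: float = 52.0, reset_point: float = 48.0) -> bool:
    """Simulated temperature switch: closes above 52 °C, opens below 48 °C."""
    if temperature >= set_point:
        return True
    if temperature <= reset_point:
        return False
    return closed  # inside the deadband the state does not change

def capture_operation_point(start: float, stop: float, step: float):
    """Ramp the (simulated) temperature slowly and return the temperature at
    which the switch output changes state."""
    closed = simulate_switch(start, closed=False)
    temperature = start
    while (step > 0 and temperature <= stop) or (step < 0 and temperature >= stop):
        new_state = simulate_switch(temperature, closed)
        if new_state != closed:
            return temperature, new_state   # operation point captured
        closed = new_state
        temperature += step
    return None, closed

set_point, _ = capture_operation_point(45.0, 60.0, 0.1)     # ramp up -> set point
reset_point, _ = capture_operation_point(60.0, 40.0, -0.1)  # ramp down -> reset point
print(f"Set point ≈ {set_point:.1f} °C, reset point ≈ {reset_point:.1f} °C, "
      f"deadband ≈ {abs(set_point - reset_point):.1f} °C")
```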

 

Essential terminology

One term commonly discussed is whether a switch type is normally open (NO) (or closing), or normally closed (NC) (or opening). This indicates if the switch contacts are open or closed by default. Usually temperature switches are in their default position when measuring the environmental temperature.

Operating points may also be referred to as Set and Reset points, or On and Off points.

The temperature difference between the operation points is called deadband. Some difference is needed between the closing/opening operating points to prevent the switch from potentially oscillating on and off if they work at exactly the same temperature. For applications that require a very small deadband, additional logic is provided to prevent the switch from oscillating.

The switch outputs may be mechanical (open/close), electronic, or digital.

Dry/wet switches are also sometimes discussed. A dry contact is a voltage-free contact that is simply open or closed, while a wet output uses different voltage levels to represent the two switch states.

Some switches have mains voltage across the contacts when the switch is open. This can be a safety issue for both people and test equipment, so it should be taken into account when testing the switch.

A more detailed discussion on terminology can be found in this blog post:

Pressure Switch Calibration

 

Is your temperature sensor separate or attached?

As a temperature switch needs to measure temperature, it needs to have a temperature sensing element, in other words a temperature sensor.

In some cases the temperature sensor is a separate instrument and can be removed from the switch, while in others the sensor is fixed to the switch so they cannot be separated.

These two different scenarios require very different methods to calibrate the switch.

As explained above, you need to provide a slowly changing temperature at the switch input. How you do this is very different depending on whether the switch has a fixed temperature sensor or a removable one.

Let’s look at these two different scenarios next.

 

#1 - Temperature switch with a separate/removable temperature sensor

In some cases, you can remove the temperature sensor from the temperature switch. The sensor will often be a common standard sensor, such as a Pt100 sensor (or a thermocouple). In these cases you can calibrate the switch without the temperature sensor by using a simulator or calibrator to simulate the Pt100 sensor signal, generating a slow temperature ramp (or a series of very small steps) as the input to the switch.

Naturally you also need to calibrate the temperature sensor itself, but that can be done as a normal temperature sensor calibration at fixed temperature set points, without slowly ramping the temperature, which makes the sensor calibration much easier and gives lower uncertainty.

In accurate applications, the switch may compensate for RTD sensor error using correction coefficients, such as ITS-90 or Callendar-Van Dusen coefficients, so when simulating the temperature sensor your sensor simulator should be able to take this into account.
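
For reference, here is a minimal sketch of the Callendar-Van Dusen calculation a Pt100 simulator performs, using the standard IEC 60751 coefficients (an individually characterized sensor would use its own coefficients instead):

```python
# Standard IEC 60751 Callendar-Van Dusen coefficients for a Pt100 (R0 = 100 ohm)
R0 = 100.0
A = 3.9083e-3
B = -5.775e-7
C = -4.183e-12   # used only below 0 °C

def pt100_resistance(t_celsius: float) -> float:
    """Resistance (ohm) of an ideal IEC 60751 Pt100 at temperature t."""
    if t_celsius >= 0:
        return R0 * (1 + A * t_celsius + B * t_celsius ** 2)
    return R0 * (1 + A * t_celsius + B * t_celsius ** 2
                 + C * (t_celsius - 100) * t_celsius ** 3)

# A slow ramp is then just a sequence of small temperature steps converted
# to resistance values that the simulator outputs to the switch input.
for t in (0.0, 50.0, 100.0):
    print(f"{t:6.1f} °C -> {pt100_resistance(t):8.3f} ohm")
```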

Find out more on temperature sensor calibration in this earlier post: how to calibrate temperature sensors.

You can calibrate the sensor and switch together as a loop; you don’t have to calibrate them separately. But if you don’t have a system that generates a slow, controlled temperature ramp, it is easier to calibrate them separately.

If the removable temperature sensor is not a standard sensor type (neither an RTD nor a thermocouple), then you can’t really calibrate the sensor and switch separately, as you can neither measure nor simulate the signal of the non-standard sensor. In that case you need to calibrate them as one instrument while they are connected.

 

#2 - Temperature switch with an integrated/fixed temperature sensor

If your temperature sensor is fixed to your temperature switch and cannot be removed, you need to calibrate it all as one instrument. In that case you need to generate a temperature ramp with a temperature source that you insert the temperature sensor into.

 

How to calibrate temperature switches

Before calibration 

As with any process instrument calibration, before starting, isolate the measurement from the process, communicate with the control room, and make sure the calibration will not cause any alarms or unwanted consequences.

Visually check the switch to ensure it is not damaged and all connections look ok.

If the sensor is dirty, it should be cleaned before inserting it into the temperature block.

 

Generate a slow temperature ramp as input

If you are calibrating the temperature switch and its temperature sensor together, you need to generate a slow enough temperature ramp in the temperature source where you install the switch's temperature sensor.

This means you need to have a temperature source that can generate a controlled temperature ramp at a constant speed, as slow as the application requires.

In practice you can quickly reach a temperature set point close to the calibration range, let the temperature fully stabilize, and then start slowly ramping the temperature across the calibration range. After the calibration you can quickly return back to room temperature.

A temperature ramp like this is most commonly generated with a temperature dry block. Not all dry blocks are able to generate a suitably slow ramp. And you also need to be able to measure the generated temperature very accurately, while at the same time being able to measure the switch output signal. In addition, the calibration system should have the capability to automatically capture the input temperature at the exact moment when the switch output changes its state.

Not all temperature calibration systems can do all this, but needless to say, the Beamex MC6-T temperature calibrator can do it all fully automatically. And not only that, it can do many other things too, so please make sure you check it out!

 

Use an external reference temperature sensor – don’t use the internal one!

Temperature dry blocks always have an internal reference sensor, but do not use this when calibrating temperature switches! 

The internal reference sensor is located in the bottom part of the temperature block, which is heated and/or cooled. The internal reference sensor is also usually close to the heating/cooling elements and responds quickly to any temperature changes.

From that temperature block, the temperature will transfer to the insert and from the insert it will transfer to the actual temperature sensor. This means that there is always a significant delay (lag) between the internal reference sensor and the sensor being calibrated, located in the hole in the insert.

In a normal sensor calibration, done at fixed temperature points, this delay is not so critical, because you can wait for the temperatures to stabilize. But for temperature switch calibration this delay has a huge impact and will cause significant error in the calibration result!

Instead of using the internal reference sensor, you should use an external reference sensor that is installed in the insert together with the switch’s sensor to be calibrated. The external reference sensor should have similar characteristics to the temperature switch sensor in order for them to behave the same way, with a similar lag.

At the very least make sure that the dimensions of the reference sensor and temperature switch sensor are as similar as possible (e.g. similar length and diameter). Ensuring that the sensors have the same length means they will go equally deep into the insert, with the same immersion depth. Different immersion depths will cause error and uncertainty in the calibration.

Naturally the reference temperature sensor also needs to be measured with an accurate measurement device.

 

Measuring the switch output

Once you have the input temperature ramp figured out, you also need to measure the switch output terminals and their state.

With a traditional open/close switch, you need to have a device that can measure if the switch contacts are open or closed.

If the switch is more modern with an electrical output, you need to be able to measure that. That may be current measurement for an mA signal, or voltage measurement for a voltage signal.

Anyhow, as the switch output has two states, you need to have a device that can measure and recognize both.

 

Capturing the operation points

To calibrate manually you need to start the temperature ramp and monitor the switch output. When the switch’s status changes, you need to read what the input temperature is, i.e. what the reference temperature sensor is reading. That is the operating point of the temperature switch. Usually you want to calibrate both operation points (the “set” and “reset” points) with increasing and decreasing temperatures to see the difference between them, which is the hysteresis (deadband).

If you don’t want to do that manually, then you need a system that can perform all of the required functions automatically, i.e. it needs to:

  • Generate the temperature ramp, going up and down at the required speed, within the required temperature range for the switch in question
  • Measure the switch’s output state (open/close, on/off)
  • Measure the reference temperature sensor inserted in the temperature source
  • Capture the temperature when the switch changes state

The Beamex MC6-T can do all of this and much more.

 

Temperature switch calibration steps – a summary

Let’s finish with a short summary of the steps needed to calibrate a temperature switch:

  1. Pre-calibration preparation (disconnect from process, isolate for safety, visual check, cleaning).
  2. Insert the temperature switch’s temperature sensor and a reference sensor into the temperature source.
  3. Connect the switch’s output to a measurement device that measures the switch’s open/close status.
  4. Quickly ramp the temperature close to the switch’s operation range and wait for it to stabilize.
  5. Very slowly ramp the temperature across the switch’s nominal operation range.
  6. When the switch output changes status (set point), capture the temperature in the temperature source.
  7. Slowly ramp the temperature in the other direction until the switch operates again (reset point). Capture the temperature.
  8. Repeat steps 5 to 7 as many times as needed to find the repeatability of the switch. Typical practice is three (3) repeats.
  9. Ramp the temperature quickly back to room temperature.
  10. Document the results of the calibration.
  11. If the calibration failed and the switch did not meet the accuracy requirements, make the necessary adjustments, repair, or replace it.
  12. Repeat the whole calibration process if adjustments were made in step 11.
  13. Connect the switch back to the process.

 

Temperature switch calibration cycle

The above graph illustrates an example temperature cycle during temperature switch calibration. In the beginning you can quickly reach a temperature point close to the calibration range, let the temperature fully stabilize, and then start slowly ramping the temperature up and down across the calibration range to capture the set and reset points. In this example three calibration repeats were done to record the repeatability of the switch. After calibration you can quickly decrease the temperature back to room temperature.

 

Documenting calibration, metrological traceability, and calibration uncertainty

A few important reminders about temperature switch calibration, or indeed any calibration:

Documentation – calibration should always be documented; typically this is done with a calibration certificate.

Metrological traceability – calibration equipment should have valid metrological traceability to relevant standards.

For more information on metrological traceability, check out this blog post:

Metrological traceability in calibration - are you traceable?

 

Calibration uncertainty – calibration uncertainty is a vital part of every calibration process. You should be aware of how “good” your calibration process and equipment are, and whether they provide low enough uncertainty for the calibration in question.

For more information on calibration uncertainty, please check this blog post:

Calibration uncertainty for dummies

 

Related blogs

If you found this post interesting, you might also like these blog posts:

 

Beamex solution for temperature switch calibration

Beamex provides a fully automatic system for temperature switch calibration. The heart of the solution is the Beamex MC6-T temperature calibrator. The MC6-T is an accurate and versatile temperature calibrator with built-in multifunction process calibrator and communicator technology.

Calibrating-temp-switch_800x800px_v1

 

With the MC6-T you can create the required temperature ramp, measure the switch output, measure the reference temperature sensor, and capture the operation points. And all of this can be done fully automatically. The calibration results are stored in the MC6-T’s memory, from where the results can be uploaded to Beamex CMX or LOGiCAL calibration software, for storing results in databases and generating calibration certificates. The whole calibration process is automatic and paperless.

Please feel free to contact us to learn more about the MC6-T or to book a free physical or virtual online demonstration:

Contact us (Global)

Find your local Beamex partner

 

 

Topics: Temperature calibration

CMMS and calibration management integration - Bridging the gap

Posted by Tiffany Rankin on Oct 22, 2020

Black and Grey Bordered Travel Influencer YouTube Thumbnail Set (1)

Recently, Patrick Zhao, Corporate Instrument & Analyzer SME for Braskem America, spoke at the Beamex Annual Calibration Exchange. He presented on the integration of computerized maintenance management systems and calibration management software at Braskem, the largest petrochemical company in the Americas. The presentation was so well received that we wanted to share some of the highlights with you and also provide a link to the full video recording, found below.

Watch the presentation video recording now! 

 

Braskem, a Beamex customer, uses MC6 calibrators to calibrate field instruments, Beamex CMX software, and the Beamex bMobile solution to perform electronically guided function tests.

In order to improve the automation of their maintenance and calibration work process, they chose to integrate the Beamex calibration software with their plant maintenance management software, SAP and Maximo, using a business bridge.

A business bridge simply allows communication between the maintenance management software (SAP and Maximo, in this case) and the calibration software (Beamex CMX) via an XML (Extensible Markup Language) data file format. 

This enables the sharing of structured data, including position ID, location, serial number, and work order numbers, across the different information systems.  
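
For illustration only, the snippet below builds a tiny XML payload of that kind; the element names are hypothetical and do not represent the actual business bridge schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical field names for illustration only - the real business bridge
# uses its own XML schema defined by the integration.
work_order = ET.Element("WorkOrder", number="WO-100234")
ET.SubElement(work_order, "PositionID").text = "TT-4711"
ET.SubElement(work_order, "Location").text = "Unit 2 / Reactor feed line"
ET.SubElement(work_order, "SerialNumber").text = "SN-998877"

print(ET.tostring(work_order, encoding="unicode"))
# <WorkOrder number="WO-100234"><PositionID>TT-4711</PositionID>...
```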

With the implementation of this business bridge, Braskem has reduced manual interventions and human error. Additionally, their management team can now see all necessary data related to compliance reporting and overall calibration in one place.

Prior to Beamex, Braskem had a pen-and-paper calibration process. That process consisted of an Instrumentation and Electrical (I&E) Technician being assigned a work order, writing down the results, and turning the results in to an I&E Supervisor, who would scan them into a PDF document and then mark the SAP work order as completed. Patrick notes that, “A lot of times that calibration data gets lost. That piece of paper gets lost."

PenandPaperCalibrationProcess

 

After Beamex was implemented, but prior to the Maximo bridge integration, a similar process was used. A calibration technician would be assigned a work order and would use a Beamex calibrator out in the field to perform the calibration. From here, Beamex CMX could automatically send an email to the Process Maintenance Coordinator (PMC), who would then close the SAP work order. An I&E Approver still had to go back into Maximo manually, scan the Beamex calibration certificate into a PDF, and manually attach that PDF to the appropriate work order to close it.

According to Patrick, this could take anywhere from 10-20 minutes per calibration. 

CalibrationSoftwarewithoutMaximoBridge

 

With the implementation of the business bridge between Maximo and Beamex, once the calibration is completed, Beamex sends an email to the PMC. The I&E Approver logs into the Beamex CMX software and clicks the approve button; entering their username and password serves as an electronic signature, after which the calibration results are automatically sent to Maximo and the appropriate work order is automatically completed.

“We save about 20 minutes per calibration/work order with this integration,” states Patrick.

CalibrationSoftwarewithMaximobridge

 

Overall system integration

For Braskem, the system integration combined three programs: SAP is used for functional locations, equipment data, task lists, notifications, and work orders; Maximo also holds functional locations and equipment, as well as the maintenance plan/scheduler, work orders, calibration data, and compliance reports; and Beamex is used for positions and devices, function templates (including functions and procedures), work orders, and calibration results.

The Beamex business bridge created a link between Maximo, which was already linked to SAP, and the Beamex software. By using a web or file service (Braskem uses a web service) which is XML-based, Maximo can speak to the web service which in turn talks to Beamex. Similarly, Beamex can talk back to the business bridge which goes to the web service and then back to Maximo.

 

Key benefits of integration

According to Patrick Zhao, the three key features of this integration are:

1. They can see SAP data inside of Beamex

SAP and Maximo synchronize functional location and equipment data each night. With the Beamex bridge, the functional location is synchronized into the position and the equipment into what is called a device inside of Beamex. Maximo also handles work orders as part of the maintenance plan, and these work orders can be seen both inside SAP and inside Beamex.

DirectlySeeSAPData

 

2. They can automatically generate calibration procedures based on templates

Inside of SAP, in the equipment data, they set up the equipment category: I is for instruments, E for electrical, and B for Beamex-compatible instruments. All equipment marked B can be synchronized into Beamex. They also use the calibration profile in SAP, which defines what type of instrument it is inside of SAP. The same code is used inside of Beamex, so Beamex can pick the correct function and calibration procedure template based on how the equipment is set up in SAP. For example, if you have a pressure transmitter catalog profile, then Beamex knows what it is and can automatically pick the template for a pressure transmitter.
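
Conceptually, the template selection boils down to a lookup from the catalog profile code to a procedure template, gated by the equipment category. The sketch below uses made-up profile codes purely to illustrate the idea; it is not Braskem's or Beamex's actual configuration:

```python
# Hypothetical catalog-profile-to-template mapping (illustration only)
TEMPLATE_BY_PROFILE = {
    "PRESSURE_TX": "Pressure transmitter, 5-point up/down",
    "TEMP_TX": "Temperature transmitter, 3-point",
    "FLOW_TX": "Flow transmitter, 5-point",
}

def pick_template(equipment_category, catalog_profile):
    """Only Beamex-compatible equipment (category 'B') gets a template."""
    if equipment_category != "B":
        return None  # category I or E is not synchronized to Beamex
    return TEMPLATE_BY_PROFILE.get(catalog_profile)

print(pick_template("B", "PRESSURE_TX"))  # -> Pressure transmitter, 5-point up/down
print(pick_template("I", "PRESSURE_TX"))  # -> None
```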

Auto-generateCalibrationProcedures

 

3. The ability to auto-complete a work order based on the calibration data 

The third feature is what Patrick Zhao refers to as the “key to how this thing works”. As before, Maximo generates a maintenance plan and a work order, the I&E Tech completes the work and sends an email notifying the approver, and the approver logs into Beamex, reviews the calibration result, and clicks approve. Once the approve button is clicked, the data is automatically sent back to Maximo. But now this is also seen within SAP, which automatically resets the maintenance plan and generates a new date.

Auto-completeWO

 

Patrick then shared a Beamex calibration result and a screenshot of the Maximo work order calibration record. This area of Maximo was custom programmed for Braskem and customized for Beamex. You can see the calibration number of the Beamex calibration result, the ‘as found’ values, and the ‘as found’ error as a percentage. You can also see the ‘as left’ result, an overall pass or fail, and the actual finish date, which matches the Beamex calibration result.

Auto-completeWO2

 

Conclusion

In conclusion, Patrick states, “Beamex Maximo bridge integration is very critical to our plant maintenance work process. We have had it for two years now and it’s been working very well. We had a lot of support from Beamex. They’re very responsive and it was very pleasant to work with Beamex to get this working.” 

The implementation of this integration means that the Engineering Approver can easily review the data and complete the calibration related maintenance plan work process using just the Beamex CMX software. There is no longer a need to log into Maximo and manually enter the data. 

Braskem installed the standard Beamex CMX software and hired a programmer to program everything on the Maximo side to be able to take the data. The same can be done for SAP. For Braskem, it took approximately 2.5 weeks to complete the programming.

“Braskem America plants are using this Beamex calibration system and Maximo bridge integration every single day to ensure our critical plant instruments are functioning properly, in top performance and that the plant can run safely and produce a polypropylene product for our customers.”

Learn more about how Patrick Zhao and Braskem America have bridged the gap between maintenance management systems and calibration management software by watching his Annual Calibration Exchange presentation. Be sure to stay tuned for the Q&A portion of the presentation for additional insight into the process. 

Watch the video presentation!

Watch Now

 

Looking to implement calibration management software into your infrastructure? 

Contact Beamex at:

 

 

 

 

Topics: Calibration software, CMX

Temperature Calibration Webinars

Posted by Heikki Laurila on Sep 23, 2020

Temperature-webinars-2020_1500px_v1

We have recently held two webinars on temperature calibration; one was presented by Beamex, Inc. in the USA and the other by Beamex Ltd in the UK.

As both webinars discuss temperature calibration, we will share both of them here in the same blog post.

I have put together a table of contents for both webinars so you can easily see what is included and quickly jump to the points that interest you.

Both webinars include a Live Demo session, demonstrating a fully automatic calibration of a temperature sensor.

You can find free webinar recordings and info on upcoming webinars on our webinars page.

 

Basics of Temperature Calibration - webinar

This webinar was done in April 2020 by Beamex, Inc. in co-operation with Chemical Engineering. The presenters are Ned Espy and Roy Tomalino.

Webinar content:

  • 0:00 – Welcome, introduction, housekeeping
  • 2:05 – Presentation of speakers
  • 5:00 – Webinar agenda
  • 6:00 – Quick Poll
  • 7:15 – Temperature terminology
  • 12:15 – Dry Block vs. Liquid Bath
  • 14:00 – Best practices
  • 21:00 – Live Demo - Automatic calibration of a temperature sensor
  • 50:00 – Quick Poll
  • 52:15 – Questions and Answers

 

Watch the webinar!

 

 

Temperature Calibration in the field - webinar

This webinar was done in June 2020 by Beamex Ltd, presenters Andy Morsman and Ian Murphy.

  • 0:00 – Welcome, introduction, housekeeping
  • 0:55 – Introduction of speakers
  • 2:00 – Webinar agenda
  • 3:55 – Temperature terminology
  • 12:00 – Dry block structure
  • 15:30 – Best practices
  • 26:10 – RTD and PRT probes
  • 28:15 – Thermocouples
  • 30:30 – Live Demo - Automatic calibration of a temperature sensor
  • 53:00 – Questions and Answers

 

Watch the webinar!

 

 

Other content on temperature calibration

If you are interested in temperature calibration, you might like these blog posts:

 

Beamex solution for temperature calibration

Beamex offers many solutions for temperature calibration. The webinars already showed the Beamex MC6-T temperature calibrator in action, and other MC6 family products can also be used for temperature calibration.

We also offer different reference sensors for temperature calibration.

The calibration software - both Beamex CMX and Beamex LOGiCAL - can be used for temperature calibration.

Please check the list of Beamex temperature calibrators.

 

 

Topics: Temperature calibration

Sustainability in Energy from Waste

Posted by Heikki Laurila on Aug 11, 2020

edited ERF

Waste not, want not; a phrase coined to denote resourcefulness, the idea of utilising what we have to reduce waste. No one likes to be wasteful if they can help it, whether it’s food, time, money, energy… The list is endless. Being sustainable is all about using what we have. But what happens when we do need to dispose of our unwanted goods? Our recyclables and our waste? How can this process be optimised in order for it to be as sustainable as possible?

There are 4 Pillars of Sustainability:

  • Human
  • Social
  • Economic
  • Environment

If you would like to read more about the '4 Pillars of Sustainability', you can do so in our earlier blog, 'How Calibration Improves Plant Sustainability'. In this article, however, we will focus on the 'Environment' pillar. As the name suggests, this is about how we can collectively work towards being more sustainable and environmentally friendly, and use the resources that we have to find ‘a better way’.

Environmental Sustainability and Energy from Waste?

So how can environmental sustainability be applied to waste disposal?

Energy from Waste (EFW) is the process of burning any combustible municipal waste which then generates energy in the form of heat and electricity. A byproduct of this process, ash, is recycled as an aggregate to the construction industries (it is typically used in the production of tarmac).

At first glance, burning waste appears to be an unethical and unsustainable process, but in reality there are a number of stringent rules that Energy Recovery Facilities (ERFs) have to adhere to before any gases or water vapour are released into the environment. The European Industrial Emissions Directive, or your country’s equivalent, enforces strict rules to ensure the EFW process is conducted under controlled conditions with cleaned emissions.

Below is a diagram of how an ERF operates:

Beamex Energy from Waste image 1

The residual waste is offloaded and burnt in a furnace at a high temperature of +850 °C (1560 °F), the optimum temperature at which materials will combust and the formation of pollutants, such as dioxins, is minimised. The heat creates steam which is used to drive a turbine linked to a generator that produces electricity; this is then exported to the local electricity grid, where the heat and electricity generated are used for domestic and industrial purposes. The byproducts at the end, such as the ferrous and non-ferrous metals and the bottom ash, are recycled.

The flue gases are cleaned during the process using a scrubber and chemical cleaners which convert the noxious gases into clean emissions. Continuous Emission Monitoring Systems, or CEMS, are sensors which operate within the stack to ensure that the final emissions of sulphur dioxide, CO2, carbon monoxide and other pollutants released into the environment are minimised.

Calibration and Efficiency in ERFs

Beamex in the EFW Process

The incinerator needs to operate accurately at a high temperature in order for the combustion process to be efficient, maximising energy production and minimising waste products. Functional safety systems also rely on accurate information; any inaccuracy in the instruments that control or monitor the plant can cause higher emission levels to enter the atmosphere or can compromise the operation of the functional safety system. Typical instruments used to measure and control the incinerator are thermocouples or RTD-type temperature probes, pressure transmitters, and flow transmitters.

The stack contains sensors which measure the pH level in the gases; these require regular calibration in order to ensure that any gas byproduct is clean and to mitigate the possibility of unburnt gases being released into the environment. Increased emission levels can result in penalty charges, loss of R1 certification, loss of environmental permits and even plant closure. Frequent calibration ensures that regulatory requirements are adhered to.

Beamex Solution

The Beamex MC6 calibrator can be used to calibrate all process control instruments to ensure high levels of accuracy for optimum performance, resulting in higher efficiency and reduced levels of CO2 and other toxic gases entering the atmosphere. The MC6 can also be used to record proof-checking operations of the functional safety instrumentation.

The Beamex bMobile Application can be used for recording the calibration of the CEMS which can then be uploaded into CMX, the calibration management software. This provides a secure repository of traceable data which can be documented for regulatory purposes and to also provide a clear calibration history for technicians to ensure that their processes are performing at an optimal level.

So… Waste not, want not?

Perhaps a little ironic and not the most apt notion when referring to Energy from Waste and Energy Recovery Facilities, but the same sentiment can be applied: yes, the waste is being disposed of, but it’s about utilising what we have to be more sustainable, resourceful and as environmentally friendly as we can be. EFW conserves valuable landfill space, reduces greenhouse gases and helps to generate clean energy for domestic and industrial purposes; the byproducts can be recycled, and regular calibration ensures that the process is efficient.

Beamex and Sustainability

At Beamex, we pride ourselves on being a sustainable and responsible business. We have recently received a Silver EcoVadis rating for our Corporate Social Responsibility efforts and are continuing to work hard to progress further with this great accolade. We have five Sustainability Principles that we follow; our environmental one focuses on ‘Care for our environment and respect for ecological constraints’. You can read more about our principles and what sustainability means to us here.

Topics: sustainability

Sanitary temperature sensor calibration

Posted by Heikki Laurila on Jun 23, 2020

Sanitary temperature sensor calibration - a Beamex blog post

 

Sanitary temperature sensors are commonly used in many industries, such as Food and Beverage, Dairy, Pharmaceutical and Life-science. In this post I will take a look at what these sanitary temperature sensors are and how they differ from common temperature sensors.

Calibrating sanitary sensors is different from, and much more difficult than, calibrating normal temperature sensors. In this blog post, I will be discussing the considerations that should be taken into account when calibrating these sensors; it is easy to make mistakes that will cause big errors in the calibration results.

So if you are calibrating sanitary temperature sensors, you should take a look at this blog post.

Sure, there is also some educational content for everybody interested in temperature calibration.

Let's dive in!

 

Download this article as a free pdf file >>

 

Table of contents

What are sanitary temperature sensors?

The role of calibration

Why are sanitary sensors difficult to calibrate?

  • Sensors are very short
  • Sensors often have a clamp connection with a flange

Liquid bath or a dry-block?

  • Liquid bath pros and cons
  • Dry-block pros and cons

How to calibrate in a temperature dry block

  • Using a reference sensor
  • Using internal reference sensor
  • Using a dedicated short reference sensor
  • Short sensor without a clamp connection

Documentation, metrological traceability, calibration uncertainty

Beamex solution for short sanitary sensor calibration

Related blog posts

 

Before we get into the details, here's a short video appetizer on this topic:

 

 

What are sanitary temperature sensors?

Let’s start by shortly discussing what these sanitary temperature sensors are.

Temperature is one of the critical process parameters in many industries and the accurate temperature measurement in processes is a crucial consideration.

Food and Beverage, Dairy, Pharmaceutical and Life-science industries have additional requirements for the temperature measurement sensors because of their processes. They require temperature sensors that are “sanitary”, meaning that these sensors need to be suitable to be installed in hygienic and aseptic process environments.

These sensors need to be hygienic and designed to be easy to clean, often supporting the clean-in-place (CIP) process (cleaning without disassembly).  The mechanical design needs to be free from any cavities, dead-pockets, gaps or anything that would complicate the hygienic cleaning.

Surface finishes of these sensors are hygienically graded and need to meet the strict standards in these industries, such as 3-A (https://www.3-a.org/) or EHEDG (European Hygienic Engineering & Design Group, https://www.ehedg.org/).

The material of the wetted parts in these sensors is often high-grade stainless steel, suitable for these applications.

One very common feature in these sanitary temperature sensors is that they are typically very short. This makes the calibration way more difficult than with normal temperature sensors.

Another thing that makes the calibration difficult is the large metallic flange needed for the clamp installation.

The temperature ranges typically go up to around 150 °C (300 °F), or in some cases up to 200 °C (400 °F), so that is not very challenging.

More on these calibration challenges in the following chapters.

Sanitary temperature sensor calibration - a Beamex blog post

Back to top ↑

The role of calibration

In any industry, it is vital that the process measurements do measure correctly and as accurately as designed. This can be achieved with the help of suitable process instruments and with a proper calibration program.

Within the Food and Beverage, Pharmaceutical and Life-science industries, calibration plays an even more important role than in most other industries. In these industries, a bad or failed calibration can have dramatic consequences, since consumer and patient health and safety are at stake. As a failed calibration can also be very costly in these industries, it has to be avoided by all means.

These industries also have dedicated strict regulations concerning calibration, such as various FDA regulations.   

More generic information about calibration can be found in other articles in this blog and on the page What is Calibration?

 

Back to top 

Why are sanitary sensors difficult to calibrate?

Let’s discuss next why these sanitary sensors are difficult to calibrate.

 

1. Sensors are very short

As mentioned earlier, these sanitary temperature sensors are typically very short. Most often less than 100 mm (4 in), typically around 50 mm (2 in), but can also be as short as 25 mm (1 in).

The outer dimension of the sensor typically is 3 mm (1/8 in) or 6 mm (1/4 in).

The commonly used practice in temperature calibration (and a EURAMET guideline recommendation) is that a temperature sensor should be immersed deep enough to achieve sufficient accuracy. The recommendation is to immerse it to a depth that is 15 times the sensor diameter (plus the length of the sensor element). But with these short sanitary sensors it is simply impossible to immerse the sensor to a sufficient depth during the calibration, because the sensor is so short compared to its diameter.

For example, a typical sanitary sensor with a diameter of 6 mm (1/4 in) should be immersed (15 x 6 mm) into at least 90 mm (3.5 in) depth during the calibration, to ensure accurate results. But if that 6 mm (1/4 in) sensor has a length of only 50 mm (2 in), sufficient immersion is simply not possible.

When not immersed deep enough, additional error and uncertainty will be caused in the calibration.

In an earlier blog post, how to calibrate temperature sensors, our temperature calibration lab people gave these rules of thumb for the immersion depth (when calibrating in a liquid bath):

  • 1% accuracy - immerse 5 diameters + length of the actual sensing element inside the sensor
  • 0.01% accuracy - immerse 10 diameters + length of the sensing element
  • 0.0001% accuracy - immerse 15 diameters + length of the sensing element

The “accuracy” in the above rule is to be calculated from the temperature difference between the block temperature and the environment temperature.

 

Example: if the environment temperature is 20 °C and the block temperature is 120 °C, there is a 100 °C difference. If you then immerse the probe only 5 times the diameter (plus the sensing element length) – say you have a 6 mm probe with a 10 mm sensing element inside it and you immerse it 40 mm (5 x diameter + sensing element) – you can expect about 1 °C of error due to the low immersion (1% of 100 °C).
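
The same rule-of-thumb arithmetic can be written out as a small sketch; the percentage tiers are just the rules of thumb quoted above, nothing more exact:

```python
def immersion_error_estimate(block_temp, ambient_temp, diameter_mm,
                             element_length_mm, immersion_mm):
    """Rough immersion-error estimate based on the rules of thumb above."""
    diameters = (immersion_mm - element_length_mm) / diameter_mm
    if diameters >= 15:
        relative_error = 0.000001   # 0.0001 %
    elif diameters >= 10:
        relative_error = 0.0001     # 0.01 %
    elif diameters >= 5:
        relative_error = 0.01       # 1 %
    else:
        raise ValueError("Less than 5 diameters immersed - expect more than "
                         "1 % of the temperature difference as error")
    return relative_error * (block_temp - ambient_temp)

# The example from the text: 6 mm probe with a 10 mm sensing element,
# immersed 40 mm, block at 120 °C and ambient at 20 °C.
print(f"Expected error ≈ {immersion_error_estimate(120, 20, 6, 10, 40):.1f} °C")
```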

 

Picture: The below picture illustrates the commonly used relationship between thermometer immersion depth (in diameters) and the relative error as a share of the temperature difference (between the temperature block and the environment). If you don't immerse at all, you naturally get a 100% error, and if you immerse deep enough the error caused by immersion becomes insignificant. At around 5 diameters of immersion, the error is about 1% of the temperature difference:

graph - error vs immersion

 

This rule of thumb can become quite significant at higher temperatures and/or for extremely short sensor lengths. So, keep this in mind with sensors less than 40 mm or 1-1/2 inches. Also, it may be worth having a conversation with a design engineer to figure out a way to increase the sensor length.

Naturally this accuracy limitation is valid also when the sensor is installed in the process and measuring the process temperature - being too short, the sensor is not able to accurately measure the process temperature!

It is not always easy to know the length of the actual sensing element inside the probe. If that is not mentioned in the datasheet, you can ask the manufacturer.

So how do you calibrate these short sensors that cannot be immersed deep enough?

This will be discussed in later chapters.

 

2. Sensors often have a clamp connection with a flange

As mentioned in the previous chapter, these sanitary sensors are too short compared to their diameter to allow proper immersion, which causes heat leaks and adds error and uncertainty to the calibration.

As if this were not enough, these sensors also often have a so-called clamp connection (Tri-clamp, ISO 2852, DIN 11851, DIN 32676, BS 4825, Varivent, etc.), which means there is a relatively large metallic flange that conducts heat away from the sensor. In practice, this heat leak causes the sensor to read a slightly lower temperature (when calibrating at a temperature higher than the environment temperature).

Sanitary temperature sensor calibration - a Beamex blog post

This kind of flange makes the calibration more difficult in several ways:

First, the flange creates a heat leak from the sensor: the bigger the flange, and the bigger the temperature difference to the environment, the bigger the leak.

Because the sensor is at the same time very short, this heat leak causes the sensor to measure an erroneous temperature.

Back to top 

Liquid bath or a dry-block?

Generally, you can calibrate temperature sensors in a liquid bath or in a dry-block. This is also the case with the sanitary temperature sensors.

Let’s discuss next what these are and what are the main pros and cons of both.

 

Liquid bath

As the name suggests, a temperature liquid bath has liquid inside. The liquid is heated / cooled to the required temperature and the temperature sensors to be calibrated are inserted into the liquid. Often the liquid is stirred for even temperature in the liquid.

 

Liquid bath pros and cons

A liquid bath makes it easy to insert sensors of any shape, and a reference probe can be inserted at the same time. Depending on the size of the bath, you may also be able to calibrate several sensors at once. Even an odd-shaped sensor will still fit inside a liquid bath.

A liquid bath often enables better uniformity and accuracy than a dry-block due to better heat transfer of liquid.

So, this is starting to sound like a favorable option?

A liquid bath nevertheless has several drawbacks, which is why it is not always the best option:

  • A liquid bath always includes some sort of liquid, such as silicone oil, and often you don’t want to contaminate the sanitary sensor in such a liquid. There is a lot of cleaning after the calibration to ensure that the sensor is clean when installed back into the process.
  • Handling of hot oil is dangerous and any spills may cause injuries.
  • Any oil spills make the floor very slippery and can cause accidents.
  • Liquid baths are very slow. Even if a bath can fit several sensors at the same time, it is often several times slower than a dry-block, so the overall efficiency is not really any better. Sometimes people have several baths, each set to a different temperature, and move the sensors manually between the baths to skip the time it takes a bath to change temperature. This may work in a calibration laboratory but is naturally a very expensive way to calibrate.
  • The sanitary sensor should be placed so that the surface of the liquid touches the bottom of the flange, but in practice this is not always easy to do. For example, silicone oil has a fairly large thermal expansion, which means that the surface level changes slightly as the temperature changes, so you may need to adjust the height of the sanitary sensor during the calibration. Also, due to the stirring of the liquid there are small waves on the surface, and the liquid level is often deep in the bath, so it is difficult to see that the sensor is at the right depth.
  • Liquid baths are often large, heavy and expensive equipment.

 

Dry-block

A temperature dry-block (or dry-well) is a device that can be heated and / or cooled to different temperature values, and as the name suggests, it is used dry, without any liquids.

 

Dry-block pros and cons

As the earlier chapter discussed the pros and cons of a liquid bath in this application, let’s look at the same for the dry-block.

The main pros of calibrating the sanitary sensor in a dry-block include:

  • As it is dry, it is also clean and does not contaminate the sanitary sensor to be calibrated. Sure, the sensor should still be cleaned after calibration, but the cleaning is much easier than with a liquid bath.
  • A dry-block is also very fast at changing temperature.
  • When using a dedicated insert with proper drillings, it is easy to insert the sanitary sensor the same way every time (no adjustments), so the calibration is repeatable every time and with different users.
  • A dry-block is light and easy to carry compared to a liquid bath.
  • Typically, a dry-block is also cheaper than a liquid bath.

On the downside, a dry-block is less accurate than a liquid bath, typically calibrates only one sanitary sensor at a time, and needs different inserts drilled for different sensor diameters.

Despite these downsides, customers often prefer to calibrate their short sanitary sensors in a dry-block.

So, let’s discuss next the different considerations when calibrating in a dry-block.

 

Back to top 

How to calibrate in a temperature dry block

To calibrate these short sanitary sensors in a temperature dry-block, there are a few considerations to take into account.

 

Using a reference sensor

Firstly, when you do the calibration in a temperature dry-block, the flange of the sanitary sensor makes it impossible to use a normal external reference sensor in the same insert because it simply does not fit: the flange covers the top of the insert and all the holes in it.

 

Picture: The first picture shows the calibration of a normal (long, no flange) temperature sensor using a reference probe; the second shows a short sanitary sensor with a flange. The flange of the short sensor covers all the holes in the insert, so it is not possible to insert a normal reference temperature probe:

 

Sanitary temperature sensor calibration - a Beamex blog post  Sanitary temperature sensor calibration - a Beamex blog post

 

Using internal reference sensor

The dry-block always includes an internal reference sensor. Trying to use this internal reference sensor just does not work, because it is located close to the bottom of the temperature block, while the short sensor to be calibrated sits in the very top part of the insert. Dry-blocks typically control the temperature gradient only over a limited zone at the bottom of the insert. The top part of the insert typically has a larger temperature gradient, so the top of the insert is not at the same temperature as the bottom. The size of the gradient depends on the temperature difference between the insert and the environment, and on how deep you go in the insert.

 

Picture: The internal reference sensor is located at the bottom of the temperature block, while the short sanitary sensor is located in the very top part of the insert. There is a temperature gradient in the insert, causing the top of the insert to be at a different temperature than the bottom. This causes error in the calibration:

Sanitary temperature sensor calibration - a Beamex blog post

 

 

Using a dedicated short reference sensor

As the internal reference sensor in the bottom of the dry-block is not suitable, we need to use a dedicated external reference temperature sensor.

However, this reference sensor cannot be a normal long reference sensor, as discussed earlier.

The solution is to use a dedicated reference sensor that is short enough to be immersed to the same depth as the sanitary sensor being calibrated. Optimally, the midpoints of the sensing elements should be aligned at the same depth.

Also, the reference sensor needs to have a thin, flexible cable so that the cable can fit under the flange of the sanitary sensor. To help with this, we can make a groove in the top of the insert where the reference sensor cable fits, so that the flange of the sanitary sensor still touches the top of the insert.

Naturally, the structure of the temperature dry-block needs to be such that the sanitary sensor with its flange fits into place and touches the top end of the insert (in some dry-blocks the surrounding structure prevents the flange from going deep enough to touch the top of the insert).

 

Picture: A dedicated short reference sensor is located at the same depth as the short sanitary sensor to be calibrated, ensuring they measure the exact same temperature. Also, the reference sensor cable sits in the groove, so it does not prevent the flange of the sanitary sensor from touching the top of the insert:

Sanitary temperature sensor calibration - a Beamex blog post

 

 

Picture: Some examples of what a dedicated insert for sanitary sensor calibration could look like. The holes for the sanitary sensor and the reference sensor are equally deep, and there is a groove for the reference sensor cable:

Insert pictures

 

Short sensor without a clamp connection

There are also short temperature sensors without the clamp connection and without the flange. With these sensors you should use an external reference sensor immersed to the same depth as the sensor being calibrated. The reference sensor should be as similar as possible to the sensor being calibrated (similar diameter, similar response time, etc.).

The internal sensor in the dry-block cannot be used here either since it is located in the bottom of the temperature block and does not measure the same temperature as the short sensor.

 

Picture: Calibrating a short sensor (without a flange) using a short reference sensor:

Sanitary temperature sensor calibration - a Beamex blog post

 

 

Download this article as a free pdf file by clicking the picture below:

Sanitary Temperature Sensor Calibration - Beamex blog post

 

Back to top 

Documentation, metrological traceability, calibration uncertainty

There are many additional things that are important in every calibration; as there are separate articles on many of them in this blog, I will only mention them briefly here:

As documentation is included in the formal definition of calibration, it is a vital part of every calibration. This naturally also applies to sanitary temperature sensor calibration. Typically, the documentation takes the form of a calibration certificate.

The calibration equipment used should have valid metrological traceability to the relevant standards; otherwise the calibration does not provide traceability for the sensor being calibrated. More info on metrological traceability can be found here:

 

Calibration uncertainty is a vital part of every calibration. If the calibration equipment (and the calibration method and process used) is not accurate enough for the sensor being calibrated, the calibration does not make much sense. After all, what’s the point of using a 2 % accurate calibrator to calibrate a 1 % accurate instrument?
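As a rough illustration of that point - a minimal sketch only, with made-up numbers and a hypothetical helper function, not a Beamex tool - the test uncertainty ratio (TUR) between the device’s tolerance and the calibrator’s uncertainty makes it obvious:

```python
def test_uncertainty_ratio(device_tolerance_pct, calibrator_uncertainty_pct):
    """Test Uncertainty Ratio (TUR): tolerance of the device under test
    divided by the uncertainty of the calibration equipment.
    Illustrative numbers only."""
    return device_tolerance_pct / calibrator_uncertainty_pct

# A 1 % sensor checked with a 2 % calibrator gives a TUR of 0.5 - not useful
print(test_uncertainty_ratio(1.0, 2.0))
# The same sensor with a 0.25 % reference gives a TUR of 4,
# a commonly quoted rule-of-thumb target
print(test_uncertainty_ratio(1.0, 0.25))
```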

 Learn more about calibration uncertainty here:

 

Back to top 

Beamex solution for short sanitary sensor calibration

Beamex MC6-T

 

The Beamex MC6-T is an extremely versatile portable automated temperature calibration system. It combines a temperature dry-block with Beamex MC6 multifunction process calibrator and communicator technology.

The Beamex MC6-T150 temperature calibrator model is perfectly suited for calibrating these kinds of short sanitary temperature sensors. The MC6-T150 can be provided with custom inserts to match your specific sensors.

The Beamex SIRT-155 temperature sensor is a very short and accurate temperature sensor with a thin flexible cable, designed to be a perfect companion with the MC6-T150 for this application.

Using the MC6-T in conjunction with Beamex calibration software, CMX or LOGiCAL, enables you to digitize and streamline your whole calibration process.

 

Pictures: In the first picture below we can see the Beamex MC6-T with a dedicated insert for sanitary sensor calibration. The second picture shows the short reference sensor (SIRT-155) being installed. The third picture shows the sanitary sensor to be calibrated being installed. Finally, the fourth picture shows everything installed and ready for the automatic calibration to start:

Calibrating sanitary sensor with Beamex MC6-T Calibrating sanitary sensor with Beamex MC6-T

 

Calibrating sanitary sensor with Beamex MC6-T Calibrating sanitary sensor with Beamex MC6-T

 

If you want to learn more, or to see a demonstration of how to calibrate sanitary temperature sensors with the Beamex solution, please feel free to contact us.

Fill in the Contact Request Form or find our worldwide contacts.

 

Back to top 

Related blog posts

If you found this article interesting, you might also be interested in the following articles and eBooks:

 

Feel free to add comments or questions, share the article, or suggest interesting topics for new blog articles.

Thanks for taking the time to read!

Back to top 

 

Topics: Temperature calibration

Future calibration trends by calibration experts in the pharmaceutical industry

Posted by Heikki Laurila on May 07, 2020

Future calibration trends by calibration experts in the pharmaceutical industry

 

We regularly organize user group meetings for our pharmaceutical customers. During a recent meeting, we interviewed some of these pharmaceutical calibration experts on future calibration trends, and we wanted to share the result with you.

In this article, you can read what these calibration experts from the world’s top pharmaceutical companies think about calibration challenges, future trends and other calibration related topics.

The following people were kind enough to join the video interview:

  • Boehringer Ingelheim, Ingo Thorwest
  • Boehringer Ingelheim, Eric Künz
  • Boehringer Ingelheim, Alexander Grimm
  • GlaxoSmithKline, Don Brady
  • GlaxoSmithKline, Simon Shelley
  • Novartis, Kevin Croarkin
  • AstraZeneca, Tomas Wahlgren

In addition, written replies were given by delegates from Lonza, Astellas, AstraZeneca and GlaxoSmithKline.

 

The following questions were asked of all the delegates:

  1. What are your biggest calibration challenges?
  2. How do you see your calibration changing in the next 5 years?
  3. Do you see any future technology changes that could affect the need to calibrate or how to perform your calibrations?
  4. Do you see the industry digitalization changing your calibration?

 

You can find a summary of the interviews in the video below, and in written form in the “Transcription” section.

We trust that this information is useful for you and you can learn from these comments. Please feel free to share your questions and comments in the comments section at the end.

 

Executive summary (1 minute read)

If you don't have time to read the whole article, here is a quick-read executive summary of the article discussions:

 

What are your biggest calibration challenges?

For pharmaceutical companies, the compliance to regulation is naturally a vital consideration.

A challenge that repeats in many answers is data integrity, i.e. producing calibration data without any media breaks. There is a drive to remove paper-based recording and approval from calibration solutions and to digitalize the whole calibration process.

Another recurring comment is mobility, together with the security and data integrity of mobile devices.

Also, implementing a standardized calibration solution across multiple sites globally is considered a challenge.

 

How do you see your calibration changing in the next 5 years?

The most frequently repeated comment is the wish to get rid of paper-based systems and to digitalize the calibration process.

Integration of the calibration system with other systems (such as maintenance management systems) also seems to be a common theme.

Using the calibration data in other systems is also mentioned, as is the drive for improved mobility.

 

Do you see any future technology changes that could affect the need to calibrate, or how to perform your calibrations?

When discussing future technology, the comments included: cloud technology, automatic calibration, digitalization enabling paperless calibration, more productivity through more efficient calibration, using calibration data for analysis, integration of systems, increased mobility, and naturally the effects of Industry 4.0.

 

Do you see the industry digitalization changing your calibration?

Most delegates commented that they will definitely be going digital and are excited to do so.

Other comments include improved data analytics, increased mobility, better connectivity of systems, expecting digitalization to improve data integrity and the development of the DCC (digital calibration certificate) standard.

 

Video interviews

Below you can find the highlights of the video interviews:

 

 

Many of the world’s leading pharmaceutical and life sciences companies depend upon Beamex calibration solutions. Book a free consultation with our pharma calibration experts to find the best calibration solution for you.

Book a free consultation

 

 

Transcription of the video

Here you can find the transcription of the above video interviews:

 

1. What are your biggest calibration challenges?

 

Ingo Thorwest, Boehringer Ingelheim

I think the challenge in Pharmaceutical industry is all about data integrity. Producing data without any media break is of an absolute importance for us.

 

Don Brady, GlaxoSmithKline

I would say compliance data and mobility. Compliance, because all of our data that we capture has to be ALCOA - Attributable, Legible, Contemporaneous, Original and Accurate. That's something we worked with providers and Beamex with over the years to get us to that point. Mobility, it is not as easy as implementing a mobile solution, it has to be compliant, it has to be unchallengable, and that’s where we are at today.

 

Simon Shelley, GlaxoSmithKline

 I still think that data integrity is one of our biggest challenges that we are facing. The roll out of Beamex has certainly helped but we still get a number of issues at the sites that are slow to adopt to the solution. We need to look at the advantages we can get from using the technology to move us to paperless and hopefully reduce our data integrity issues.

 

Alexander Grimm and Eric Künz, Boehringer Ingelheim 

Alexander: So, at the moment, in the pharma company, we are facing very specific regulations regarding calibration, and I think that the main challenge we have at the moment is about documentation of calibration, and managing calibration data. 

Eric: From the IT point of view, we need to react on that requirement to find the right solution and to bring in more structured data input to the calibration management solution.

 

Kevin Croarkin, Novartis 

I think that until very recently and probably still now, is the ALCOA process and data integrity issues in general. 

The other major challenge that we have, is a lot of calibrations are being externalized. Meaning in the past, we would have had internal people, on site doing the calibrations, where now, we have companies that come in and do the calibrations for us. 

 

2. How do you see your calibration changing in the next 5 years?


Tomas Wahlgren, AstraZeneca

I think it is going to be more integrated with other systems, like the CMS and other types of applications, like laboratory systems or something like that. 

I also see that we must increase the speed of how we perform the calibrations, because it must go quicker and easier, but of course we still need to have the quality in it.

 

Don Brady, GlaxoSmithKline 

Integration. Integration of data; it is not just a matter of gathering calibration data and archiving it, it is now a matter of integrating it with data from the systems that we have taken the calibrations from, and using all of that to create a big picture of the machine we are actually working on.

 

Ingo Thorwest, Boehringer Ingelheim

In the next few years or so it will definitely change into digitalization. Avoiding media breaks, being more and more in partnership with contractors that come in.

 

Simon Shelley, GlaxoSmithKline

I am not sure that calibration itself will change that much, but I think the way that the data will be used will change tremendously. There will be a lot more data usage and therefore the liability of that data will go up.

 

Alexander Grimm and Eric Künz, Boehringer Ingelheim 

Alexander: My assumption is that the pure calibration process: how you handle a real instrument might not have that many changes. But again, talking about documentation, we really hope to get more digitalized, to get rid of paper, to have a completely lean and digital process in the future. 

Eric: That also means for me that for process execution, we bring in the right technology, mobile devices and improved data input media in place, which also means change to IT from an infrastructure point of view, because we have to make sure we have the right infrastructure in place like wireless solutions, Wi-Fi connection or maybe offline capabilities.

 

Kevin Croarkin, Novartis 

My vision would really be, first of all, that we are not using any paper. 

Taking that forward is where our external companies are coming in with their own tools, doing the job and literally just sending us a digital file transfer afterwards when they have the job completed. 

 

3. Do you see any future technology changes that could affect the need to calibrate or how to perform your calibrations?

 

Ingo Thorwest, Boehringer Ingelheim

All technology avoiding media breaks will become more and more important and will change our way of calibrating.  Developing technologies in these environments, in cloud technology, will definitely be one of the changes of the next years

 

Don Brady, GlaxoSmithKline

At the moment we are doing a lot on machine learning, predictive maintenance, which will hopefully lead to less calibration. We look at that we have calibrated this machine 10 times in the last 5 years and it has not failed, so now just let us calibrate it 5 times in the next 5 years. Machine learning has a big part to play in that as well, where we think it is going to automate the whole calibration scheduling and the whole calibration act; where a user can just step back, click a button and go, and it will work seamlessly always remaining compliant, and aligning to the ALCOA goals mentioned previously. That is how we see it changing.

 

Simon Shelley, GlaxoSmithKline

First of all, we are moving into more paperless integrated solution, so, as our technicians go more paperless for all our activities, calibration will be just one of those routines that is also paperless. 

So, I think we will use the data more increasingly for diagnostics and to try increase productivity and just make the operators life more simple by giving them more data available in their hands. I think mobile technology is going to be a real driver for that.

 

Alexander Grimm, Boehringer Ingelheim

By the hope of a higher grade of digitalization, we hope that we have significant improvement in regards of data integrity, but on the other hand also savings and efficiency. 

 

Kevin Croarkin, Novartis

Obviously, at the moment predictive maintenance and digital engineering are the buzzwords in our industry.

I think sometimes people forget that there is a commonly used maintenance pyramid where at the very bottom it is reactive, and then you work your way up where you do preventive maintenance, you’re doing condition-based maintenance, predictive and the proactive. So, it’s really encompassing that whole triangle. 

I think the other side to it is that we are moving toward a more digital and mobile workforce as well.

Obviously with the latest version of CMX and bMobile, we now have a situation where our technicians are able to go into the field, particularly into the hazardous areas where there is no Wi-Fi, with a tablet that they can hold in one hand, which is a massive improvement from the past when they needed a backpack to carry on the front of them. 

 

4. Do you see the industry digitalization changing your calibration work?

 

Ingo Thorwest, Boehringer Ingelheim 

We will go digital, definitely. We see our company already going this way, and we see other companies doing this already. It is not only for calibration, it is in all fields of processing data. So, being on paper in 5 years - nobody will talk about that anymore.

 

Tomas Wahlgren, AstraZeneca 

More data analytics, we are collecting a lot of data and we use the data for planning for the future. If we can use a big amount of data from calibration, we can prepare and say that we do not need to calibrate so often, or we can change the way of calibration. 

 

Don Brady, GlaxoSmithKline

In the 3 to 5 year period. I think it is just about making it easier for the technicians to do calibrations, and again with mobility. 

You need the data that we are historically collecting, so that we can analyze where machine learning fits best.

That is the biggest change that is coming, and obviously Industry 4.0. All of those things will play an important part, where everything is connected and everything is integrated. 

 

Simon Shelley, GlaxoSmithKline

The digital revolution is exciting and lots of people are investing in it heavily. I think the area that is going to amplify is the vulnerability of the OT space - operational technology. 

With cybercrime going up, we are seeing a number of suppliers that are more integrated becoming victims themselves, and that's having a secondary impact on us. 

I think we will see a growth in that interconnectivity between companies as they offer services to each other, but I think we are also going to see an increased focus on the OT cyber security.

 

Other relevant blog posts

If you found this article interesting, you might also like these blog posts:

 

 

Many of the world’s leading pharmaceutical and life sciences companies depend upon Beamex calibration solutions. Book a free consultation with our pharma calibration experts to find the best calibration solution for you.

Book a free consultation

 

Beamex’s solution for pharmaceutical industry

The world’s leading Pharmaceutical and Life Science companies depend upon Beamex calibration solutions. Our solutions have been developed for over 40 years by combining our own experience and feedback gained through close partnerships with our customers to help them achieve their calibration related goals of compliance to regulation and productivity gains.

Many of these customers have selected Beamex CMX calibration software, calibrators, and comprehensive launch and support services for their enterprise-wide solution. Calibration data has been connected and shared with their globally deployed ERP, maintenance system or instrument and asset management system, to achieve end-to-end paperless calibration, streamlined processes and truly mobile working, while maximizing the integrity of the data to achieve the highest levels of compliance.

The Beamex calibration solution fulfills the requirements of 21 CFR Part 11 and other relevant regulations for electronic records, electronic signatures and data integrity. The CMX calibration software version 2.11 introduced the “Mobile Security Plus” feature, which offers enhanced functionality and is compatible with offline mobile devices, such as the Beamex MC6 family of documenting calibrators and tablets/smartphones with the Beamex bMobile calibration application. This enhancement further lowers the risk of ALCOA violations by identifying those using offline mobile devices by their electronic signature and prevents offline data tampering.

We offer tools for the mobile worker, including calibration references for pressure, temperature and electrical signals, the Beamex MC6 family of portable documenting calibrators, and the Beamex bMobile application for tablet-based data entry - suitable for everything from clean rooms to the most arduous industrial environments, with the ability to document and sign calibration results in the field. Beamex Mobile Security Plus technology ensures these portable devices deliver the highest levels of data integrity, well in line with regulations such as those of the FDA and MHRA.

 

Please contact us to discuss how we can help you!

 

 

Topics: Calibration, Calibration in pharmaceutical industry

Pressure Switch Calibration

Posted by Heikki Laurila on Mar 30, 2020

banner_Pressure-switch-calibration_1500px_v1

Pressure switches are very common instruments in the process industry, and various kinds of pressure switches are available. Like many instruments, pressure switches need to be calibrated to ensure their accuracy and reliability. Switches are a bit more difficult to calibrate than transmitters. The wrong kind of calibration can cause many errors in the calibration result. In this article, we will look at how to properly calibrate pressure switches.

Before rushing into the calibration process, let's discuss some fundamental characteristics and terminology of pressure switches.

Download this article now!

 

How does a pressure switch work?

Briefly stated, a pressure switch is an instrument that measures pressure and that has an electrical switch function programmed to operate at a certain pressure.

For example, it can be set so that when no pressure is connected (open to atmosphere) the switch is closed, but when pressure increases up to 10 psi, the switch opens. Again, when the pressure drops below 10 psi, the switch closes.

 

Pressure switch terminology

Let’s first very briefly discuss the related terminology:

 

Normally open / Normally closed

Some switches have the switch terminals open when no pressure is connected; these are called normally open (NO), or closing switches. The opposite is normally closed (NC), or an opening switch. The selection depends on what kind of circuit you want to drive with the switch.

What is "normally"? There is some debate about the definition of a normally open/closed switch. Most commonly it is defined as the state of the pressure switch output when it is not connected to any pressure, i.e. when there is no physical stimulus. 

Others may define the “normal” state as the state where the switch is during the normal operation of the process (un-tripped).

Pressure-switch_normally-close-and-normally-open_1500px_v2

 

A normally-open switch is open when no pressure is connected. When enough pressure is applied, the switch closes:

Pressure switch calibration - Normally-Open switch - Beamex blog post

 

A normally-closed switch is closed when no pressure is connected. When enough pressure is applied, the switch opens:

Pressure switch calibration - Normally-Closed switch - Beamex blog post

 

A switch will always have some deadband, which is the difference between its two operating points (the opening and closing points). Deadband is required because if a switch opened and closed at the same point, it could start oscillating when the pressure is at that limit and could switch the circuit on and off at a high frequency. For example, a closing (NO) pressure switch may close at 10 psi and open again at 9.5 psi, giving a 0.5 psi deadband.
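As a minimal arithmetic sketch (reusing the psi values from the example above; the helper function is just for illustration):

```python
def deadband(operate_pressure, return_pressure):
    """Deadband = difference between the two operating points of a switch."""
    return abs(operate_pressure - return_pressure)

# Closing (NO) switch from the example: closes at 10 psi, opens again at 9.5 psi
print(deadband(10.0, 9.5))  # -> 0.5 psi of deadband
```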

Some switches operate on rising pressure, others on falling pressure. Of course, you always get one function on rising pressure and the other on falling pressure, but the primary desired function happens in one direction.

There are pressure switches that operate with different pressure types: gauge, absolute, differential or vacuum pressure.

Some older switches are mechanical (or even pneumatic), so inside the switch the pressure is causing the switch to change its state. Most newer types are electronic or digital, so they measure the pressure and control the switch output accordingly. Many modern switches are programmable, so it is easy to set the desired operating points. While mechanical switches don’t need any power supply, the electrical ones need to have one.

When selecting the switch type, consider which state is safe: if the power supply fails or a cable comes loose, the switch status should remain safe. In the case of a safety switch, it should be configured so that if a cable comes loose, the alarm is raised. For example, with a normally open (closing) switch, you won't notice anything if the cable comes loose - the circuit simply stays open - but the desired action will not happen when the switch closes. All in all, you should design the installation to be fail-safe.

We also talk about dry and wet switches. A dry switch simply has contacts that are open or closed, like a mechanical switch. A wet switch has two different voltage values representing the two output states.

The output of an electrical wet switch can be a voltage signal with two levels, a current signal, or an open collector type signal.  

Sometimes the switch function can also be implemented in the control system, by measuring the current signal from a transmitter and programming a switch-like function to control something based on the signal level.

In practice, industrial switches often have double switch contacts that can be programmed separately. These can be the normal Lo and Hi points, but also “Lo Lo” and “Hi Hi” points. While the Lo and Hi are the normal control points, the Lo Lo and Hi Hi are alarm limits that trigger more serious alarm actions.

 

Safety pressure switches 

Safety switches are used in safety instrumented systems (SIS), and they have certain safety classifications. The calibration of these safety switches is also regulated.

A big difference with these switches is that they stay static most of the time without ever operating. They don’t toggle open and closed in normal use; they just wait until the safety alarm level is reached, and only then do they operate.

As these switches very rarely operate, there is a risk that they will get stuck and not work when they should.

When calibrating, do not exercise safety switches prior to calibration; instead, capture the very first point at which the switch operates. The first operation may require more pressure than the operations after a few exercise cycles.

Normal switches are typically exercised a few times before calibration, but that should not be done for the safety switches.

In a safety switch, the operation point is critical, but often the return point is not that relevant and may not even need to be calibrated.

 

How to calibrate pressure switches

Now, let’s (finally!) discuss how to calibrate pressure switches.

 

Preparations & safety

If the switch is installed in the process, it is very important to make sure it is isolated from the pressure line. You also need to make sure to disconnect any circuit that the switch is controlling - you don’t want big valves to start opening/closing, or pumps to start operating, nor generate a safety alarm.

Some switches may have mains voltage, or another dangerous voltage, across the switch terminals when they open, so make sure that it is isolated.

 

Pressure ramp

To calibrate a pressure switch you need to provide a slowly changing pressure ramp that moves across the operating points of the switch. Depending on the switch type, you first need to supply a suitable pressure to start the calibration.

Often you can start from atmospheric pressure, but in some cases, you need to pump a high pressure and start slowly decreasing the pressure towards the operation point. Or you may need to provide a vacuum to start from. This depends on the switch to be calibrated.

There are different ways to provide the input pressure. You can use a calibration hand pump with a fine adjustment control, you may use shop air supply with a precise pressure controller, or you can use an automatic pressure controller.

It is vital to provide a slow pressure ramp so that you can see the precise pressure at which the switch operated. If the pressure changes too quickly, you cannot accurately capture the pressure at the moment the switch operated.

Certainly, some tools (like the Beamex MC6) can automatically capture the exact pressure at the very moment the switch changes its status.

Anyhow, remember to change the pressure very slowly when you are approaching the operation points of the switch! You may change the pressure faster when you are not yet close to the operation points.

 

Measuring the switch output

You need a tool to measure the switch terminals. If it is a dry switch, with an open/closed output, you may use an ohmmeter. If the output is electrical, you need a tool that can measure that output - in some cases a voltage meter or a current meter. For electrical outputs, it is sometimes a bit difficult to work out how to measure the output, but in any case you should be able to recognize the two states of the output and see when the state changes.

With some tools, you can program a trigger level that suits the switch in question which enables the status change to be captured automatically. This is how the Beamex MC6 works.

 

Capturing the operation points

In the switch calibration, you need to capture the input pressure at the very moment when the output state changes.

You can try to capture the input pressure manually: when the switch state changes, you stop the ramp and read the input pressure on the device/calibrator that is measuring it. Most likely there is some delay in your reflexes, so the pressure is already different from what it was at the moment the switch operated. That is the main reason to provide a very slow pressure ramp, so that the pressure has not changed much during that delay.

Some devices can capture the input pressure automatically at the very same moment when the switch output changes its state. Needless to say, the Beamex MC6 family of calibrators can do that… :-) 

The MC6 can interpolate between pressure readings. Let me explain: a digital pressure measurement device samples the pressure a few times every second. The switch may operate in between two consecutive pressure readings. In that case, the MC6 looks at the time stamp of the switch operation and interpolates between the two consecutive pressure readings to get the pressure value at the moment the switch operated.
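To make the idea concrete, here is a minimal sketch of that kind of timestamp-based linear interpolation - my own illustration, not Beamex’s actual implementation:

```python
def pressure_at_switch_event(t_event, t1, p1, t2, p2):
    """Linearly interpolate the pressure at the moment the switch toggled.

    (t1, p1) and (t2, p2) are the timestamped pressure readings taken just
    before and just after the switch event at time t_event (seconds)."""
    if t2 == t1:              # identical timestamps -> nothing to interpolate
        return p1
    fraction = (t_event - t1) / (t2 - t1)
    return p1 + fraction * (p2 - p1)

# Example: readings of 9.42 psi at t = 10.00 s and 9.58 psi at t = 10.25 s,
# switch toggled at t = 10.10 s -> about 9.48 psi
print(pressure_at_switch_event(10.10, 10.00, 9.42, 10.25, 9.58))
```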

 

Delayed output

Some industrial switches may have a delay added to the output so that they do not react too quickly. You should find out whether your switch has a delay, because the calibration then needs to be done even more slowly than normal.

With some added delay, by the time the output toggles, the input pressure is already far away from the point that actually triggered the output to toggle.

 

Steps in pressure switch calibration:

Here’s a condensed list of steps in pressure switch calibration:

  1. Depressurize and disconnect for safety.
  2. Connect the pressure source and the pressure calibrator to the switch input.
  3. Connect the device measuring the switch output status.
  4. Exercise the switch a few times - pump to full pressure and back to zero (not with safety switches!).
  5. Pump pressure at normal speed until close to the operation point.
  6. Change the pressure very slowly across the operation point until the switch output toggles. Record the operation pressure.
  7. Change the pressure very slowly towards the return point until the switch output toggles back. Record the return pressure.
  8. Repeat the two previous steps the required number of times.
  9. Vent the pressure.
  10. Disconnect the test equipment.
  11. Return the switch to service.

 

Naturally, you need to document the switch calibration results.

Also, you need to calculate the errors found in the calibration and compare them to the maximum allowed tolerance for that switch to see whether it passed or failed. If the switch failed the calibration, you need to either adjust or replace it. Even if it passes, you should still analyze how big the error was: if it was close to the tolerance limit, or if the switch had drifted a lot since the last calibration, it is good to adjust it to avoid a failed result at the next calibration.
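As a minimal sketch of that error and pass/fail calculation (assuming the error is simply the found operating pressure minus the nominal set point; the names and numbers are illustrative only):

```python
def switch_calibration_result(nominal_setpoint, found_operate_pressure, tolerance):
    """Error of the operating point and a Pass/Fail verdict against the
    maximum allowed tolerance. All values in the same pressure unit."""
    error = found_operate_pressure - nominal_setpoint
    verdict = "Pass" if abs(error) <= tolerance else "Fail"
    return error, verdict

# Switch should operate at 10.0 psi, tolerance +/- 0.2 psi, calibration found
# it operating at 10.15 psi -> error of about +0.15 psi, Pass
print(switch_calibration_result(10.0, 10.15, 0.2))
```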

And as with every calibration, you should consider, based on the calibration result history, whether the calibration interval should be changed. You don’t want to waste resources by calibrating too often, but you also don’t want to calibrate so seldom that you get a failed calibration result. A failed calibration should always trigger an investigation of the consequences, which can be expensive and labor intensive.

 

More discussions on how often instruments should be calibrated can be found in this blog post:

 

And discussions on Fail and Pass calibration can be found here:

 

Documentation, metrological traceability, calibration uncertainty

As documentation is included in the formal definition of calibration, it is a vital part of every calibration. This also applies to pressure switch calibration. Typically, the documentation takes the form of a calibration certificate.

The calibration equipment used should have valid metrological traceability to the relevant standards; otherwise the calibration does not provide traceability for the switch being calibrated. More info on metrological traceability can be found here:

Calibration uncertainty is a vital part of every calibration. If the calibration equipment (and the calibration method and process used) is not accurate enough for the pressure switch being calibrated, the calibration does not make much sense. After all, what’s the point of using a 2 % accurate calibrator to calibrate a 1 % accurate instrument?

 

Learn more about calibration uncertainty here:

 

We also have one older blog post that includes a short video on pressure switch calibration here:

 

Download this article

Click the below picture to download this article as a free pdf file:

Pressure switch calibration - Beamex blog post

 

 

Pressure Calibration eLearning

Free eLearning course on industrial pressure calibration.

Master pressure calibration with this free comprehensive eLearning course from Beamex. Deepen your knowledge, pass the quiz, and earn your certificate!

Read more and enroll >

 

 

Beamex solution for pressure switch calibration

As you would guess, Beamex offers solutions for pressure switch calibration. 

Our MC6 family of calibrators can perform documented pressure switch calibrations, either semi-automatically with a calibration pump, or fully automatically with a pressure controller.

You can upload the pressure switch calibration results from calibrator to calibration management software for paperless documentation. 

Please contact us to learn more:

Contact us

 

 

Topics: Pressure calibration, Pressure switch

Temperature Calibration [eBook]

Posted by Heikki Laurila on Feb 19, 2020

Beamex calibration essentials - temperature ebook

 

In this blog post we want to share with you an educational eBook focusing on temperature calibration and other temperature related topics.

Some of these articles have already been posted earlier on the Beamex blog, but now several temperature-related resources have been collected into one handy, free eBook.

Just give me the free eBook now! >>

 

Contents of the eBook

The eBook contains the following articles:

  • Uncertainty components of a temperature calibration using a dry block  (Page 4) 
  • Pt100 temperature sensor — useful things to know (Page 13) 
  • Thermocouple Cold (Reference) Junction Compensation (Page 21)
  • Temperature units and temperature unit conversion (Page 27)
  • How to calibrate temperature sensors (Page 31)
  • AMS2750E heat treatment standard and calibration (Page 37)
  • Optimal testing parameters for process instrument calibration (Page 45)

 

Download the free temperature eBook here!

 

Abstracts of the articles

Here are short abstracts of each article included in the eBook:

Uncertainty components of a temperature calibration using a dry block

In this article, we cover the different uncertainty components that you should consider when you make a temperature calibration using a temperature dry block.

Making a temperature calibration using a dry block seems like a pretty simple and straightforward thing to do; however, there are many possible sources of uncertainty and error that should be considered.

Often the biggest uncertainties may come from the procedure on how the calibration is done, not necessarily from the specifications of the components.

 

Pt100 temperature sensor — useful things to know

Pt100 temperature sensors are very common in the process industry. This article discusses many useful and practical things to know about Pt100 sensors. There’s information on RTD and PRT sensors, different Pt100 mechanical structures, the temperature-resistance relationship, temperature coefficients, accuracy classes, and much more.

 

Thermocouple Cold (Reference) Junction Compensation

Even people who work a lot with thermocouples don’t always realize how thermocouples - and especially the cold (reference) junction - work, and they can therefore make errors in measurement and calibration.

In this article, we take a short look at the thermocouple cold junction and cold junction compensation. To be able to discuss the cold junction, we first need to take a short look at thermocouple theory and how a thermocouple works.

We won’t go very deep into the theory, but will stick to practical considerations - the kind of things you should know when you work with thermocouple measurements and calibrations in a typical process plant.

 

Temperature units and temperature unit conversion

This article discusses temperature, temperature scales, temperature units and temperature unit conversions. Let’s first take a short look at what temperature really is, then look at some of the most common temperature units, and finally at the conversions between them.

 

How to calibrate temperature sensors

Every temperature measurement loop has a temperature sensor as its first component. So, it all starts with a temperature sensor. The temperature sensor plays a vital role in the accuracy of the whole temperature measurement loop.

Like any measurement instrument you want to be accurate, the temperature sensor needs to be calibrated regularly. Why would you measure temperature if you don’t care about the accuracy?

In this article, we will take a look at how to calibrate temperature sensors and what are the most common things you should consider when calibrating temperature sensors.

 

AMS2750E heat treatment standard and calibration

In this article, we take a look at the AMS2750E standard, with a special focus on the requirements it sets for accuracy, calibration and test/calibration equipment.

The AMS2750E is predominantly designed for heat treatment in the aerospace industries. Heat treatment is an essential process for many critical parts of an airplane, so it is understandable that there are tight regulations and audit processes set.

While the results and success of some other industrial processes can be relatively easily measured after the process, this is not the case in a heat treatment process. Therefore, very tight control and documentation of the heat treatment process is essential to assure the quality of the end products.

 

Optimal testing parameters for process instrument calibration

Most calibration technicians follow long-established procedures at their facility that have not evolved with instrumentation technology. Years ago, maintaining a performance specification of ±1% of span was difficult, but today’s instrumentation can easily exceed that level on an annual basis. In some instances, technicians are using old test equipment that does not meet new technology specifications.

This paper focuses on establishing baseline performance testing, where testing parameters (mainly tolerances, intervals and test point schemes) can be analyzed and adjusted for optimal performance. Risk considerations will also be discussed, including regulatory, safety, quality, efficiency, downtime and other critical parameters.

A good understanding of these variables will help in making the best decisions on how to calibrate plant process instrumentation and how to improve outdated practices.

 

Download the free temperature eBook here!

 

New temperature calibrator Beamex MC6-T

If you work with temperature calibration, please check out our latest temperature calibrator Beamex MC6-T.

Click the below picture to learn more:

Beamex MC6-T temperature calibrator

 

Links to the individual blog articles

Here are links to the individual blog articles:

 

 

Topics: Temperature calibration

Calibration Trends Featuring Automation & Digitalization [Webinar]

Posted by Heikki Laurila on Jan 23, 2020

Calibration Trends Featuring Automation & Digitalization - Beamex blog post

In this blog post, I am proud to share a recent webinar collaboration with ISA (International Society of Automation) titled "Calibration Trends Featuring Automation & Digitalization."

Calibration automation and digital data capture have long been trends, but effectively combining these approaches to generate the most benefits has recently become a best practice in many process plants. 

Watch this webinar to learn how advanced technology gives you the ability to digitalize your calibration data and standardize your calibration processes to achieve benefits such as confidence in your data integrity, improved plant reliability and increased efficiency.

Check out the below table of contents and jump to what is interesting for you!

Click here to watch the webinar now >>

 

Table of contents:

0:00 (min:sec)

  • Introduction to the webinar

1:10             

  • Introduction of the presenters

3:45             

  • A brief history of calibration automation

4:45             

  • Presentation of the agenda

6:00             

  • Emergence of Digitalization in Calibration Processes
  • Terminology
  • Where do we need digitalization?
  • Industry 4.0
  • Change of Production Process
  • Digital Twins
  • Automated and Predictive Maintenance

18:40           

  • Digitalization in Calibration Workflow
  • Calibration: Paper vs. Digital
  • Calibration workflow – integrated system

22:35           

  • Demo – paperless calibration of a temperature transmitter

31:05           

  • Questions & Answers

38:50           

  • DCC – Digital Calibration certificate
  • Digital Infrastructure for Calibration Process

47:00           

  • Calibration KPI’s

52:40           

  • Demo – paperless calibration of a pressure transmitter

59:10           

  • Conclusions

1:03:05        

  • Questions & Answers

 

Watch the webinar

Watch the webinar now by clicking the picture below:

New Call-to-action

 

Want to learn more? 

Check out these related resources:

 

Beamex solution for Automation & Digitalization

We offer an Integrated Calibration Solution that is a combination of software, calibration hardware and calibration expertise, delivering an automated and paperless/digitalized flow of calibration data.

Please visit our website to learn more about our Integrated Calibration Solution.

 

 

Topics: Webinar, Digitalization

Pressure Transmitter Accuracy Specifications – the small print

Posted by Heikki Laurila on Dec 03, 2019

Pressure transmitter accuracy specifications - Beamex blog post

 

Pressure transmitters are widely used in the process industry. The advertised accuracy specifications of modern pressure transmitters have become better and better.

However, the advertised accuracy specification often tells only part of the truth: it includes only some of the accuracy components affecting the total accuracy you can expect from the transmitter in practice in your application.

In this blog post, I will examine some popular pressure transmitters’ accuracy specifications and the different accuracy components, such as the effects of re-ranging, ambient temperature, mounting position, static pressure, long-term drift, vibration, power supply and more.

I will briefly explain what these components are and what they mean, with a few examples.

Background

We see “number games” being played with some transmitters’ specifications, where they advertise an accuracy number that is just part of the truth, i.e. it is just one of the many accuracy components that you should take into account. In some cases, these advertisements can be confusing and give the wrong impression of the total practical accuracy you will get in your application.

Maybe the competition and race for the best accuracy numbers have led to this situation, where some manufacturers quote a “limited” accuracy figure on the cover of the brochure and on their website, while the full specifications are found only in the user manual.

Typically, a pressure transmitter’s specifications include several accuracy components that you should take into account when considering the total accuracy.

As mentioned, this blog post reviews some popular pressure transmitters’ specifications to give you an idea of the important factors you should take into account and be aware of. I will also list some typical specification numbers for the different partial accuracy components. I am by no means trying to put down or disparage any transmitter.

As the transmitter accuracy affects the required accuracy of your calibration equipment, we also get these accuracy questions from customers. Certainly, the calibrator should be more accurate than the transmitter you calibrate with it, but the required accuracy ratio between the two is something different people have different opinions on. In any case, you should be aware of the total uncertainty of the calibration and document it during the calibration.

In any case, the tolerance for your process transmitter should be based on the process requirements, not on the specifications of the transmitter installed in that location.

Time to dive into it…

 

Pressure transmitter accuracy components

 

“Reference accuracy”

Often there is a separate “limited” accuracy statement mentioned, typically on the cover of the brochure, or on the website.

This may be called “reference accuracy” or something similar, and it includes only some of the accuracy components, not all of them - for example, only linearity, hysteresis and repeatability.

This “best-case accuracy” does not include all the practical accuracy components you should consider (mounting position, ambient temperature, etc.). So, don’t think that this specification is what you can expect in practice from the transmitter when you install it in your process.

This “best-case accuracy” may be for example 0.04 % or even 0.025 % of range, for the most accurate pressure ranges for the most accurate transmitters.

 

Different pressure ranges

Often the best (reference) accuracy is valid only for certain pressure ranges, not for all the ranges available. It may also vary with the pressure type, i.e. an absolute range may be specified differently than a gauge range.

While the best ranges can have, say even a 0.04 % of range accuracy, some other range of that same transmitter model may have, for example, a 0.1 % accuracy.

Accuracy specifications may be doubled or tripled for the different pressure ranges available.  So, make sure you know what the accuracy is for the exact pressure ranges/models that you are using.

 

Re-ranging

HART (smart) transmitters can be re-ranged over a wide ratio. Often you can re-range a transmitter with a turndown ratio of 100:1 or even more. Accuracy specifications are commonly given for the full range, or for a limited turndown ratio.

If the HART transmitter (with a mA output) is re-ranged to a smaller range than the full range, that typically worsens the accuracy. So, if you re-range your transmitter to a smaller range than the maximum range, make sure you find out whether, and by how much, that adds error to the accuracy.
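To illustrate why re-ranging matters - using a purely hypothetical specification of the form “base accuracy plus a turndown-dependent term”, not any real vendor’s formula - the added error could be evaluated like this:

```python
def turndown_accuracy_pct(upper_range_limit, calibrated_span,
                          base_accuracy_pct=0.04, turndown_coeff_pct=0.01):
    """Hypothetical accuracy spec: base + coeff * (turndown - 1), in % of
    calibrated span. All numbers are illustrative, not from a datasheet."""
    turndown = upper_range_limit / calibrated_span
    return base_accuracy_pct + turndown_coeff_pct * (turndown - 1)

# Used at full range (turndown 1:1) -> 0.04 % of span
print(turndown_accuracy_pct(100, 100))
# Re-ranged 100:1 (e.g. 0...1 bar on a 0...100 bar sensor) -> about 1.03 % of span
print(turndown_accuracy_pct(100, 1))
```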

 

Ambient temperature effect

Most pressure transmitters are used in varying environmental conditions in the processes. Also, the temperature of the pressure media may vary widely during usage.

As with most measurement devices, pressure transmitters typically have some kind of temperature coefficient, i.e. there is an accuracy component that depends on the environmental temperature.

The temperature dependency often seems to be specified in a format that is pretty difficult to understand. Try to work through it, and ask the supplier if you can’t figure it out.

In any case, looking at different transmitters, this effect may vary from, say, 0.01 % of range up to 0.5 % of range. The worst models seem to specify the temperature effect as being more than 1 % of range.

If the temperature in your process varies a lot, you should take this into account.

 

Static (line) pressure effect

Differential pressure transmitters can be used under static line pressure conditions. This means that both inputs have a certain pressure applied and the transmitter measures the difference between the two inputs - as opposed to a gauge transmitter, which measures pressure against atmospheric pressure, or an absolute transmitter, which measures pressure against full vacuum.

An ideal differential transmitter would measure only the difference between the inputs, but in practice, the common-mode static line pressure has some effect on the output.

If you have both inputs open to atmospheric pressure, the differential pressure is naturally zero. Likewise, if you have the same pressure (say 50 bar/psi) applied to both inputs, the differential pressure is still zero. In practice, that static pressure has some effect on the transmitter output, so the output changes a little when the line pressure changes.

Typically, the line pressure effect can go from 0.025 % of range up to 0.4 % of range, depending on the transmitter model.

Commonly, the line pressure mainly shifts the zero of the transmitter and does not make a significant change to the span. So, in calibration, you can test this effect by applying the same pressure (first a low pressure, then a high pressure) to both inputs and seeing how much the zero changes.
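A minimal sketch of that check, assuming you record the DP reading with the same static pressure on both inputs at a low and then at a high line pressure (the helper function and numbers are illustrative only):

```python
def zero_shift_pct_of_range(zero_at_low_line, zero_at_high_line, span):
    """Static-pressure zero shift expressed in % of calibrated span.

    The two zero readings are taken with the same static pressure applied to
    both inputs, first at a low and then at a high line pressure; ideally
    both readings would be exactly zero."""
    return (zero_at_high_line - zero_at_low_line) / span * 100.0

# Example: a 0...500 mbar DP transmitter reads +0.1 mbar at atmospheric line
# pressure and +0.9 mbar at 50 bar line pressure -> about 0.16 % of span
print(zero_shift_pct_of_range(0.1, 0.9, 500.0))
```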

Line pressure may also have some effect on the span of the transmitter, which makes it far more difficult to handle and to calibrate, as it requires a differential pressure standard for the calibration.

 

Long term stability

All measurement devices slowly lose their accuracy over time - some more, some less. That also goes for pressure transmitters.

Some pressure transmitters have a 1-year stability specified, some have even a 5- or 10-year specification, or even longer.

For example, a transmitter that has a reference accuracy of 0.04% of range can have 1-year stability of 0.2% of range. Some other models have a similar 0.2 % of range level of specification valid for 5 or even 10 years.

The best one I found was as low as 0.01 % of range as a 1-year stability.

Depending on how often you re-calibrate your pressure transmitters, you should consider the long-term stability effect, as the transmitter may drift that much before the next recalibration (and possible trim).

 

Mounting position (orientation) effect

The mounting position typically has some effect on the accuracy of the pressure transmitter. Most pressure transmitters have a specification for the mounting position.

Typically, a change in the orientation shifts the zero and does not affect the span accuracy. In practice, the orientation of the transmitter does not change during normal usage. The orientation should, however, be considered if you first calibrate the transmitter in a workshop and then install it in the process, or if you remove the transmitter from the process for recalibration.

Certainly, if a transmitter has a remote seal, the position of the capillary tubes will have a big effect on the zero value. Again, this is not something that changes during normal usage, but it may affect the calibration if the transmitter is removed from its installed location.

 

Vibration effect

Many pressure transmitters have a specification for the effect of vibration.

Naturally, this needs to be considered only if the transmitter is installed in a vibrating location.

The vibration effect on accuracy is often relatively small and can be specified, for example, as being “less than 0.1 % of range.”

 

Power supply effect

A 2-wire transmitter needs an external power supply to work. Typically, the power supply is a 24 VDC supply.

Transmitters can commonly work on a wide supply voltage range, going even down to 10 VDC.

However, if the supply voltage changes during operation, it can have a small effect on the accuracy of the transmitter. The effect of the power supply voltage is typically small and may be specified as, for instance, “smaller than 0.01 % of span per 1 V change.”

In practice, if you have a normal good power supply, this is not an issue.

 

Total accuracy specification

Some transmitters have a kind of “total accuracy” specification that includes several of the common accuracy components. This can include the earlier-mentioned “reference accuracy” plus the ambient temperature effect and the static/line pressure effect. This kind of total accuracy figure is more useful, as it gets closer to the real accuracy you can expect from the transmitter.

As an example, the “total accuracy” specification can be 0.14 % of range, while the reference is 0.04 %.

So as soon as you include the temperature and line pressure effects, the reference accuracy gets multiplied by a factor of 3 to 4.

Another example model offers a 0.075 % of range reference accuracy; when the temperature effect is included it rises to 0.2 %, and when static pressure effects are also included it goes up to 0.3 % of range.

If the transmitter has this kind of “total” accuracy specification, it helps you get a more realistic picture of the accuracy you can expect in practice - even though that “total” accuracy is often still missing some of the accuracy components listed here.

 

Contamination in usage

When a pressure transmitter is used in a process to measure pressure, there is a big risk that the transmitter’s membrane gets contaminated by the pressure media or some dirt. This kind of contamination can have a huge effect on the transmitter’s accuracy.

This is, of course, not something that can be specified, but it is nevertheless a real risk in normal use - especially if you decide to have a very long recalibration period, such as several years. So, in addition to the transmitter’s long-term drift specification, this should be considered in the risk analysis.

If the transmitter gets very dirty and starts to measure significantly wrong, you will normally see that in the measurement results. But if it starts to measure only slightly wrong, it is difficult to notice in normal usage.

 

Best-case and worst-case examples

When you add up all the above listed different accuracy specifications, you come to the real total accuracy specification you can expect in practice.

Generally, when you combine independent uncertainty components, the common rule is to use the “root sum of the squares” (RSS) method. Just adding all components together as a straight sum would be a worst-case scenario; statistically, it is not very likely that all components will be at their maximum in the same direction at the same time. Therefore, the statistical RSS method is used.
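As a minimal sketch of the RSS method (the component values below are illustrative only, not taken from any specific datasheet):

```python
from math import sqrt

def rss(components_pct):
    """Root-sum-of-squares combination of independent accuracy components,
    all expressed in % of range."""
    return sqrt(sum(c ** 2 for c in components_pct))

# Illustrative components: reference accuracy, ambient temperature effect,
# static pressure effect and one-year drift (all in % of range)
components = [0.04, 0.10, 0.05, 0.10]
print(f"Straight sum (worst case): {sum(components):.2f} % of range")  # 0.29
print(f"RSS combination:           {rss(components):.2f} % of range")  # ~0.16
```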

To get a best-case summary, we should take all the smallest accuracy components and neglect the ones that may not be relevant.

For the worst-case scenario, we should take all the accuracy components as their max and assume they are all present.

 

Best-case accuracy

To get the best-case accuracy, the following assumptions were used:

  • Pick the best reference accuracy
  • Choose the most accurate model and range
  • Don’t do any re-ranging -> no effect on accuracy
  • Use the transmitter in a limited temperature range, close to ambient temperature. Pick the smallest available temperature effect.
  • Assume no static/line pressure effect (used for gauge measurement) -> no effect.
  • Assume no vibration effect -> no effect
  • Assume a good power supply -> no effect
  • Include a one-year drift

After reviewing the specifications for several different transmitters, it seems that the smallest combined accuracy I can find takes me down to around 0.15 % of range. For most other models it seems that the best case is around double that, so around 0.3 % of range at best.

There are also many models where the best-case figure is larger (i.e. worse) than this.

 

Worst-case accuracy

To find the worst-case accuracy, the following assumptions were used:

  • Pick a model/pressure range with the largest (worst) accuracy spec
  • Assume some re-ranging happening
  • Use the range with bigger temperature effect
  • Assume static/line pressure being used
  • Assume a small vibration effect
  • Assume a small power supply effect
  • Include a one-year drift

Again, looking at the different specifications, it seems that adding up these worst-case accuracy components we end up somewhere around 1 % to 1.5 % of range, even with the most accurate transmitters.

But this figure can also go higher with some models.

 

Summary

As mentioned earlier, modern pressure transmitters are very accurate instruments. It is nevertheless good to read the accuracy specifications carefully, including all the different components that affect accuracy. It is easy to miss these and look only at the single accuracy figure, for example the “reference accuracy,” that is shown in marketing and other materials.

The purpose of this post is to raise your awareness of the different things that affect the total accuracy you can expect in practice.

Of course, the same goes for all measurement equipment, not only pressure transmitters. It is always good to read all the specifications, including the footnotes with small print.

I hope you found this article useful.

 

Beamex solution for pressure calibration

Beamex offers different solutions for pressure calibration, including calibrating pressure transmitters.

Please check out our offering here: Pressure Calibrators.

 

Related blog posts

If you found this article interesting, you might want to check these posts too:

 

 

Topics: Pressure calibration, Transmitter, calibration uncertainty

How calibration improves plant sustainability

Posted by Heikki Laurila on Oct 07, 2019

How calibration improves plant sustainability - Beamex blog post

You only live once: A common phrase used around the world to indicate how one should live their life to the fullest. This is a great concept for individuals to take advantage of the life they have been given, but to assure life for the future, resources and the environment should be taken into consideration in the process. In 1969, sustainability was introduced with the passage of the National Environmental Policy Act (NEPA), and has been an important topic ever since. Sustainable Plant Magazine defines sustainability as, “Operating our business in ways that meet the needs of the present without compromising the world we leave to the future.”

Social, economic, and environmental effects are the three pillars often used to define and measure sustainability.  Calibration plays a critical role impacting these pillars to help maintain sustainability throughout plants. Calibrating process instruments on a regular basis aids in optimizing processes to minimize production downtime, ensure product quality and secure plant reliability. For many plants, calibration is also a critical activity in controlling emissions, as emission-related instruments are often associated with the plant’s license to operate.

The pillars of sustainability

Although social effects are hard to quantify and measure, they still play an important role in maintaining sustainability. Safety is one social factor that tends to be more quantifiable than others: Evident across many industries, plants often display the number of days without injury. Employee safety is a social factor that is every plant's responsibility.

A plant’s overall health and performance is important in protecting not only employees, but the community too. The community may not be directly impacted by an on-the-job injury; however, poor maintenance and operations can lead to harmful impacts on the community, such as toxic gas emissions, out-of-spec products, or worst-case scenarios such as an explosion or a product-quality failure that leads to injury or death.

Another social factor is the working and living conditions of the employees and community. Working conditions could include working hours, industrial noise, plant temperature, and harmful toxin release. In some cases, employees are required to live where they work; an oil platform is a good example where social sustainability becomes even more important. Social sustainability is important in maintaining industrial performance for the future.

Economic sustainability in plant operations includes using available resources to increase performance with positive returns on investment and overall plant profit. Economic impacts are typically measured monetarily. If the return on investment is desirable, the plant can consider the resources justifiable. For example, if a calibration software solution helps monitor the overall instrument health of a plant, which prevents unplanned shutdowns (that could cost millions of dollars), it is considered an economic solution to help maintain sustainability.

If available resources are not being used, the plant may not be sustainable in the future, especially in competitive markets. In many of those situations, plant personnel do not understand what types of sustainable solutions are available and whether they are right for a particular situation. Fortunately, many solution providers offer sustainability and return-on-investment reports to help distinguish sustainable solutions.

Although economic sustainability involves increasing plant profit by using available resources, the environment should not be compromised in the process. For example, if cheaper raw materials exist that improve overall profit but create harmful and toxic waste that compromises the environment, that solution is not considered sustainable. Environmental conditions must be considered in sustainable solutions.

Sustainability initiatives, regardless of the positive impacts on the social and economic pillars, all depend on the impact made on the environment, because ultimately the future depends on today’s efforts to maintain a livable environment. Environmental initiatives could include many different projects to decrease the negative effects on the natural resources available today. One such project is developing paperless environments, or digitalizing data, not only to save trees and reduce waste but also to create a more economical solution that decreases the time spent performing work. Other projects include the design and construction of green buildings that use less energy and water, manufacturing process modifications that reduce the greenhouse gas emissions that damage the atmosphere, and the restoration of parts of the environment that have been destroyed in the past, such as greenery and natural streams and rivers.

Different governmental agencies and acts, such as the EPA, OSHA, and NEPA, have set regulations to help advance sustainability initiatives that promote a positive influence on the social, economic, and environmental aspects, underlining how important sustainability is to ensuring a future for this planet.

Impact of calibration on plant sustainability

Process instrumentation exists to monitor how much, how high, how little, and how often contents are being used to create a product. Calibrating process instrumentation supports the social, economic, and environmental pillars of sustainability. As mentioned above, social sustainability includes the safety of the employees and community. For example, toxic gas emissions are monitored by process instrumentation, which must be calibrated and the results documented to ensure the accurate readings required by regulatory agencies such as the EPA and OSHA.

Using calibration software can help improve plant sustainability in many ways. For instance, calibration software can remind plant personnel when instruments are due for calibration, reducing the chance that an instrument is overlooked, drifts out of tolerance, and lets the process emit harmful chemicals into the atmosphere, which could harm the environment and endanger the community. Calibration helps to ensure the proper function, reliability, and accuracy of instrumentation.

An automated calibration program can integrate with maintenance management systems (MMS) to help increase quality and decrease the time and money spent on calibration, especially when compared to manual systems such as pen and paper. Many plants receive work orders from the MMS prompting them to perform calibration. Traditionally, results were written down with pen and paper and then entered into several databases: once into a calibration database and once into the MMS. This manual process can take hours of work and is prone to errors, while calibration software saves a considerable number of man-hours and enhances data integrity. Streamlined calibration processes have fast returns on investment and secure plant profit by catching potential failures before they cause unplanned shutdowns, thus making a plant more sustainable for the future.

Beamex solution

Not only does calibration promote sustainability, but Beamex calibration solutions are manufactured with sustainability in mind as well. Beamex’s product development and production teams have received training on the environmental impact of product design. Beamex products are also designed to have a long operating life – typically a customer uses a Beamex calibrator for over ten years. This minimizes the waste generated from the products.

The Beamex production process follows the Waste Electrical and Electronic Equipment (WEEE) directive 2002/96/EC, which sets collection, recycling and recovery targets for electrical goods and is part of a European Union legislative initiative to address the problem of huge amounts of toxic electronic waste. Beamex also takes into consideration the EcoVadis and ISO 14001 environmental standards in its ISO 9001 quality system.

 

References

Larson, Keith. “Why Sustainability Now?” Sustainable Plant. Putnam Media. 2013. Web. 26 March 2013. <http://www.sustainableplant.com/about-us/

 

Topics: sustainability

How to calibrate temperature sensors

Posted by Heikki Laurila on Aug 27, 2019

How to calibrate temperature sensors - Beamex blog post

 

Temperature measurement is one of the most common measurements in the process industry.

Every temperature measurement loop has a temperature sensor as the first component in the loop. So, it all starts with a temperature sensor. The temperature sensor plays a vital role in the accuracy of the whole temperature measurement loop.

As with any measurement instrument that you want to be accurate, a temperature sensor needs to be calibrated regularly. Why would you measure temperature if you don’t care about the accuracy?

In this blog post, I will take a look at how to calibrate temperature sensors and the most common things you should consider when doing so.

Download this article as free pdf file

 

Before we get into details, here is a short video on how to calibrate a temperature sensor:

 

What is a temperature sensor?

Let's start from the basics... discussing what a temperature sensor is: 

As the name indicates, a temperature sensor is an instrument that can be used to measure temperature. It has an output signal proportional to the applied temperature. When the temperature of the sensor changes, the output will also change accordingly.

There are various kinds of temperature sensors that have different output signals. Some have a resistance output, some have a voltage signal, some have a digital signal and many more.

In practice, in industrial applications, the signal from a temperature sensor is typically connected to a temperature transmitter, which converts the signal into a format that is easier to transfer over longer distances to the control system (DCS, SCADA). The standard 4 to 20 mA signal has been used for decades, as a current signal can be transferred over long distances and the current does not change even if there is some resistance along the wires. Nowadays, transmitters with digital or even wireless signals are being adopted.

Anyhow, to measure temperature, the measuring element that is used is the temperature sensor.

 

Measuring the temperature sensor output

As most temperature sensors have an electrical output, that output obviously needs to be measured somehow. In other words, you need a measurement device to measure the output, for example resistance or voltage.

The measurement device often displays an electrical quantity (resistance, voltage), not temperature. So it is necessary to know how to convert that electrical signal into a temperature value.

Most standard temperature sensors have international standards that specify how to calculate the electrical/temperature conversion, using a table or a formula. If you have a non-standard sensor, you may need to get that information from the sensor manufacturer.

There are also measuring devices that can display the temperature sensor signal directly as temperature. These devices also measure the electrical signal (resistance, voltage) and have the sensor tables (or polynomials/formulas) programmed inside, so they convert it  into temperature. For example, temperature calibrators typically support the most common RTD (resistance temperature detector) and thermocouple (T/C) sensors used in the process industry.
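As an illustration of such a conversion, below is a minimal Python sketch for a standard Pt100 RTD using the IEC 60751 Callendar-Van Dusen coefficients. It is valid for 0 °C and above; below 0 °C the equation has an additional term and the inverse is normally solved iteratively or with a separate polynomial.

```python
from math import sqrt

# IEC 60751 Callendar-Van Dusen coefficients for a standard Pt100 (alpha = 0.00385)
R0 = 100.0      # resistance at 0 °C, in ohms
A = 3.9083e-3
B = -5.775e-7

def pt100_resistance(t_c):
    """Resistance (ohm) of a standard Pt100 at temperature t_c (°C), for 0 °C and above."""
    return R0 * (1 + A * t_c + B * t_c ** 2)

def pt100_temperature(r_ohm):
    """Temperature (°C) from a measured Pt100 resistance, for 0 °C and above.
    Solves R = R0 * (1 + A*t + B*t^2) for t with the quadratic formula."""
    return (-A + sqrt(A ** 2 - 4 * B * (1 - r_ohm / R0))) / (2 * B)

print(pt100_resistance(100.0))    # about 138.5 ohms at 100 °C
print(pt100_temperature(138.51))  # about 100 °C
```

A calibrator or measuring device with the sensor type selected does exactly this kind of conversion internally, so you see the reading directly in temperature units.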

 

So how to calibrate a temperature sensor?

Before we go into the various things to consider when calibrating a temperature sensor, let’s take a look at the general principle.

First, since the temperature sensor measures temperature, you need a known temperature to immerse the sensor in to calibrate it. It is not possible to “simulate” temperature; instead, you must create a real temperature using a temperature source.

You can either generate an accurate temperature, or you can use a calibrated reference temperature sensor to measure the generated temperature. For example, you may insert the reference sensor and the sensor to be calibrated into a liquid bath (preferably a stirred one) and you can perform calibration at that temperature point. Alternatively, a so called dry-block temperature source can be used.

As an example, using a stirred ice-bath provides pretty good accuracy for the 0 °C (32°F) point calibration.

For industrial and professional calibration, typically temperature baths or dry-blocks are used. These can be programmed to heat or cool the temperature into a certain set point.

In some industrial applications, it is a common practice to replace temperature sensors on regular intervals and not to calibrate the sensors regularly.

 

How to calibrate temperature sensors – things to consider

Let’s start digging into the actual calibration of temperature sensors and the different things to consider.

 

1 - Handling temperature sensor

Different sensors have different mechanical structures and different mechanical robustness.

The most accurate SPRT (standard platinum resistance thermometer) sensors, used as reference sensors in temperature laboratories, are very fragile. Our temperature calibration laboratory people say that if an SPRT touches something hard enough that you can hear any sound, the sensor must be checked before any further use.

Luckily, most industrial temperature sensors are robust and will survive normal handling. Some industrial sensors are made very robust and can therefore withstand pretty rough handling.

But if you are not sure of the structure of the sensor you are about to calibrate, it is better to be safe than sorry.

It’s never wrong to handle any sensor as if it were an SPRT.

In addition to mechanical shocks, a very fast change in temperature can be a shock to the sensor and damage it or affect its accuracy.

Thermocouples are typically not as sensitive as RTD probes.

 

2 - Preparations

There are normally not that many preparations, but there are some things to take into account. First, a visual inspection is performed to check that the sensor has not been bent or damaged and that the wires are in good condition.

External contamination can be an issue, so it is good to know where the sensor has been used and what kind of media it has been measuring. You may need to clean the sensor before calibration, especially if you plan to use a liquid bath for calibration.

The insulation resistance of an RTD sensor can be measured prior to calibration. This is to make sure that the sensor is not damaged and that the insulation between the sensor and the chassis is high enough. A drop in insulation resistance can cause measurement errors and is a sign of sensor damage.

 

3 - Temperature source

As mentioned, you need to have a temperature source to calibrate a temperature sensor. It is just not possible to simulate temperature.

For industrial purposes, a temperature dry-block is most commonly used. It is handy and portable and typically accurate enough.

For higher accuracy needs, a liquid bath can be used. A bath is typically not easily portable, but it works well in laboratory conditions.

For the zero Centigrade point, a stirred ice-bath is often used. It is pretty simple and affordable, yet provides good accuracy for the zero point.

For the most accurate temperatures, fixed-point cells are used. They are very accurate but also very expensive, and are mostly used in accurate (and accredited) temperature calibration laboratories.

 

4 - Reference temperature sensor

The temperature is generated with one of the heat sources mentioned in the previous section. You obviously need to know the temperature of the heat source with a very high degree of accuracy. Dry-blocks and liquid baths have an internal reference sensor that measures the temperature, but for more accurate results you should use a separate, accurate reference temperature sensor that is inserted into the same temperature source as the sensor(s) to be calibrated. That kind of reference sensor will more accurately measure the temperature that the sensor to be calibrated is actually exposed to.

Naturally, the reference sensor should have a valid, traceable calibration. It is easier to send a reference sensor out for calibration than to send the whole temperature source (although it is good to keep in mind the temperature gradients of the temperature block if you only ever have the reference sensor calibrated, not the block itself).

As for thermodynamic characteristics, the reference sensor should be as similar as possible compared to the sensor to be calibrated, to ensure they behave the same way during temperature changes.

The reference sensor and the sensor to be calibrated should be immersed to the same depth in the temperature source. Typically, all sensors are immersed to the bottom of a dry-block. With very short sensors this gets more difficult, as they only reach a limited depth into the temperature source, and you should make sure that your reference sensor is immersed equally deep. In some cases, this requires a dedicated short reference sensor.

With fixed-point cells, you don’t need a reference sensor, because the temperature is based on a physical phenomenon and is very accurate by nature.

 

5 - Measuring the temperature sensor output signal

Most temperature sensors have an electrical output (resistance or voltage) that needs to be measured and converted to temperature. So, you need some device for the measurement. Some temperature sources also offer measurement channels for the sensors, both the device under test (DUT) and the reference.

If you measure the electrical output, you will need to convert that into temperature, using international standards. In most industrial cases, you will use a measurement device that can do the conversion for you, so you can see the signal conveniently in the temperature unit (Centigrade or Fahrenheit).

Whatever means you use for the measurement, make sure you know the accuracy and uncertainty of the device and ensure it has a valid, traceable calibration.

 

6 - Immersion depth

Immersion depth (how deep you insert the sensor into the temperature source) is one important consideration when calibrating temperature sensors.

Our temperature calibration lab people gave this rule of thumb when using a stirred liquid bath:

  • 1% accuracy - immerse 5 diameters + length of the sensing element
  • 0.01% accuracy - immerse 10 diameters + length of the sensing element
  • 0.0001% accuracy - immerse 15 diameters + length of the sensing element

Heat conduction in a stirred liquid bath is better than in a dry-block and the required immersion depth is smaller.

For dry-blocks, there is a Euramet recommendation to immerse the sensor to a depth of 15 times its diameter plus the length of the sensing element. So, if you have a 6 mm diameter sensor with a 40 mm element inside, you immerse it 130 mm (15 × 6 mm + 40 mm).
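The rule of thumb is easy to turn into a quick sanity check. The sketch below uses the Euramet dry-block factor of 15 as the default; the factors 5, 10 and 15 correspond to the liquid-bath rules of thumb listed above.

```python
def required_immersion_mm(sensor_diameter_mm, element_length_mm, diameter_factor=15):
    """Required immersion depth: N sensor diameters plus the length of the sensing element.
    Use diameter_factor=15 for a dry-block (Euramet recommendation);
    5, 10 or 15 correspond to the liquid-bath rules of thumb above."""
    return diameter_factor * sensor_diameter_mm + element_length_mm

# The example from the text: a 6 mm diameter sensor with a 40 mm sensing element in a dry-block
print(required_immersion_mm(6, 40))  # 130 mm
```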

Sometimes it is difficult to know how long the actual element is inside the sensor, but it should be mentioned in the sensor specifications.

Also, you should be aware of where the sensor element is located (it is not always in the very tip of the sensor).

The sensor to be calibrated and the reference sensor should be immersed to the same depth, so that the midpoints of the actual sensing elements are at the same depth.

Naturally with very short sensors, it is not possible to immerse them very deep. That is one reason for the high uncertainty when calibrating short sensors.

 

7 - Stabilization

Remember that a temperature sensor always measures its own temperature!

Temperature changes pretty slowly, and you should always wait long enough for all parts to stabilize at the target temperature. When you insert the sensor into a temperature source, it will always take some time before the sensor has reached that temperature and stabilized.

Your reference sensor and the sensor to be calibrated (DUT) may have very different thermodynamic characteristics, especially if they are mechanically different.

Often one of the biggest uncertainties related to temperature calibration can be that the calibration is done too quickly.

If you most often calibrate similar kinds of sensors, it is wise to perform some type tests to learn the behavior of those sensors.
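One practical way to avoid calibrating too quickly is to wait until successive readings stay within a small band before recording a calibration point. Below is a minimal sketch of such a stability check; the window length and band are arbitrary example values, not recommendations from any standard.

```python
def is_stable(readings, window=10, band=0.02):
    """Return True when the last `window` readings all lie within `band` (°C) of each other.

    `readings` is a list of temperature readings taken at regular intervals.
    The window and band here are only placeholders; choose them based on type
    tests of your own sensors and temperature source.
    """
    if len(readings) < window:
        return False
    recent = readings[-window:]
    return max(recent) - min(recent) <= band

# Example: readings still creeping upwards, so the point is not stable yet
print(is_stable([149.80, 149.86, 149.91, 149.95, 149.97, 149.98, 149.99, 150.00, 150.00, 150.00]))
```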

 

8 - Temperature sensor handle

The sensor handle part, or the transition junction, typically has a limit on how hot it can get. If it gets too hot, the sensor may be damaged. Make sure you know the specifications of the sensors you calibrate.

If you calibrate in high temperatures, it is recommended to use a temperature shield to protect the sensor handle.

 

9 - Calibrated temperature range

With temperature sensors, it is pretty common that you don’t calibrate the whole temperature range of the sensor.

The very top of the range is something you should be careful about calibrating. For example, an RTD sensor can drift permanently if you calibrate it at too high a temperature.

Also, the coldest points of the sensor’s temperature range can be difficult/expensive to calibrate.

So, it is recommended to calibrate the temperature range that the sensor is going to be used in.

 

10 - Calibration points

In industrial calibration, you need to pick enough calibration points to see that the sensor is linear. Often it is enough to calibrate 3 to 5 points throughout the range.

Depending on the sensor type, you may need to take more points, if you know that the sensor may not be linear.

If you calibrate platinum sensors and plan to calculate coefficients based on the calibration results, you will need to calibrate at suitable temperature points to be able to calculate the coefficients. The most common coefficients for platinum sensors are the ITS-90 and Callendar-Van Dusen coefficients. For thermistors, Steinhart-Hart coefficients can be used.
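As an illustration of how such coefficients could be derived in the simple case of a platinum sensor above 0 °C, the sketch below fits a quadratic to measured temperature/resistance pairs with NumPy and converts the polynomial coefficients into the R0, A and B of the Callendar-Van Dusen equation. The calibration data is invented for the example.

```python
import numpy as np

# Invented calibration results for a Pt100: (temperature in °C, measured resistance in ohm)
data = [(0.0, 100.02), (50.0, 119.42), (100.0, 138.56), (150.0, 157.35), (200.0, 175.89)]
t = np.array([point[0] for point in data])
r = np.array([point[1] for point in data])

# Above 0 °C the Callendar-Van Dusen equation is R(t) = R0 * (1 + A*t + B*t^2),
# which is a plain quadratic in t, so a least-squares polynomial fit is sufficient.
c2, c1, c0 = np.polyfit(t, r, 2)

R0 = c0
A = c1 / R0
B = c2 / R0
print(f"R0 = {R0:.4f} ohm, A = {A:.6e}, B = {B:.6e}")
```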

When sensors are calibrated in an accredited laboratory, the points may also be selected based on the lab’s smallest uncertainty.

 

11 - Adjusting / trimming a temperature sensor

Unfortunately, most temperature sensors cannot be adjusted or trimmed. So, if you find an error in calibration, you cannot adjust it out. Instead, you need to use coefficients to correct the sensor’s reading.

In some cases, you can compensate for the sensor error in other parts of the temperature measurement loop (in the transmitter or in the DCS).

 

Other things to consider

Documentation

As with any calibration, the temperature sensor calibration needs to be documented in a calibration certificate.

 

Traceability

In calibration, the reference standard used must have valid traceability to national standards, or equivalent. The traceability should be an unbroken chain of calibrations, each having stated uncertainties.

For more information on metrological traceability, please see the blog post Metrological Traceability in Calibration – Are you traceable?

 

Uncertainty

As always in calibration, including temperature sensor calibration, you should be aware of the total uncertainty of the calibration process. In temperature calibration, the calibration process (the way you do the calibration) can easily be by far the biggest component of the total uncertainty.

For more information on calibration uncertainty, please see the blog post Calibration uncertainty for dummies.

 

Automating the calibration

Temperature calibration is always a pretty slow operation, since temperature changes slowly and you need to wait for stabilization. You can benefit a lot if you can automate your temperature calibrations. The calibration will still take a long time, but if it is automated, you don't need to stand there waiting for it.

This will naturally save time and money for you.

Also, when the calibration is automated, you can be sure it is always done the same way.

 

Download free white paper

Click the picture below to download this article as a free pdf file:

How to calibrate temperature sensors - Beamex blog post

 

Other related blogs

If you found this blog post interesting, you may also like the ones listed below. Please feel free to browse all the articles in the Beamex blog; you may find some other interesting reads.

 

Beamex solutions for temperature calibration

Please check out the new Beamex MC6-T temperature calibrator, which is a perfect tool for temperature sensor calibration and much more. Click the picture below to read more:

Beamex MC6-T temperature calibrator

Please check also what else Beamex can offer you for temperature calibration or for temperature calibration services.

 

Temperature Calibration eLearning

Free eLearning course on industrial temperature calibration.

Master temperature calibration with this free comprehensive eLearning course from Beamex. Deepen your knowledge, pass the quiz, and earn your certificate!

Read more and enroll >

 

Temperature Sensor Calculator

A free tool to easily convert between temperature and electrical signals for thermocouples and RTD sensors. 

https://www.beamex.com/resources/temperature-sensor-calculator/

 

Thanks to our accredited temperature calibration laboratory personnel for their help with this article. Special thanks to Mr. Toni Alatalo, the head of our accredited temperature laboratory!

 

Topics: Temperature calibration

Why use calibration software?

Posted by Chuck Boyd on Jul 24, 2019

The shortest answer is to automate documentation to save time, lower risks, and quickly analyze data to make better decisions.

Beamex CMX with Beamex MC6 and bMobile

Most process plants have some sort of system in place for managing instrument calibration operations and data. However, just like airport security, the systems and processes can be very different even within the same company across different plants. Methods often differ greatly in terms of cost, quality, efficiency, and accuracy of data and the level of automation.

If you are manually writing results on paper or even manually entering data electronically, you’re spending about half your time on paperwork.  Using a documenting calibrator to automatically transfer test data to calibration software designed for the task can decrease the amount of time spent on calibration in many cases by up to 75%.

If you’re thinking about leaving the paper documentation lifestyle, using calibration software should be the ultimate goal.  On your way there, you could store results in a spreadsheet or generic database.  That will get you paperless, but it won’t realize all the benefits. The risk of human error and compromised data integrity will still be high and data entry will be time consuming.  It won’t automate updating calibration due dates like software designed for the job.  Here’s a secret you may not know—many people still write down calibrations on paper. They think that they are the only ones and are usually embarrassed at the thought and hesitant to reach out for help. If this is you, know you are not alone.  Start by reading this blog post and asking for help!

Paper-based systems

Beamex integrated calibration diagram

Traditionally, engineers and technicians used pen and paper to record calibration results while out in the field. On returning to the shop, notes are tidied up or transferred to another paper document, after which they are archived as paper documents. Inherent in managing hundreds or even thousands of pieces of paper is the eventual misplaced, lost or damaged document.

While using a manual, paper-based system requires little or no investment, it is very labor-intensive and means that historical trend analysis becomes very difficult to carry out.

In addition, the calibration data is not easily accessible. The system is time consuming, soaks up a lot of resources and typing errors are commonplace. Dual effort and re-keying of calibration data are also significant costs here.

In-house legacy systems (spreadsheets, databases, etc.)

Although certainly a step in the right direction, using an in-house legacy system to manage calibrations has its drawbacks. In these systems, calibration data is typically entered manually into a spreadsheet or database. The data is stored in electronic format, but the recording of calibration information is still time-consuming and typing errors are common. Also, the calibration process itself cannot be automated. For example, automatic alarms cannot be set up on instruments that are due for calibration.

Calibration module of a maintenance management software

Some use the calibration module of their maintenance management software for calibration management. Plant hierarchy and work orders can be stored in it, but the calibration cannot be automated because the system is not able to communicate with ‘smart’ calibrators.

Furthermore, these software packages are not designed to manage calibrations, so they often provide only minimal calibration functionality, such as the scheduling of tasks and entry of calibration results. Although instrument data can be stored and managed efficiently in the plant’s database, the level of automation is still low. In addition, the system may not meet the regulatory requirements (e.g. FDA or EPA) for managing calibration records.

Calibration Software 

With calibration software, users are provided with an easy-to-use Windows Explorer-like interface. The software manages and stores all instrument and calibration data. This includes the planning and scheduling of calibration work; analysis and optimization of calibration frequency; production of reports, certificates and labels; communication with smart calibrators; and easy integration with maintenance management systems such as SAP and Maximo. The result is a streamlined, automated calibration process, which improves quality, plant productivity, safety and efficiency.

Calibration software is the most advanced solution available to support and guide calibration management activities. To understand how software can help you better manage process plant instrument calibrations, it is important to consider the typical calibration management tasks that companies undertake. There are five main areas: planning and decision-making, organization, execution, documentation, and analysis.

Planning and decision-making

All plant instruments and measurement devices should be listed, then classified into ‘critical’ and ‘non-critical’ devices. Once these have been set up, the calibration ranges and required tolerances should be identified. Decisions then need to be made regarding the calibration interval for each instrument. The creation and approval of standard operating procedures (SOPs) for each device should be defined, followed by the selection of suitable calibration methods and tools for execution of these methods. Finally, the current calibration status for every instrument across the plant should be identified.

Organization

The next stage, organization, involves training the company’s calibration staff – typically maintenance technicians, service engineers, process and quality engineers and managers – in using the chosen tools and how to follow the approved SOPs. Resources should be made available and assigned to actually carry out the scheduled calibration tasks.

Execution

The execution stage involves supervising the assigned calibration tasks. Staff carrying out these activities must follow the appropriate instructions before calibrating the device, including any associated safety procedures. The calibration is then executed according to the plan, although further instructions may need to be followed after calibration.

The documentation and storage of calibration results typically involves electronically signing or approving all calibration records generated.

Based on the results, analysis should be performed to determine if any corrective action needs to be taken. The effectiveness of calibration needs to be reviewed and calibration intervals checked. These intervals may need to be adjusted based on archived calibration history. If, for example, a sensor drifts out of its specification range, the consequences could be disastrous for the plant, resulting in costly production downtime, a safety problem or leading to batches of inferior quality goods being produced, which may then have to be scrapped.

Documentation

Documentation is a very important part of a calibration management process. Many regulatory agencies and auditors require that records are maintained and that calibrations are carried out according to written, approved procedures. Without explicit documentation proving the traceability of the measurement standards used, the result, by definition, cannot be considered a calibration.

An instrument engineer can spend as much as 50% of his or her time on documentation and paperwork – time that could be better spent on other value-added activities. This paperwork typically involves preparing calibration instructions to help field engineers; making notes of calibration results in the field; and documenting and archiving calibration data.

Imagine how long and difficult a task this is if the plant has thousands of instruments that require calibration at least every six months. The amount of manual documentation grows almost exponentially!

Any type of paper-based calibration system will be prone to human error. Noting down calibration results by hand in the field and then transferring these results into a spreadsheet back at the office may seem archaic, but many firms still do this. However, with regulatory agencies requiring data integrity procedures, many are turning digital. Furthermore, analysis of paper-based systems and spreadsheets can be almost impossible, let alone time consuming.

Analysis

Calibration history trend

(Example of automatically generated history trend above)

Using specialist calibration management software enables faster, easier, and more accurate analysis of calibration records and identification of historical trends.

Plants can therefore reduce costs and optimize calibration intervals by reducing calibration frequency when this is possible, or by increasing the frequency where necessary.

For example, for improved safety, a process plant may find it necessary to increase the calibration frequency of some sensors that are located in a hazardous, potentially explosive area of the manufacturing plant.

Just as important, by analyzing the calibration history of a flow meter that is located in a ‘non-critical’ area of the plant, the company may be able to decrease the frequency of calibration, saving time and resources. Rather than rely on the manufacturer’s recommendation for calibration intervals, the plant may be able to extend these intervals by looking closely at historical trends provided by calibration management software. Instrument ‘drift’ can be monitored closely over a period of time and then decisions made confidently with respect to amending the calibration interval.
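As an illustration of the kind of trend analysis the software automates, here is a minimal Python sketch that fits a linear drift rate to the errors found in past calibrations and estimates how long it would take to reach the tolerance limit. The calibration history and tolerance are invented for the example.

```python
import numpy as np

# Invented calibration history: (months since installation, found error as % of span)
history = [(0, 0.02), (12, 0.07), (24, 0.11), (36, 0.18)]
tolerance = 0.5  # % of span

months = np.array([point[0] for point in history])
errors = np.array([point[1] for point in history])

# Fit a straight line: error = drift_rate * months + offset
drift_rate, offset = np.polyfit(months, errors, 1)
months_to_limit = (tolerance - offset) / drift_rate

print(f"Estimated drift rate: {drift_rate:.4f} % of span per month")
print(f"Estimated time to reach the {tolerance} % tolerance limit: {months_to_limit:.0f} months")
```

A real analysis would of course consider more history, the uncertainty of each calibration, and the criticality of the instrument, but the principle is the same.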

Benefits of Using Calibration Software

Beamex Calibration Certificate

(Example calibration certificate above)

With software-based calibration management, planning and decision-making are improved. Procedures and calibration strategies can be planned and all calibration assets managed by the software. Instrument and calibrator databases are maintained, while automatic alerts for scheduled calibrations can be set up.

Organization also improves. The system no longer requires pens and paper. Calibration instructions are created using the software to guide engineers through the calibration process. These instructions can also be downloaded to a technician’s handheld documenting calibrator while he is in the field.

Execution is more efficient and errors are eliminated. Using software-based calibration management systems in conjunction with documenting calibrators means that calibration results can be stored in the calibrator’s memory, then automatically uploaded back to the calibration software. There is no re-keying of calibration results from a notebook to a database or spreadsheet. Human error is minimized and engineers are freed up to perform more strategic analysis or other important activities.

Documentation is easier. The software generates reports automatically and all calibration data is stored in one database rather than multiple disparate systems. Calibration certificates, reports and labels can all be printed out on paper or sent in electronic format.

Analysis becomes easier too, enabling engineers to optimize calibration intervals using the software’s trending function. Also, when a plant is being audited, calibration software can facilitate both the preparation and the audit itself. Locating records and verifying that the system works is effortless when compared to traditional calibration record keeping. Regulatory organisations and standards such as FDA and EPA place demanding requirements on the recording of calibration data. Calibration software has many functions that help in meeting these requirements, such as Change Management, Audit Trail and Electronic Signature functions.

 

Business Benefits

For the business, implementing software-based calibration management means overall costs will be reduced. These savings come from fully digitized calibration procedures, now paperless with no manual documentation. Engineers can analyze calibration results to see whether the calibration intervals on plant instruments can be optimized. For example, those instruments that perform better than expected may well justify a reduction in their calibration frequency.

Plant efficiencies should also improve, as the entire calibration process is now streamlined and automated. Manual procedures are replaced with automated, validated processes, which is particularly beneficial if the company is automating a lot of labour-intensive calibration activities. Costly production downtime will also be reduced.

Even if a plant has already implemented maintenance management software, calibration management software can easily be integrated with it. If the plant instruments are already defined in a database, the calibration management software can utilize the records available in that database.

The integration will save time, reduce costs and increase productivity by preventing unnecessary double effort and rekeying of work orders in multiple systems. Integration also enables the plant to extend automated data acquisition to their ERP system with smart calibrators, which simply is not possible with a standalone system.

Beamex Solutions

Beamex’s suite of calibration management software can benefit process plants of all sizes. For relatively small plants, where calibration data is needed for only one location, only a few instruments require calibrating, and regulatory compliance requirements are minimal, Beamex LOGiCAL calibration software may be the most appropriate.

For companies that have a medium to large number of instruments and a lot of calibration work, or strict regulatory compliance requirements, Beamex CMX Professional is ideal. It fulfills the requirements of 21 CFR Part 11 and other relevant regulations for electronic records, electronic signatures, and data integrity. It also offers Mobile Security Plus, which provides enhanced functionality with compatible offline mobile devices, such as the Beamex MC6 family of documenting calibrators and tablets/smartphones with the Beamex bMobile application. This enhancement further lowers the risk of ALCOA violations by identifying those using offline mobile devices by their electronic signature and by protecting the offline data against tampering.

Along with CMX, the Beamex bMobile application allows for paperless execution and documentation of inspection activities in the field. It also works offline, which is ideal where reliable network connections are not available. bMobile also supports Beamex’s Mobile Security Plus technology, a solution to ensure the integrity of calibration data throughout the entire process.

Beamex’s multi-site solution, CMX Enterprise, is suitable for process manufacturers with multiple or global sites, multilingual users, and a very large number of instruments that require calibration. Here, a central calibration management database is often implemented and used by multiple plants across the world.

Please see also our Calibration Software main page. 

Summary

Every type of process plant, regardless of industry sector, can benefit from using calibration management software. Compared to traditional paper-based systems, in-house-built legacy calibration systems, or the calibration modules of maintenance management systems, dedicated calibration management software improves quality, increases productivity, and reduces the costs of the entire calibration process.

Calibration Software key benefits:

  • Better planning and decision-making
  • Easier organization
  • Faster execution
  • Automated documentation
  • Analysis capabilities
  • Cost reduction
  • Quality improvements
  • Increased efficiency

 

Download your copy of the Calibration Essentials Software eBook, to learn more about calibration management and software.

Calibration Essentials- Software eBook

 

Topics: Calibration software

Calibration uncertainty and why technicians need to understand it [Webinar]

Posted by Heikki Laurila on Jun 27, 2019

Calibration uncertainty and why technicians need to understand it - Beamex webinar

In this blog post, I want to share a webinar titled "Calibration uncertainty and why technicians need to understand it." This webinar was a collaboration with Beamex and ISA (International Society of Automation) subject matter experts.

It describes a practical approach to calibration uncertainty and  also provides real-life applications.

The webinar speakers are Beamex's own "dynamic duo," Ned Espy and Roy Tomalino. Both have long careers with Beamex and many years' experience with calibration.

Please click the picture below (under the topic list) to watch the webinar.

To make it easier for you to jump to a relevant part of the webinar, below is a list of the main topics and their start times.

Calibration uncertainty and why technicians need to understand it - content list

 

Watch the webinar by clicking the picture below:

Calibration uncertainty webinar - Beamex

 

More on calibration uncertainty

For additional information on calibration uncertainty, please check out the blog post Calibration uncertainty for Dummies. 

Other webinar blog posts

If you like webinars, here are some related ones we have posted in our blog:

Products used in the webinar demo sessions

In the demo sessions, Roy is using Beamex MC6 Calibrator and Beamex CMX Calibration Management software.

 

 

Topics: calibration uncertainty

How a business analyst connected calibration and asset management [Case Story]

Posted by Heikki Laurila on May 14, 2019

Beamex blog post - Picture of SRP site - How a business analyst connected calibration and asset management [Case Story]

How one of America’s largest public power utilities integrated its asset management software with calibration software to reduce risk and increase efficiency

We haven't shared customer case stories in this blog before, but it is interesting to read what other companies have done and to learn best practices from them. Hopefully you find this story useful. Let's dig into it:

Salt River Project (SRP)

For more than a century, Salt River Project (SRP) has produced power and delivered water to meet the needs of its customers in the greater Phoenix metropolitan area. Today, as one of the nation's largest public power utilities, SRP provides reliable electricity and water to more than 1 million customers and employs around 4,500 people.

Jody Damron

Jody Damron, a Business Analyst at Salt River Project’s corporate headquarters in Tempe, Arizona, has been serving the company for more than 40 years and has helped develop Salt River Project’s calibration processes. Several years ago, he started to investigate the possibility of linking their calibration software, Beamex CMX, to their asset management software, Maximo.

IT Projects

Jody began by researching IT integration projects. He was soon amazed to discover the mind-boggling number of failed projects, costing companies up into the trillions of dollars. He read about major failures where no progress was made, and even situations in which companies were forced to go back to their original way of working after failed attempts. He declared right then and there that “failure is not an option.”

Project Team

Through a preliminary analysis, he concluded that an integration project would require a substantial amount of planning and input from a team of internal departmental experts to ensure that it functioned appropriately for all parties involved. He also knew the external parties, or vendors, would be just as vital to their success.

 

Beamex blog post - How a business analyst connected calibration and asset management - Salt River Project project team

It was important that he put together a quality team (Fig. 1) that he trusted, because he knew he had to rely on everyone’s input and expertise. During this process, he learned important lessons about building a successful team. Jody soon discovered how each party tended to speak different technical languages as well as have different goals and ideology. He determined that communication was going to be the key to success. Jody explains, “the business will say they need an apple cut in 6 pieces and the IT side will hear cut a watermelon in half. Technical, cultural and language communication barriers are real challenges that needed full attention."

He knew they would run into many implementation roadblocks if the team did not work together during the entire process. The team stayed focused on the detailed requirements and met often to review the business expectations.

 

Responsibilities of vendors and customer

As important as it is for the entire project team to understand everyone’s roles and responsibilities to ensure efforts weren’t duplicated or missed altogether, it was also essential to define the roles of the vendors and establish clear operation guidelines. The following chart (Fig. 2) defines responsibilities along with brief descriptions for some of the sector’s key duties:

Beamex blog post - How a business analyst connected calibration and asset management - Salt River Project team roles.

 

  • Business: Data integrity is an important and ongoing process. For SRP, it has never stopped since it first began in 1974. It is a time-consuming but important process – one which can go south in a very short period of time if it is not continually monitored. SRP put a lot of man-hours into ensuring clean data.
  • Beamex CMX calibration software: SRP relied on Beamex’s expertise. Beamex acted as consultants and were quick to communicate how the integration could work most efficiently and made no empty promises.
  • Maximo: The Maximo team worked hand in hand with SRP technicians to meet business expectations and functionality requirements.
  • Integration: It was imperative to make sure the right data was transferred back and forth between systems in the correct manner.

After analyzing all of these factors and gathering information from the project team, risks had to be considered so that Jody could be 100% confident that the integration would be successful. After all, failure was not an option.

 

How it works today

Upon completion of in-depth analysis by the team, Jody determined that the integration could be completed to meet both the business and IT needs. As Jody eloquently puts it, “it’s extremely simple, if you think of how complicated it could be.” 

These are the basic rules used to form SRP’s system:

  1. Beamex CMX is the calibration system of record that stores the detailed calibration information. 
  2. Maximo tracks all plant assets and is the master of scheduling.
  3. As for calibration, the only information Maximo needs is if an instrument passed or failed during the calibration event. 
  4. In Maximo, there are two types of instrument assets. The first type are regular instrument assets that are never calibrated, for example an orifice plate. Secondly, there are calibrate-able assets, for example a transmitter.
  5. For a Maximo asset to be transferred into CMX, the asset has to be defined as a calibrate-able asset. Out of 28,000 instruments, there are 7,700 assets that require calibration and meet the calibrate-able asset criteria.
  6. If a Maximo work order is written or automatically generated by the preventive maintenance application for a calibrate-able asset, it automatically flows into CMX. This is critical because the rules create a built-in method of security that does not allow “garbage” data to be transferred back and forth. This ensures good data integrity for both software platforms. If a work order is not for a calibrate-able asset, it does not go to CMX.
  7. Work orders are generated by a planner. Technicians will paperlessly pick them up and calibrate them. This process allows field personnel to work only within CMX, and they do not deal with work orders in Maximo, saving them time, money and frustration.

 

For example, during a typical unit overhaul, many of the site’s 7,700 calibrate-able instrument assets need to be tested. Work orders are planned, put into progress, the information is automatically transferred to CMX and the technician is alerted by the planner via email. The technician can then download the asset test information to a Beamex MC6 calibrator & communicator and perform the necessary work. Since the MC6 is a multifunction, documenting calibrator, the entire calibration process is automated because the results are stored in the calibrator’s memory. When the technician returns to the shop, they upload results into CMX. When a calibration test passes, an automatic notification is sent back into Maximo that closes the work order and documents who performed the work and when it was done. A failure requires the initiation of a follow up work order.
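Purely as an illustration of the routing and pass/fail rules listed above (this is not Beamex or Maximo code, and all names are hypothetical), the core logic can be summarized in a few lines of Python:

```python
from dataclasses import dataclass

@dataclass
class Asset:
    tag: str
    calibratable: bool  # rules 4 and 5: only calibrate-able assets are transferred to the calibration system

@dataclass
class WorkOrder:
    number: str
    asset: Asset

def route_work_order(order, calibration_queue):
    """Rule 6: only work orders for calibrate-able assets flow into the calibration system."""
    if order.asset.calibratable:
        calibration_queue.append(order)

def report_result(order, passed):
    """Rule 3: the asset management system only needs the pass/fail outcome.
    A pass closes the work order; a failure triggers a follow-up work order."""
    return f"close {order.number}" if passed else f"create follow-up work order for {order.number}"

# Hypothetical example
queue = []
route_work_order(WorkOrder("WO-1001", Asset("FT-101", calibratable=True)), queue)
route_work_order(WorkOrder("WO-1002", Asset("ORIFICE-7", calibratable=False)), queue)
print(len(queue))                     # 1: only the calibrate-able asset's work order was routed
print(report_result(queue[0], True))  # close WO-1001
```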

 

Summary and the results

The most significant impact overall is that Salt River Project has been able to save about 30 minutes per calibration using the automated approach. This equates to up to 1,000 man-hours in the unit overhaul example cited above. Further savings are anticipated as analysis of calibration history confirms where extended calibration intervals can be recommended. It is important to note that SRP’s work order history for calibration is 100% automated and technicians never work in Maximo. Other major benefits of the automated calibration system include:

  • System oversight has been minimized.
  • Audits are easy to perform and are less stressful.
  • Defined calibration procedures provide a corporate “best practices” approach to calibration.
  • Better decision making because of accurate data.

In the simplest terms, the new Beamex/Maximo calibration system gives back time to the people working in the field. As a result, as Jody explains, “With this software integration project, we were able to realize a significant return on investment during the first unit overhaul. It’s unusual, since ROI on software projects is usually nonexistent at first.”

 

Download a pdf 

Download a pdf of this story >> 

 

Read more case stories

To read more case stories like this, click the link below:

Read more case stories >>

 

Read more about Beamex CMX Calibration Management Software

 

Topics: Calibration software, Case Story

Optimal Calibration Parameters for Process Instrumentation

Posted by Ned Espy on Apr 24, 2019

Optimal Calibration Parameters for Process Instrumentation

Many calibration technicians follow long-established procedures at their facility that have not evolved with instrumentation technology. Years ago, maintaining a performance specification of ±1% of span was difficult, but today’s instrumentation can easily exceed that level on an annual basis. In some instances, technicians are using old test equipment that does not meet new technology specifications. This article focuses on establishing baseline performance testing so that calibration parameters (mainly tolerances, intervals, and test point schemes) can be analyzed and adjusted for optimal performance. Risk considerations will also be discussed – regulatory, safety, quality, efficiency, downtime, and other critical parameters. A good understanding of these variables will help in making the best decisions on how to calibrate plant process instrumentation and how to improve outdated practices.

 

Introduction

A short introduction to the topics discussed in this post: 

How often to calibrate?

The most basic question facing plant calibration professionals is how often a process instrument should be calibrated. There is no simple answer, as there are many variables that affect instrument performance and thereby the proper calibration interval. These include:

  • Manufacturer’s guidelines (a good place to start)
  • Manufacturer’s accuracy specifications
  • Stability specification (short term vs. long term)
  • Process accuracy requirements
  • Typical ambient conditions (harsh vs. climate controlled)
  • Regulatory or quality standards requirements
  • Costs associated with a failed condition

Pass/Fail tolerance

The next question for a good calibration program is what is the “Pass/Fail” tolerance? Again, there is no simple answer and opinions vary widely with little regard for what is truly needed to operate a facility safely while producing a quality product at the best efficiency. A criticality analysis of the instrument would be a good place to start. However, tolerance is intimately related to the first question of calibration frequency. A “tight” tolerance may require more frequent testing with a very accurate test standard, while a less critical measurement that uses a very accurate instrument may not require calibration for several years. 

Calibration procedures

Another question to be answered is how best to determine and implement proper calibration procedures and practices. In most cases, methods at a particular site have not evolved over time. Many times, calibration technicians follow practices that were set up many years ago, and it is not uncommon to hear, “this is the way we have always done it.” Meanwhile, measurement technology continues to improve and is becoming more accurate. It is also getting more complex – why test a fieldbus transmitter with the same approach as a pneumatic transmitter? Performing the standard five-point, up-down test with an error of less than 1% or 2% of span does not always apply to today’s more sophisticated applications. As measurement technology improves, so should the practices and procedures of the calibration technician. 

Finding the optimum...

Finally, plant management needs to understand that the tighter the tolerance, the more it will cost to make an accurate measurement. It is a fact that all instruments drift to some degree. It should also be noted that every make/model of instrument has a unique “personality” for performance in a specific process application. The only true way to determine optimum calibration parameters is to record calibrations in a way that allows performance and drift to be analyzed. With good data and test equipment, the lowest practical tolerance can be maintained while balancing it against an optimum schedule. Once these parameters are established, the associated costs to perform a calibration can be estimated to see if there is justification to purchase a more sophisticated instrument with better performance specifications, or more accurate test equipment, in order to achieve even better process performance.

Download this article as a free pdf file by clicking the picture below:

Optimal Calibration Parameters for Process Instrumentation - Beamex white paper

 

Calibration basics 

Optimum calibration interval

Determining a proper calibration interval is an educated guess based on several factors. A best practice is to set a conservative interval based on what the impact of a failure would be in terms of operating in a safe manner while producing product at the highest efficiency and quality. It is also important to review calibration methods and determine the best practices where there will be a minimal impact on plant operations. By focusing on the most critical instruments first, an optimum schedule can be determined and would allow for less critical testing if personnel have availability.

Since all instrumentation drifts regardless of make/model/technology, and suppliers publish vastly different specifications, it is difficult to compare performance. There are often several complicating footnotes written in less than coherent terminology. Instrument performance is not always driven by price. The only true way to determine an optimum interval is to collect data and evaluate drift for a specific make/model of instrument over time.

Starting off with a conservative interval, a clear drift pattern may appear after 3 tests. For example, a particular RTD transmitter is tested every three months. The second test indicates a maximum error drift of +0.065% of span. The third test indicates another +0.060% of span (+0.125% of span over 6 months). While more data should be used for analysis, a good guess is that this instrument drifts +0.25% per year. Statistically, more data equates to a higher confidence level. If this pattern is common among many of the same make/model RTD transmitters in use throughout the plant, the optimum calibration interval for a ±0.50% of span tolerance could be set between 18 and 24 months with a relatively high level of confidence.
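
As a rough illustration of this arithmetic, the sketch below (plain Python, not part of any Beamex product) assumes simple linear drift, estimates the annual drift rate from the as-found errors above and projects an interval for a given tolerance; the 0.8 safety factor is an arbitrary assumption:

    # Rough sketch, assuming linear drift; all figures come from the RTD
    # transmitter example above and the safety factor is an arbitrary choice.

    def annual_drift_rate(errors_pct_span, interval_months):
        """Drift in % of span per year from successive as-found errors taken
        at a fixed interval with no adjustments in between."""
        total_drift = errors_pct_span[-1] - errors_pct_span[0]
        elapsed_months = interval_months * (len(errors_pct_span) - 1)
        return total_drift * 12.0 / elapsed_months

    def projected_interval_months(tolerance_pct_span, drift_per_year, safety_factor=0.8):
        """Months until the projected drift would consume the tolerance,
        scaled back by a safety factor."""
        return 12.0 * safety_factor * tolerance_pct_span / abs(drift_per_year)

    # As-found errors at 0, 3 and 6 months: 0.000, +0.065 and +0.125 % of span
    drift = annual_drift_rate([0.000, 0.065, 0.125], interval_months=3)
    print(f"Estimated drift: {drift:+.2f} % of span per year")          # about +0.25
    print(f"Suggested interval for a ±0.50 % of span tolerance: "
          f"{projected_interval_months(0.50, drift):.0f} months")       # roughly 19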

When collecting data on calibration, it is a good practice to not make unnecessary adjustments. For example, if the tolerance is ±1% of span and the instrument is only out by -0.25% of span, an adjustment should not be made. How can drift be analyzed (minimum of 3 points) with constant adjustment? For certain “personalities,” not adjusting can be a challenge (people strive for perfection), but note that every time an adjustment is made, drift analysis gets more difficult. In general, a best practice is to avoid adjusting until the error is significant. With a consistent schedule, a trim most likely will be needed on the next calibration cycle and not cause an As Found “Fail” condition. Of course, this may not be possible due to criticality, drift history, erratic scheduling or other factors, but when possible, do not automatically make calibration adjustments. 

What if the drift is inconsistent, both increasing and then decreasing over time? More analysis is required; for instance, are the ambient conditions extreme or constantly changing? Depending on the process application, instrument performance may be affected by media, installation, throughput, turbulence or other variables. This situation indicates there is a level of “noise” associated with drift. When this is the case, analysis should show that there is a combination of random error and systematic error. Random error consists of uncontrollable issues (ambient conditions and process application), whereas systematic error consists of identifiable issues (instrument drift). By focusing on systematic error and/or clear patterns of drift, a proper calibration interval can be set to maximize operating efficiency in the safest manner possible.

For more details, there is a dedicated blog post here: How often should instruments be calibrated? 

 

Setting the proper process tolerance limits 

Accuracy, Process Tolerance, Reject Error, Error Limit, Maximum Permissible Error, Error Allowed, Deviation, etc. – these are a few of the many terms used to specify instrument performance in a given process. Transmitter manufacturers always specify accuracy along with several more parameters associated with error (long term stability, repeatability, hysteresis, reference standard and more). When looking at setting a process tolerance, manufacturer accuracy offers a starting point, but it is not always a reliable number. Also, no measurement is better than the calibration standard used to check an instrument. What is behind a manufacturer’s accuracy statement in making an accurate instrument? For pressure, a good deadweight tester in a laboratory should be part of the formula.

At the plant level, a well-known simplified traditional rule of thumb is to have a 4:1 ratio for the calibrator’s uncertainty (total error) vs. the process instrument tolerance (TAR / TUR ratio).

Instead of using the simplified TAR/TUR ratio, the more contemporary approach is to always calculate the total uncertainty of the calibration process. This includes all the components adding uncertainty to the calibration, not only the reference standard.
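
As a simple numerical illustration of the difference between the two approaches, the sketch below (plain Python; all component values are made-up placeholders, not Beamex specifications) computes a plain TUR and then a root-sum-square combination of several uncertainty components:

    # Minimal sketch: TAR/TUR check vs. a simplified total-uncertainty estimate.
    # The component values below are invented placeholders.
    from math import sqrt

    def tur(process_tolerance, calibration_uncertainty):
        """Test Uncertainty Ratio: process instrument tolerance vs. the
        uncertainty of the calibration (same units, e.g. % of span)."""
        return process_tolerance / calibration_uncertainty

    def total_uncertainty(components):
        """Combine independent uncertainty components by root-sum-square,
        a simplified stand-in for a full uncertainty budget."""
        return sqrt(sum(c ** 2 for c in components))

    # Reference standard alone: 0.1 % of span against a ±0.5 % of span tolerance
    print(f"TUR (reference only) = {tur(0.5, 0.1):.1f}:1")              # 5.0:1

    # Adding other contributors (repeatability, resolution, environment, ...)
    # shows why the reference standard alone understates the uncertainty.
    u_total = total_uncertainty([0.10, 0.05, 0.06, 0.04])
    print(f"Total uncertainty = {u_total:.2f} % of span, "
          f"TUR = {tur(0.5, u_total):.1f}:1")                           # about 3.8:1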

To learn more on the calibration uncertainty, please read the blog post Calibration Uncertainty for Dummies.

When setting a process tolerance, a best practice is to ask the control engineer what process performance tolerance is required to make the best product in the safest way. Keep in mind that the lower the number, the more expensive the calibration may be. To meet a tight tolerance, good (more expensive) calibration standards will be required. Another issue is to determine whether calibration should be performed in the field or in the shop. If instrumentation is drifting, a more frequent interval will need to be set to catch a measurement error. This may mean increased downtime along with the costs associated with making the actual calibration tests. As an example, review the three graphs of instrument performance:

 

Example 1 - Tolerance 0.1% of span:

Graph 1 - tolerance (0.1)

Example 2 - Tolerance 0.25 % of span:

Graph 2 - tolerance (0.25)

Example 3 - Tolerance 1 % of span:

Graph 3 - tolerance (1)

Note that the first graph above shows a failure (nearly double the allowed value), the second shows that an adjustment is required (barely passing) and the third shows a transmitter in relatively good control. The test data is identical for all 3 graphs; the only difference is the tolerance. Setting a very tight tolerance of ±0.1% of span can cause several problems: dealing with failure reports, constant adjustment that adds stress for the calibration technician, operations not trusting the measurement, and more. Graph #2 is not much better; there is no failure, but 0.25% of span is still a tight tolerance and constant adjusting will not allow analysis of drift nor evaluation of random error vs. systematic error. There are many benefits in #3 (note that ±1% of span is still a tight tolerance). If a failure were to occur, it would be an unusual (and likely a serious) issue. The calibration technician will spend less time disturbing the process, and overall calibration time is shorter since there is less adjusting. Good test equipment is available at a reasonable cost that can meet a ±1% of span performance specification.
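
To make the comparison concrete, the sketch below (plain Python) evaluates one assumed as-found error of 0.19% of span, chosen only to reproduce the three outcomes above, against each tolerance; the 50%-of-tolerance adjustment threshold follows the guideline listed in the summary of this article:

    # Illustrative sketch: the same as-found error judged against three tolerances.
    # The 0.19 % of span error is an assumed value, not taken from the graphs.

    def evaluate(max_error_pct_span, tolerance_pct_span, adjust_limit=0.5):
        """Pass/fail verdict, with an adjustment flag once the error exceeds
        a given fraction (here 50 %) of the tolerance."""
        if abs(max_error_pct_span) > tolerance_pct_span:
            return "FAIL"
        if abs(max_error_pct_span) > adjust_limit * tolerance_pct_span:
            return "PASS - adjustment recommended"
        return "PASS"

    max_error = 0.19  # % of span, identical "test data" in all three cases
    for tol in (0.1, 0.25, 1.0):
        print(f"Tolerance ±{tol}% of span: {evaluate(max_error, tol)}")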

There may be critical measurements that require a demanding tolerance and thereby accrue higher costs to support, but good judgements can be made by considering true performance requirements vs. associated costs. Simply choosing an arbitrary number that is unreasonably tight can cause more problems than necessary and can increase the stress level beyond control. The best approach would be to set as high a tolerance as possible, collect some performance data and then decrease the tolerance based on a proper interval to achieve optimum results. 

 

Calibration parameters

A subtle yet important detail is to review calibration procedures to see if further efficiencies can be gained without impacting the quality of the data. Years ago, technology was more mechanical in nature, board components were more numerous and complex, and instruments were more sensitive to ambient conditions. Today’s smart technology offers better accuracy and “brain power” with fewer, simpler components and improved compensation capabilities. In many cases, old testing habits have not evolved with the technology. A good example is an older strain gauge pressure sensor that, when starting from zero, “skews” toward the low side as pressure rises due to the change from a relaxed state. Likewise, when the sensing element is deflected to its maximum pressure and the pressure then decreases, there is a mechanical memory that “skews” the measured pressure toward the high end. This phenomenon is called hysteresis and would graphically resemble the graph below when performing a calibration:

 

5-point Up/Down calibration with hysteresis:

Graph 4 - calibration with hysteresis

 

Today’s smart pressure sensors are much improved, and hysteresis would only occur if something were wrong with the sensor and/or it is dirty or has been damaged. If the same calibration was performed on a modern sensor, the typical graphical representation would look like this:

 

5-point Up/Down calibration with no hysteresis:

Graph 5 - calibration with no hysteresis

 

This may look simple, but it takes significant effort for a calibration technician to perform a manual pressure calibration with a hand pump. Testing at zero is easy, but the typical practice is to spend the effort to hit an exact pressure test point in order to make an error estimate based on the “odd” mA signal. For example, if 25 inH2O is supposed to be exactly 8 mA, but 8.1623 mA is observed when the pressure is set to exactly 25 inH2O, an experienced technician knows they are dealing with roughly a 1% of span error (0.1623 ÷ 16 x 100 ≈ 1%). This extra effort to hit a “cardinal” test point can be time consuming, especially at a very low pressure of 25 inH2O. A 9-point calibration might take 5 minutes or more and makes this example calibration unnecessarily long. It is possible to perform a 5-point calibration and cut the time in half – the graph would look identical, as the downward test points do not add any new information. However, a pressure sensor is still mechanical in nature and, as mentioned, could have hysteresis. A best practice is therefore to perform a 3-point up/down calibration on a pressure transmitter. The quality of the test point data is equivalent to a 9-point calibration and, if there is hysteresis, it will be detected. This also places the least stress on the technician, as there are only 3 “difficult” test points (zero is easy) compared to 4 points for a 5-point calibration and 7 for a 9-point up/down calibration. The savings can be significant over time and will make the technician’s day-to-day work much easier.
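
The error arithmetic in the 8.1623 mA example above fits in a couple of lines; the sketch below (plain Python) simply converts an observed 4-20 mA reading into an error in % of span:

    # Sketch of the error calculation above for a linear 4-20 mA transmitter.

    def error_pct_of_span(measured_ma, ideal_ma, span_ma=16.0):
        """Output error expressed in % of span (4-20 mA span = 16 mA)."""
        return (measured_ma - ideal_ma) / span_ma * 100.0

    # 25 inH2O should give exactly 8 mA, but 8.1623 mA is observed:
    print(f"{error_pct_of_span(8.1623, 8.0):+.2f} % of span")   # about +1 %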

Using this same approach can work for temperature instrumentation as well. A temperature sensor (RTD or thermocouple) is electromechanical in nature and typically does not exhibit hysteresis – whatever happens going up in temperature is repeatable when the temperature is going down. The most common phenomenon is a “zero shift” that is indicative of a thermal shock or physical damage (rough contact in the process or dropped). A temperature transmitter is an electronic device and with modern smart technology exhibits excellent measurement properties. Therefore, a best practice is to perform a simple 3-point calibration on temperature instrumentation. If calibrating a sensor in a dry block or bath, testing more than 3 points is a waste of time unless there is a high accuracy requirement or some other practical reason to calibrate with more points.

There are other examples of optimizing parameters. Calibration should relate to the process; if the process never goes below 100°C, why test at zero? When using a dry block, it can take a very long time to reach a test point of 0°C or below – so why not set an initial test point of, for example, 5°C with an expected output of 4.8 mA, if that is practical, saves time and makes calibrating easier? Another good example is calibrating a differential pressure flow meter with square root extraction. Since a flow rate is being measured, the output test points should be 8 mA, 12 mA, 16 mA and 20 mA, not based on even pressure input steps. Also, this technology employs a “low flow cut-off” where very low flow is not measurable. A best practice is to calibrate at an initial test point of 5.6 mA output (which is very close to zero, at just 1% of the input span).
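
The expected outputs quoted above (4.8 mA at 5°C on an assumed 0-100°C range, and 5.6 mA at 1% of the DP input span) follow directly from the linear and square-root transfer functions; a small sketch in plain Python, with illustrative ranges only:

    # Sketch of how the expected 4-20 mA outputs above are derived. A linear
    # transmitter maps input fraction x to 4 + 16*x mA; a DP flow transmitter
    # with square root extraction maps it to 4 + 16*sqrt(x) mA.
    from math import sqrt

    def linear_output_ma(input_fraction):
        return 4.0 + 16.0 * input_fraction

    def sqrt_output_ma(input_fraction):
        return 4.0 + 16.0 * sqrt(input_fraction)

    def sqrt_input_fraction(output_ma):
        """Inverse: input fraction that produces a given output on a
        square-root-extracting transmitter."""
        return ((output_ma - 4.0) / 16.0) ** 2

    print(f"{linear_output_ma(0.05):.1f} mA")   # 5 C on a 0-100 C range -> 4.8 mA
    print(f"{sqrt_output_ma(0.01):.1f} mA")     # 1 % of DP input span -> 5.6 mA

    # Input pressures (as a fraction of span) behind the even output steps:
    for ma in (5.6, 8, 12, 16, 20):
        print(f"{ma:>4} mA  <-  {sqrt_input_fraction(ma) * 100:5.1f} % of input span")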

Do not overlook how specific calibrations are performed. Why collect unnecessary data? It is simply more information to process and can have a very significant cost. Why make the job of calibration harder? Look at the historical data and make decisions that will simplify work without sacrificing quality.

 

Calibration trend analysis and cost 

Temperature transmitter example

As mentioned, the best way to optimize calibration scheduling is to analyze historical data. There is a balance of process performance vs. instrument drift vs. tolerance vs. optimum interval vs. cost of calibration, and the only way to truly determine this is through historical data review. Using data similar to the temperature transmitter example in the tolerance limits section above, apply the concepts to optimize the calibration schedule with this scenario:

 

Calibration history trend example:

Graph 6 - Calibration history trend 

After a discussion between the Control Engineer and the I&C Maintenance group, a case was made for a tolerance of ±0.5% of span, but it was agreed that ±1% of span was acceptable until more information becomes available. This particular measurement is critical, so it was also agreed to calibrate every 3 months until more information becomes available. At the end of the first year, a drift of approximately +0.25% of span per year was observed and no adjustments were made. After further discussion, it was agreed to lower the tolerance to ±0.5% of span (the Control Engineer is happy) and to increase the interval to 6 months. An adjustment was finally made 1-1/2 years after the initial calibration. At the end of year 2, no adjustment was required and the interval was increased to 1 year. At the end of year 3, an adjustment was made, and the interval was increased to 18 months (now the Plant Manager, the I&C Supervisor and the I&C Technicians are happy). All this occurred without a single failure that might have required special reporting or other headaches.

Obviously this scenario is idealized, but if there are multiple instruments of the same make/model, strong trends will emerge from good historical data, affirming best practices and allowing the best decisions to be made. For critical instrument measurements, most engineers are conservative and “over-calibrate”. This example should open a discussion on how to work smarter, save time and energy, and maintain a safe environment without compromising quality.

 

Cost of calibration

One other best practice is, whenever possible, to establish the cost of performing a given calibration and include this in the decision process. Consider not only the man-hours, but also the cost of calibration equipment, including annual recertification costs. When discussing intervals and tolerances, this can be very important information for making a smart decision. Good measurements cannot be made with marginal calibration equipment. As an example, in order to meet an especially tight pressure measurement tolerance, a deadweight tester should be used instead of a standard pressure calibrator – this is a huge step up in equipment cost and technician experience/training. By outlining all the extra costs associated with such a measurement, a good compromise could be reached by weighing the rewards vs. risks of performing more frequent calibrations with slightly less accurate equipment or by utilizing alternative calibration equipment.
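
As an illustration only, the sketch below (plain Python) builds a rough per-calibration cost figure from labor, equipment depreciation and annual recertification; every rate, price and count in it is an invented placeholder, not a Beamex figure or recommendation:

    # Hypothetical cost comparison; all numbers are invented placeholders.

    def cost_per_calibration(hours, labor_rate, equipment_cost,
                             equipment_life_years, annual_recert_cost,
                             calibrations_per_year):
        """Labor cost plus the equipment's annualized cost spread over the
        number of calibrations it performs per year."""
        labor = hours * labor_rate
        equipment = (equipment_cost / equipment_life_years
                     + annual_recert_cost) / calibrations_per_year
        return labor + equipment

    standard_calibrator = cost_per_calibration(1.0, 80, 8000, 8, 600, 400)
    deadweight_tester = cost_per_calibration(2.5, 80, 25000, 10, 1500, 400)
    print(f"Standard pressure calibrator: ~{standard_calibrator:.0f} per calibration")
    print(f"Deadweight tester:            ~{deadweight_tester:.0f} per calibration")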

Another overlooked operational cost is the critical need to invest in personnel and equipment. With either new technology or new calibration equipment, maintenance and/or calibration procedures should be reinforced with good training. ISA offers several excellent training options, and local programs available to calibration technicians via trade schools or industry seminars are also worth considering. Finally, a review of calibration assets should be done annually to justify reinvestment by replacing old equipment. Annual recertification can be expensive, so when choosing new calibration equipment, look for one device that can possibly replace multiple items.

One other important cost to consider is the cost of failure. What happens when a critical instrument fails? If there are audits or potential shut-down issues, it is imperative to have a good calibration program and catch issues before they begin in order to avoid a lengthy recovery process. If calibration equipment comes back with a failed module, what is the potential impact on all the calibrations performed by that module in the past year? By understanding these risks and associated costs, proper decisions and investments can be made.

 

Conclusion

Obviously, not all instrumentation is going to offer easy analysis to predict drift. Also, calibration schedules get interrupted, and many times work has to be done during an outage regardless of careful planning. In some areas there are regulatory requirements, standards or quality systems that specify how often instruments should be calibrated – it is difficult to argue with auditors. A best practice is to establish a good program, focusing on the most critical instruments. As the critical instruments get under control, time will become available to expand to the next level of criticality, and so on.

Alternate or “hybrid” strategies should be employed in a good calibration management program. For example, loop calibration (performing end-to-end calibrations and only checking individual instruments when the loop is out of tolerance) can lower calibration costs. A good “hybrid” strategy is to combine a “light” calibration schedule with a less frequent “in-depth” calibration. As an example, make a minimally invasive “spot check” (typically one point) against a reduced tolerance (use the recommended 2/3 of the normal tolerance value). Should the “spot check” fail, the standard procedure would be to perform the in-depth calibration and make the necessary adjustments. A technician may have a route of 10 “spot checks” and end up performing only 1 or 2 in-depth calibrations for the entire route. “Spot checks” should still be documented and tracked, as good information about drift can come from this type of data.
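
The spot-check logic described above can be expressed compactly; the sketch below (plain Python) uses the recommended 2/3 reduced tolerance to flag which instruments on a hypothetical route would need a full in-depth calibration (the route errors are invented):

    # Minimal sketch of the "hybrid" spot-check strategy described above.

    SPOT_CHECK_FACTOR = 2.0 / 3.0   # reduced tolerance recommended for spot checks

    def needs_in_depth_calibration(spot_error_pct_span, tolerance_pct_span):
        """True if a one-point spot check exceeds the reduced tolerance."""
        return abs(spot_error_pct_span) > SPOT_CHECK_FACTOR * tolerance_pct_span

    # A route of 10 spot checks (errors in % of span) against a ±1 % tolerance:
    route = [0.12, -0.30, 0.55, 0.71, -0.05, 0.20, -0.68, 0.41, 0.10, -0.22]
    flagged = [i for i, error in enumerate(route, start=1)
               if needs_in_depth_calibration(error, 1.0)]
    print(f"In-depth calibration needed for instruments: {flagged}")   # [4, 7]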

To summarize, several best practices have been cited:

  • Set a conservative calibration interval based on what the impact of a failure would mean in terms of operating in a safe manner while producing product at the highest efficiency and quality.
  • Try not to make adjustments until the error exceeds 50% of the tolerance (e.g. greater than ±0.5% of span for a tolerance of ±1% of span); this may be difficult for a technician striving for perfection, but when unnecessary adjustments are made, drift analysis is compromised.
  • Ask the control engineer what process performance tolerance is required to make the best product in the safest way?
  • Set as high a tolerance as possible, collect some performance data and then decrease the tolerance based on a proper interval to achieve optimum results.
  • Perform a 3-point up/down calibration on a pressure transmitter; the quality of the test point data is equivalent to a 9-point calibration and if there is hysteresis, it will be detected.
  • Perform a simple 3-point calibration on temperature instrumentation. If calibrating a sensor in a dry block or bath, calibrating more than 3 points is a waste of time unless there is a high accuracy requirement or some other practical reason to calibrate with more points.
  • Calibrate a differential pressure flow transmitter with square-root extraction at an initial test point of 5.6 mA output (which is very close to zero at just 1% of the input span). Also, since a flow rate is being measured, the sequential output test points should be 8 mA, 12 mA, 16 mA and 20 mA, not based on even pressure input steps.
  • Whenever possible, try to establish the cost to perform a given calibration and include this in the decision process.
  • Focus on the most critical instruments, establish a good program and as the critical instruments get under control, time will become available to expand to the next level of criticality, and on and on.

Always keep in mind that instruments drift, and some perform better than others. The performance tolerance that is set will ultimately determine the calibration schedule. With good documentation, if systematic error (drift) can be distinguished from random error (“noise”) and a systematic pattern emerges, an optimal calibration interval can be determined. The best tolerance/interval combination will provide good control data for the best efficiency, quality and safety at the lowest calibration cost with minimal audit failures and/or headaches. Establishing best practices for calibration should be a continuous evolution. Technology is changing and calibration should evolve along with it. As discussed, there are many variables that go into proper calibration – by establishing baseline performance, as observed in the operating environment, smart decisions can be made (and modified) to operate at optimal levels when it comes to calibration.

 

Download this article as a free pdf file by clicking the picture below:

Optimal Calibration Parameters for Process Instrumentation - Beamex white paper

 

Beamex CMX Calibration Management software

All graphics in this post have been generated using the Beamex CMX Calibration Management software.

 

Related blog posts

You might find these blog posts interesting:

 

Topics: Calibration process, calibration period, Calibration management

Weighing Scale Calibration Video

Posted by Heikki Laurila on Feb 22, 2019

Scale calibration video - How to calibrate weighing scales - Beamex blog post

In this post we share an educational video on how to calibrate weighing scales.

The video goes through the various tests that should be performed during a scale calibration / scale recalibration.

These tests include:

  • Eccentricity test
  • Repeatability test
  • Weighing test
  • Minimum weight test

This video is based on our earlier blog post on weighing scale calibration. In that post you can find more details and also download the free white paper. To read that post, please click the below link:

Weighing scale calibration - How to calibrate weighing instruments

As with any accurate measuring instrument, weighing scales also need to be calibrated regularly using reference weights that are accurate, calibrated and traceable. In the case of scales, there are a few standards specifying the calibration procedures (such as the EURAMET Calibration Guide, NIST Handbook 44 and the OIML guides).

 

Anyhow, enough talking, please watch the video below:

  

The mobile application used in this video is the Beamex bMobile calibration application, which can be used together with Beamex CMX Calibration Management Software for a completely digital and paperless flow of calibration data.

I hope you find the video useful. As mentioned, for more detailed instructions, please visit our other scale calibration blog post on this topic. 

Please don't be afraid to subscribe to our blog and you will get an email notification when new posts are published (about once per month). The subscribe button can be found in the top right corner.

 

 

 

Topics: weighing scale

Calibration in Times of Digitalization - a new era of production

Posted by Antonio Matamala on Jan 11, 2019

Calibration in Times of Digitalization - a new era of production. Beamex blog post.

This is the first blog post in the Beamex blog series "Calibration in Times of Digitalization", in which we will take a look into the future to give you an understanding of hot topics such as Industry 4.0 and the Smart Factory, which are on everyone's minds. 

You have probably heard of terms such as Digitalization, Industry 4.0 and Smart Factory. But what do these really mean? Many users of these new technologies don't yet fully grasp them. This is completely understandable, and that's why we will gradually bring you closer to these topics in this Beamex blog series and help you enter the future well informed. 

Whether on television, in newspapers or on social media, hardly a day goes by without futuristic topics such as digitalization, big data, artificial intelligence and machine learning. But futuristic? Hasn't the future already arrived?

Download the white paper 

 

What will tomorrow's process industry look like?

Almost everyone today owns a smartphone that, often without the owner being aware of it, is equipped with a variety of sensors and communication technologies. Worldwide, there are now 5.1 billion users of mobile devices and the number is growing at an annual rate of 4%. Whether with a smartphone or directly on your computer, you most likely shop comfortably from your sofa at home and are hardly surprised when the website shows "Customers who bought this product were also interested in this product" underneath the item you have selected. At home, the future seems to have arrived long ago. Behind these platforms hide exactly those technologies that have made your private life easier and more convenient, and these same technologies are currently making their way into the process industry within the framework of Industry 4.0. But now please ask yourself: what does your work environment look like? Are you still working with outdated technologies, or are you now also seeing a wave of modernization on the horizon?

The fact is, many employees are worried about terms such as machine learning, robotics and smart factories and what will happen tomorrow. You may also be afraid your future colleagues will be made of metal and make "blip" noises like in the award-winning Star Wars series. Or you have the idea that in the future you will hardly meet any people on the shop floor. To take away at least some of these worries in advance, the industrial production of tomorrow, and thus the factory of the future, will rely much more on the latest information and communication technologies than it does today. And no, the "factory of the future" cannot be imagined without people. They will even play a very important role. This should be reassuring, and it is, but there is a good chance that things will change in your environment in the future. Because digital technologies in modularly organized plants of the future will make processes flexible, the maintenance of such machines will be equally affected, as will the calibration of the growing number of process sensors that make Industry 4.0 possible in the first place. In other words, the digital factory will automatically lead to digital maintenance. And that could happen faster than you think. 

So you should proactively prepare for a digital future, starting by getting a picture of what will change. The way we work in maintenance today will change, that's for sure! But this much we can tell you in advance: if you work in a calibration environment, your work will gain in importance! 

Exactly which factors will play a role in the future will be explained step by step in this and the following blog posts. It is important that you first understand the technologies and the important interrelationships behind these digital changes, which you have probably already encountered through a wide variety of media channels.

 

Leaving the comfort zone step by step

There are trends that you should accept if you can't stop them. For example, when the first computers came onto the market, the then CEO of one of the leading technology companies made a forecast: "I think there's a world market for maybe five computers." Maybe you're smiling while reading this because the forecast seems completely absurd to you. But at the beginning of the computer industry, nobody really knew where this new technology would take us. The explosion of desktop computing has changed our lives enormously. And even if you think that the role of the computer in your private environment is limited, nothing in our modern society would work without computers. By the way, the same applies to the role of the Internet in our society. Was it not then to be expected that computers and, above all, Internet technologies would sooner or later find their way into the process industry?

In the Industry 4.0 era, production is closely interlinked with information and communication technologies, making it more flexible, efficient and intelligent. There is even talk of batch size 1, which might raise a question mark rather than an "a-ha" moment for you. Well, it's quite simple: customer requirements are changing ever faster and more comprehensively, and customers expect individualized products that meet their requirements, but at prices that only series production can offer. How is that supposed to work? The answer is Industry 4.0. 

Industry 4.0 has set itself this goal and offers a variety of concepts, guidelines and technologies for building new factories or retrofitting existing ones, which, thanks to modular production lines equipped with flexible automation, IT and communication technologies, make it possible for customers to choose from a variety of variants at series production prices. In addition, the interconnection of the value chain extends far beyond the manufacturing company; the entire value chain, including suppliers, is also to be connected horizontally. Connectivity even goes one step further: products that leave the factory should also report their status regularly back to the manufacturer, for example for maintenance purposes.

Calibration in Times of Digitalization - a new era of production. Beamex blog post.

Nevertheless, there are big differences between the time when the already mentioned CEO ventured his forecast of the world computer market and the present day. Although the term Industry 4.0 - meaning the fourth industrial revolution - today causes similar social uncertainties as computers did at that time, it is decisive for the future of the process industry, especially for manufacturing. Where the computer was a fundamentally new technological invention, Industry 4.0 consists of composite technological components, some of which already exist as modules, but interoperability for fast and flexible "plug & play" deployment is still in its infancy. It should be noted that the first three industrial revolutions (1. the steam engine, 2. the assembly line, 3. automation and IT) were only subsequently classified and recognized as revolutions. In contrast, the so-called 4th Industrial Revolution is more like a controlled process, which from today's point of view takes place in the near future and is currently in the process of unfolding.

Sensors as key technology

The fact that Industry 4.0 is more like a controlled process than a wild revolution is of great benefit to many participants, even though it is not possible to say exactly where the journey will lead. What we can predict, however, is that tomorrow's world will be much more digital than it is today, and that will certainly affect your work as process workers, both in production and maintenance. If you are working in calibration, the following may be particularly important to you. For the factory of the future to exist at all, smarter objects (whether machines or end products) will need to be used to orchestrate manufacturing processes according to Industry 4.0 objectives. Objects without sensors are blind and unfeeling: they can neither see how they have to act in connection with other modules, nor report their own condition, for example the need for timely and optimized maintenance to prevent costly system failures, to top-level systems. 

Sensors therefore play not only an important but an essential role in the implementation of Industry 4.0. They form the interface between the digital and the real world. Data generated by these sensors must be correctly interpreted for further processing, and it must always be of excellent quality! Industry 4.0 also means that, in the future, sensors will be used far beyond the actual production processes. They also play a role in upstream, downstream and parallel sub-processes, such as predictive maintenance. One could therefore say that without the right sensors, all higher-level systems are blind, and with incorrect measurement data, wrong decisions are made. What should hardly surprise maintenance staff is that the quality of measurement data depends on professional and timely calibration of the sensors.

In the next blog we continue explaining the role of sensors and other technologies in Industry 4.0 and how these may affect your daily work.

Download the free white paper by clicking the picture below:

Calibration in the times of digitalization - Beamex white paper

 

About Beamex

Beamex has set itself the goal of finding a better way to calibrate together with its customers. This also means taking a leading role in digitalization and Industry 4.0. If we have aroused your interest with this article, we would like to discuss the topic with you. We are very interested in exploring your current calibration processes so that we can provide you with concepts for better calibration in the digital era.
 

 

 

Topics: Calibration, Digitalization, Industry 4.0

Do more with less and generate ROI with an Integrated Calibration Solution

Posted by Heikki Laurila on Nov 19, 2018


Process instrument calibration is just one of the many maintenance-related activities in a process plant. The last thing you want is your limited resources wasting time performing unnecessary calibrations or using time-consuming, ineffective calibration procedures.

Yet you need to make sure that all critical calibrations are completed, so that the site keeps running efficiently with minimal downtime, product quality is maintained, and the plant remains regulatory and safety compliant and audit-ready.

Most often you can’t just go and hire an army of external calibration specialists, so you need to get more done with your existing resources.

In this article, let’s examine what an “Integrated Calibration Solution” is and how it can help you with your challenges – making your calibration process more effective, saving time and money, and improving the quality and integrity of the results. We will also discuss how it can quickly generate a great return on your investment.

If any of that sounds interesting to you, please continue reading …

Or just download the free pdf white paper here: Download the white paper 

Improve the whole calibration process with an Integrated Calibration Solution

It is not enough to just buy some new calibration equipment or calibration software – that alone does not make your calibration process leaner and more effective. Instead, you should analyze all the steps of your calibration process and, with the help of a suitable solution and expertise, find ways to improve the whole process.

Let’s quickly look at a typical calibration process from the beginning to the end and explore how an integrated system could help:

Integrated Calibration Solution - process summary

Typically, work is planned, and work orders are created in the maintenance management system. With an integrated solution, these work orders move automatically and digitally from the maintenance management system to the calibration software. There is no need to print work orders and distribute them manually.

The necessary calibration details are handled by the dedicated calibration software and it sends the work orders to the mobile calibration equipment. Again, this happens digitally.

While the technicians are out in the field performing the calibration activities, the results are automatically stored in the mobile devices, and the users sign off the results using an electronic signature. From the mobile device the results are automatically transferred back to the calibration software for saving and analysis.

Once the work orders are completed, the calibration software automatically sends an acknowledgement to the maintenance management software and work orders are closed.

So, the whole process is paperless and there is no need for manual entry of data at any point. This makes the process far more effective and saves time. It also helps minimize the mistakes typically associated with manual data entry, improving the quality and integrity of the calibration data. Furthermore, calibration results are safely stored and easily accessible in the calibration software for review, for example in case of audits, and for analysis purposes.

As mentioned, improving the calibration process is not just about buying some new equipment or software, but the project should also include improvement of the whole calibration process together with the new tools supporting it. Implementing a new process is a project with a formal implementation plan, ensuring that the new system/process is adopted by the users.

  

The key benefits of an integrated calibration solution

Here are some of the key benefits of an integrated calibration solution:

Improve operation efficiency – do more with less

  • Automate calibrations and calibration documentation. Eliminate all manual entry steps in the calibration process. Use multifunctional tools to carry less equipment in the field and lower equipment life-cycle costs

Save time and reduce costs – get a great ROI

  • With automated processes, get more done in shorter time. Don’t waste time on unnecessary calibrations. Let the data from the system guide you to determine the most important calibrations at appropriate intervals.

Improve quality

  • With electronic documentation, avoid all errors in manual entry, transcriptions and Pass / Fail calculations.

Guide non-experienced users

  • Let the system guide even your non-experienced users to perform like professionals.

Avoid system failures and out-of-tolerance risks

  • Use a calibration system that automatically ensures you meet required tolerance limits, to avoid system downtime and expensive out-of-tolerance situations.

Be compliant

  • Use a system that helps you meet regulations and internal standards of excellence.

Ensure safety

  • Ensure safety of the plant workers, and customers, using a calibration system that helps you navigate through safety critical calibrations.

Safeguard the integrity of calibration data

  • Use a calibration system that ensures the integrity of the calibration data with automatic electronic data storage and transfer and relevant user authorization.

Make audits and data access easy

  • Use a system that makes it easy to locate any record an auditor asks for.

 

What do the users say?

Here are just a few testimonials on what users have said about the Beamex Integrated Calibration Solution:

 

“With the Beamex integrated calibration solution, the plant has experienced a dramatic time savings and implemented a more reliable calibration strategy while realizing a 100% return on investment in the first year.

Using the Beamex tools for pressure calibrations has decreased the time it takes to conduct the calibration procedure itself in the field by over 80%.”

 

“Time is of the essence during an outage and the Beamex Integrated Calibration Solution allows technicians to maximize the amount of work accomplished in the shortest amount of time, while effectively performing vital tasks and managing workflows.”

 

“After the incorporation of Beamex’s integrated calibration solutions, calibrations that would take all day are now performed in a couple hours.”

 

“With this software integration project, we were able to realize a significant return on investment during the first unit overhaul. It’s unusual, since ROI on software projects is usually nonexistent at first.”

 

“After implementing the Beamex CMX calibration management system, GSK will be able to eliminate 21,000 sheets of printed paper on a yearly basis, as the entire flow of data occurs electronically, from measurement to signing and archiving.”

  • GlaxoSmithKline Ltd, Ireland

 

Related posts

If you like this post, you could like these posts too:

 

Check out the Beamex solution

Please check out the link below for the Beamex Integrated Calibration Solution, which is a combination of calibration software, calibration hardware, various services and expertise:

Beamex Integrated Calibration Solution (Global web site)

Beamex Integrated Calibration Solution (USA web site)

 

Download the free white paper by clicking the picture below: 

Download the white paper

 

 

Topics: Calibration process, Calibration management

How to calibrate temperature instruments [Webinar]

Posted by Heikki Laurila on Oct 18, 2018

How to calibrate temperature instruments - Beamex blog post 

In this blog post, I will share with you a two-part webinar series titled “How to calibrate temperature instruments”. We did this webinar together with our partner ISA (International Society of Automation).

The webinars cover some theory, many practical tips, demonstrations of temperature instrument calibration and Questions & Answers sections.

More information on ISA can be found at https://www.isa.org/

These webinars include the following experienced speakers:

  • Thomas Brans, customer marketing manager at Honeywell.
  • Ned Espy, who has worked with calibration management at Beamex, Inc. for over 20 years and also has calibration experience from his previous jobs.
  • Roy Tomalino, who has worked for 15 years at Beamex, Inc. teaching calibration management and also has prior calibration experience.

Below you can find a short table of contents with the main topics, so it will be easier for you to see if there is something of interest to you. Of course, there are also many other useful discussions in these webinars.

Please click the pictures below the tables of contents to view the webinar recordings.

 

How to calibrate temperature instruments - Part 1 (1:39:05)

 

Time       Topic
0:00       Introduction
4:30       Presentation of speakers
7:00       Presentation of agenda
8:00       Process control framework
9:30       Temperature
11:15      Temperature units
15:20      Thermocouple sensors
20:30      Demonstration – calibration of a thermocouple transmitter
26:30      Questions & Answers
28:50      RTD basics
34:20      Calibration basics
42:40      Demonstration – calibration of an RTD transmitter
55:40      Thermocouple and RTD basics
1:03:20    Questions & Answers
1:39:05    End of webinar

 

Watch the webinar (Part 1) by clicking the picture below:

How to calibrate temperature instruments, Part 1 - Beamex webinar

 

How to calibrate temperature instruments - Part 2 (1:21:33)

 

Time       Topic
0:00       Introduction
0:10       Presentation of speakers
2:20       Presentation of agenda
3:10       Temperature measurement in a heat exchanger
5:45       Demonstration – calibration of a temperature sensor
9:00       Loop basics
16:50      Quick poll on loop testing
19:15      Questions & Answers
32:00      Temperature measurement in an autoclave
37:00      Demonstration - calibration of a temperature sensor continues
41:30      Measuring temperature in refinery
45:30      Other applications - Infrared calibration
50:30      Demonstration - wrap up
58:50      Questions & Answers
1:21:33    End of webinar

 

Watch the webinar (Part 2) by clicking the picture below:

How to calibrate temperature instruments, Part 2 - Beamex webinar

 

 

Other "temperature" blog posts

If you like temperature and temperature calibration related blog posts, you may find these blog posts interesting too:

 

Other "webinar" blog posts

These are some other webinar type of blog posts:

 

Beamex temperature offering

Beamex offers various tools for temperature calibration. Please check out our offering on our website via the link below:

Beamex temperature calibration products

 

 

 

Topics: Temperature calibration

Uncertainty components of a temperature calibration using a dry block

Posted by Heikki Laurila on Aug 23, 2018

Temperature dry block uncertainty components - Beamex blog post

 

In some earlier blog posts, I have discussed temperature calibration and calibration uncertainty. This time I will be covering the different uncertainty components that you should take into account when you make a temperature calibration using a temperature dry block. 

Making a temperature calibration using a dry block seems like a pretty simple and straightforward thing to do; however, there are many possible sources of uncertainty and error that should be considered. Often the biggest uncertainties come from the procedure for how the calibration is done, not necessarily from the specifications of the components.

Let’s turn the heat on!

If you just want to download this article as a free pdf file, please click the below button:

Uncertainty components of a temperature dry block

 

Table of contents

  • What is a dry block?
  • So, it's not a bath?
  • EURAMET Guidelines
  • Uncertainty Components
    • Internal or External reference sensor
    • 1. Internal reference sensor
    • 2. External reference sensor
    • 3. Axial temperature homogeneity
    • 4. Temperature difference between the borings
    • 5. Influence of loading
    • 6. Stability over time
    • 7. Don't be in a hurry
  • Summary

 

What is a “dry block”?

Let’s start by discussing what I mean by a “temperature dry block” in this article.

A temperature dry block is sometimes also called a dry-well or a temperature calibrator.

It is a device that can be heated and/or cooled to different temperature values, and as the name hints, it is used dry, without any liquids.

A dry block typically has a removable insert (or sleeve) that has suitable holes/borings for inserting temperature sensors into.

The dry block typically has its own internal measurement for the temperature, or you may use an external reference temperature sensor that you will insert into one of the holes.

Commonly a dry block has interchangeable inserts, so you may have several inserts, each being drilled with different holes, to suit for calibration of different sized temperature sensors.

It is very important that the hole for the temperature sensor in a dry block is sufficiently tight to ensure low thermal resistance between the sensor and the insert. In too loose a boring, the sensor stabilizes slowly or may not reach the temperature of the insert at all due to stem conduction.

Commonly, you would insert a temperature sensor in the dry block to be calibrated or calibrate a temperature loop where the temperature sensor is the first component in the loop.

The main benefits of a dry block are that it is easy to carry around in the field and there is no hot fluid that could spill while you carry it. Also, a dry block will not contaminate the temperature sensors being calibrated.

Dry blocks are almost always used dry. In some very rare cases you may use some heat transfer fluids or pastes. In most cases you may damage the dry block if you use liquids.

Using oil or pastes also causes a potential health and fire risk if the block is later used at temperatures higher than, for example, the flash point of the foreign substance. A 660 °C dry block that has silicone oil absorbed into its insulation may look neat on the outside, but it will give off noxious fumes when heated up. Calibration labs everywhere are probably more familiar with this than they would like to be…

As drawbacks of dry blocks, we could mention lower accuracy/stability compared with a liquid bath and the difficulty of calibrating very short and odd-shaped sensors.

 

So, it’s not a “bath”?

Nope, I said “dry block”, didn’t I … ;-)

There are also temperature baths available, which have liquid inside. The liquid is heated/cooled and the temperature sensors to be calibrated are inserted into the liquid. The liquid is also stirred to get an even temperature distribution.

There are also some combinations of dry block and liquid bath, these are devices that typically have separate dry inserts and separate liquid inserts.

The main benefits of a liquid bath are its better temperature homogeneity and stability and its suitability for short and odd-shaped sensors.

The drawbacks of a liquid bath are its bigger size, heavier weight, the need to work with hot liquids and poorer portability; baths are also often slower than dry blocks.

Anyhow, in this article we focus on the temperature dry blocks, so let’s get back to them.

 

EURAMET Guidelines

Let’s take a quick look into Euramet guides before we proceed. And yes, it is very relevant for this topic.

EURAMET is the Regional Metrology Organisation (RMO) of Europe. They coordinate the cooperation of National Metrology Institutes (NMI) in Europe. More on Euramet at https://www.euramet.org/

Euramet has also published many informative guidelines for various calibrations.

The one that I would like to mention here is the one dedicated for temperature dry block calibration: EURAMET Calibration Guide No. 13, Version 4.0 (09/2017), titled “Guidelines on the Calibration of Temperature Block Calibrators”.

The previous version 3.0 was published in 2015, and the first version in 2007. The guideline was earlier called EA-10/13, so you may run into that name too.

The guideline defines a normative way to calibrate temperature dry blocks. Many manufacturers use the guideline when calibrating dry blocks and when giving specifications for their dry blocks.

To highlight some of the contents of the most recent version 4.0, it includes:

Scope

Calibration capability

Characterisation

  • Axial homogeneity
  • Temperature difference between borings
  • Effects of loading
  • Stability over time
  • Heat conduction

Calibration

  • Measurements
  • Uncertainties

Reporting results

Examples

You can download the Euramet guide pdf free here:

Guidelines on the Calibration of Temperature Block Calibrators

 

Uncertainty components

Let’s get into the actual uncertainty components. When you make a temperature calibration using a dry block, these are the things that cause uncertainty/error to the measurement results.

 

Internal or External reference sensor?

There are two principle ways to measure the true (correct) temperature of a dry block. One is to use the internal measurement using the internal reference sensor that is built in into the dry block, the other is to use an external reference sensor that is inserted into the insert boring/hole.

There are some fundamental differences between these two ways and they have a very different effect on the uncertainty, so let’s discuss these two options next:

 

1. Internal reference sensor

An internal reference sensor is permanently inserted into the metal block inside the dry block, it is typically close to the bottom part of the block and it is located in the metallic block surrounding the interchangeable insert.

So, this internal sensor does not directly measure the temperature of the insert, where you insert the sensors to be calibrated, but it measures the temperature of the surrounding block. Since there is always some thermal resistance between the block and the insert, this kind of measurement is not the most accurate one.

Especially when the temperature is changing, the block temperature normally changes faster than the insert temperature. If you make the calibration too quickly without waiting sufficient stabilization time, this will cause an error.

An internal reference sensor is anyhow pretty handy, as it is always readily inside the block, and you don’t need to reserve a dedicated hole in the insert for it.

Recalibration of the internal measurement is a bit inconvenient, as you need to send the whole dry block for recalibration.

The internal reference sensor’s signal is naturally measured with an internal measurement circuit in the dry block and shown on the block’s display. The measurement typically has an accuracy specification. As discussed earlier, in practice this specification is only valid in stable conditions and does not include the uncertainties caused if the calibration is done too quickly or if the sensors to be calibrated are not within the calibration zone at the bottom part of the insert, in a sufficiently tight boring.

 

Two internal reference sensors at different heights

 

The left-side pictures above illustrate how the internal reference sensor is typically located in the temperature block, while the sensor to be calibrated is inserted into the insert. If the sensor to be calibrated is long enough to reach the bottom of the insert, the boring is tight enough, and we have waited long enough for stabilization, we can get a good calibration with little error.

In the right-side picture we can see what happens if the sensor to be calibrated is too short to reach the bottom of the insert. In this case, the internal reference sensor and the sensor to be calibrated are located at different heights and measure different temperatures, causing a big error in the calibration result.

 

2. External reference sensor

The other way is to use an external reference sensor. The idea here is that you insert a reference sensor into a suitable hole in the insert, while you insert the sensors to be calibrated into the other holes in the same insert.

As the external reference sensor is inserted into the same metal insert with the sensors to be calibrated, it can more precisely measure the same temperature as the sensors to be calibrated are measuring.

Ideally, the reference sensor would have similar thermal characteristics as the sensors to be calibrated (same size and thermal conductance). In that case, as the insert temperature changes, the external reference sensor and the sensor to be calibrated will more accurately follow the same temperature changes.

The external reference sensor naturally needs to be measured somehow. Often a dry block has internal measurement circuitry and a connection for the external reference sensor or you can use an external measurement device. For uncertainty, you need to consider the uncertainty of the reference sensor and the uncertainty of the measurement circuitry.

Using an accurate external reference sensor results in a more accurate calibration with smaller uncertainty (compared to using an internal reference sensor). So, it is highly recommended if you want good accuracy (small uncertainty).

An external reference sensor also promotes reliability. If the internal and external sensor readings differ a lot, it’s a warning signal to the user that something is probably wrong and the measurements may not be trustworthy.

For recalibration, with an external reference sensor you can send just the reference sensor for recalibration instead of the whole dry block. In that case, the dry block’s own functionality, such as the axial temperature homogeneity, will naturally not be checked (and adjusted if necessary).

If you don’t send the dry block for calibration, be sure to measure and record the axial gradient regularly yourself, as it is typically the biggest uncertainty component even when an external reference sensor is used. Otherwise a strict auditor may seriously question the traceability of your measurements.

 

 

Three sensor pairs at different heights

The picture above illustrates how the external reference sensor and the DUT (Device Under Test) sensor are both located in the insert. The first picture shows the case where both sensors reach the bottom of the insert, giving the best calibration results.

The second picture shows what happens if the reference sensor and the DUT sensor are at different depths. This causes a large temperature difference between the two sensors and results in a calibration error.

The third picture shows an example where the DUT sensor is short and the reference sensor has been correctly positioned at the same depth as the DUT sensor. This way you can get the best possible calibration result, even though the homogeneity of the insert is not very good in its upper part.

So, if the sensors are located at different heights, that will cause additional error, but with an external reference sensor the error is still typically smaller than when calibrating a short sensor against the internal reference sensor.

 

3. Axial temperature homogeneity

Axial homogeneity (or axial uniformity) refers to the difference in temperature along the vertical length of the boring in the insert.

For example, the temperature may be slightly different in the very bottom of the boring in the insert, compared to the temperature a little higher in the boring.

Typically, the temperature will differ at the very top of the insert, as heat leaks to the environment whenever the block’s temperature is very different from the ambient temperature.

In some temperature sensors the actual sensing element is shorter, in others longer. Some also have the element closer to the tip than others. To ensure that different sensors are at the same temperature, the homogeneous zone at the bottom of the block’s insert should be long enough. Typically, the specified zone is 40 to 60 mm long.

A dry block should have a sufficient zone at the bottom of the insert within which the temperature homogeneity is specified. During calibration of the block, this may be checked using two high-accuracy reference sensors at different heights, or using a sensor with a short sensing element that is gradually lifted higher from the bottom. This kind of short-element sensor needs to be stable, but it does not necessarily even need to be calibrated, because it is only used to measure the temperature difference between heights. If needed, the axial temperature gradient can typically be adjusted in a dry block.
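As a rough Python sketch (not the formal procedure of any particular dry block), the idea of the lifted short-element sensor can be illustrated as below; the heights, readings and zone length are made-up example values.

# A minimal sketch of estimating axial homogeneity from readings taken with a
# short-element sensor lifted stepwise from the bottom of the boring.
# All heights and readings are made-up example values.

readings_c = {   # height above the bottom of the boring (mm) -> reading (deg C)
    0: 150.00,
    20: 149.98,
    40: 149.93,
    60: 149.80,
}

bottom = readings_c[0]
for height_mm, temp_c in readings_c.items():
    print(f"{height_mm:3d} mm: {temp_c:7.2f} degC (deviation {temp_c - bottom:+.2f} degC)")

# The largest deviation within the zone your sensors actually occupy can be
# used as an input to the axial-homogeneity uncertainty component.
zone_mm = 40  # assumed length of the zone in use
max_dev = max(abs(t - bottom) for h, t in readings_c.items() if h <= zone_mm)
print(f"Max deviation within the {zone_mm} mm zone: {max_dev:.2f} degC")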

If you have a short (e.g. sanitary) temperature sensor that does not reach all the way to the bottom of the boring in the insert, things get a bit more complicated. In that case, the internal reference measurement of the dry block cannot really be used, as it is typically located at the bottom of the block. An external reference sensor should be used instead, and the center of its measurement zone should be inserted to the same depth as the center of the measurement zone of the short sensor to be calibrated.

Often this means that a dedicated short reference sensor should be used and inserted to the same depth as the short sensor to be calibrated. It gets even more difficult if the short sensor to be calibrated has a large flange, as the flange will conduct heat away from the sensor.

Summary - During the calibration you should ensure that your reference sensor is inserted to the same depth as the sensor(s) to be calibrated. If you know the lengths and locations of the sensing elements, try to align the element centers at the same height. If that is not possible, you need to estimate the error this causes. Use an external reference sensor if the accuracy requirements of the calibration are high, or if the sensor to be calibrated is not long enough to reach the bottom of the boring in the insert.

 

Axial temperature homogeneity (two pictures)

 

The picture above illustrates what “axial temperature homogeneity” means. Typically, a dry block has a specified zone at the bottom with a homogeneous temperature, but as you start to lift the sensor to be calibrated higher, it will no longer be at the same temperature.

 

4. Temperature difference between the borings

As the title hints, the temperature difference between the borings, sometimes referred to as “radial uniformity”, is the temperature difference between the individual borings (holes) in the insert. Although the insert is made of metal and has good thermal conductivity, there can still be a small difference between the borings, especially those on opposite sides.

In practice, when you have two sensors in the insert installed in the different borings, there can be a small temperature difference between them.  

The difference can be caused by the insert touching the block more on one side, or by the insert being loaded unequally (more sensors on one side, or thicker sensors on one side than on the other). The heaters and Peltier elements, located on different sides, naturally have their tolerances too.

In practice, the temperature difference between the borings is normally relatively small.

Summary – the specification of the temperature difference between borings should be taken into account as one uncertainty component.
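If you treat that specification as a limit value, one common simplified approach is to convert it into a standard uncertainty by assuming a rectangular distribution. The small Python sketch below uses a made-up specification value purely for illustration.

import math

# Hypothetical specification: maximum temperature difference between borings.
spec_limit_c = 0.02  # +/- deg C, made-up example value

# Assuming a rectangular (uniform) distribution, the standard uncertainty is
# the limit divided by sqrt(3); multiplying by 2 gives an approximate
# 95 % (k=2) expanded value.
u_standard = spec_limit_c / math.sqrt(3)
u_expanded = 2 * u_standard

print(f"Standard uncertainty: {u_standard:.4f} degC")
print(f"Expanded uncertainty (k=2): {u_expanded:.4f} degC")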

 

Difference between borings - Uncertainty components of a temperature calibration using a dry block. Beamex blog article.

 

 

5. Influence of loading

There is always some heat conducted through the sensors to the environment (stem conductance) if the block’s temperature differs from the ambient temperature.

If several sensors are installed in the insert, there will be more heat “leaking” to the environment. Also, the thicker the sensors are, the more heat leakage there will be.

The bigger the temperature difference between the insert and the environment, the bigger the leakage will be.

For example, if the dry block is at a high temperature, this leakage will cause the insert to cool down because of the loading. The top of the insert will lose more heat than the bottom, so the top becomes cooler.

The deeper the insert is, the smaller the loading effect will be. Also, some dry blocks have two or more heating/cooling zones: one at the bottom, one in the center and one at the top of the block. This helps compensate for the loading effect (e.g. the top zone can heat more to compensate for the cooling of the top of the insert).

If you use the internal reference measurement of the dry block, there will typically be a larger error, since the internal reference is not in the insert but at the bottom of the surrounding block. Therefore, the internal reference sensor does not see the effect of loading very well.

An external reference sensor sees the effect of loading better, as it is in the insert and experiences the same temperature change. The error caused by the loading effect is much smaller when an external reference sensor is used (compared to the internal reference sensor), and the results are better.

Summary – determine the loading effect of your dry block in your application (how many sensors, and of which type) and use that as one uncertainty component.

 

Stem conductance (two pictures)

 

The pictures above illustrate the stem conductance caused by the sensors leaking heat to the environment. In the second picture several sensors are inserted, so the stem conductance/leakage is larger.

 

 

6. Stability over time

Stability over time describes how well the temperature remains stable over a longer period. The temperature needs to be stable for a certain time, as different sensors may have different thermal characteristics and therefore take different times to stabilize. If the temperature is constantly creeping up and down, different sensors may read different temperatures.

If there is some fluctuation in the temperature, an external reference sensor will still give more accurate results than an internal reference sensor.

Often the dry block manufacturer gives a stability specification, for example over a 30-minute period.
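If your dry block or reference readout can log readings, you can also quantify the stability yourself over the period of interest. The Python sketch below uses invented readings; the resulting peak-to-peak or standard-deviation figure can then be used as the stability component.

import statistics

# Made-up readings of the reference sensor, logged once per minute over a
# 30-minute stability check at a 150 degC set point.
log_c = [150.02, 150.03, 150.01, 150.02, 150.04, 150.03, 150.02, 150.01,
         150.02, 150.03, 150.02, 150.02, 150.03, 150.01, 150.02, 150.03,
         150.04, 150.02, 150.01, 150.02, 150.03, 150.02, 150.02, 150.03,
         150.02, 150.01, 150.02, 150.03, 150.02, 150.02]

peak_to_peak = max(log_c) - min(log_c)
std_dev = statistics.stdev(log_c)

print(f"Peak-to-peak over 30 min: {peak_to_peak:.3f} degC")
print(f"Standard deviation:       {std_dev:.3f} degC")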

 

7. Don’t be in a hurry!

It’s good to remember that a temperature sensor always measures only its own temperature. It does not measure the temperature of the location where it is installed; it measures its own temperature.

Also, temperature changes fairly slowly, and it takes some time before all parts of the system have stabilized to the same temperature, i.e. the system has reached equilibrium.

If you make a temperature calibration with a dry block too fast, that will be the biggest source of uncertainty! 

So, get to know your system and the sensors you calibrate, and experiment to find out how long a time is enough for sufficient stabilization.

Especially if you use the internal reference sensor, it will reach the set temperature much faster than the sensors to be calibrated located in the insert. That is because the internal sensor is in the block that is heated/cooled, while the sensors to be calibrated are in the insert. Taking the readings too soon can cause a big error.

With an external reference sensor, the need for stabilization depends on how different your reference sensor is from the sensors to be calibrated. If they have different diameters, they will most likely have different stabilization times. Even so, an external reference sensor will give much more accurate results than the internal one if you don’t wait long enough for stabilization.

Often a dry block has a stability indicator, but it may be monitoring only the stability of the internal reference sensor, so don’t rely on that alone.

Summary – in short, if you do the temperature calibration too quickly, the results will be poor.

 

Temperature sensor stability - Uncertainty components of a temperature calibration using a dry block. Beamex blog article.

 

The picture above illustrates an (exaggerated) example where the temperature set point was first 10°C and at the 5-minute mark was changed to 150°C (the blue line represents the set point).

There are two sensors in the dry block – a reference sensor and a sensor to be calibrated.

We can see that Sensor 1 (red line) changes much faster and reaches the final temperature at about the 11-minute mark. Sensor 2 (green line) changes much more slowly and reaches the final temperature at around the 18-minute mark.

Sensor 1 is our reference sensor and Sensor 2 is the sensor to be calibrated. If we read the temperatures too early, at the 10-minute mark, we get a huge error (about 85°C) in our results. Even if we take the readings at the 15-minute mark, we still have a difference of around 20°C.

So, we should always wait long enough for all sensors to stabilize at the new temperature before we take the readings.
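The shape of these curves can be approximated with a simple first-order (exponential) response. The Python sketch below uses invented time constants, so it does not reproduce the exact numbers in the picture, but it shows why reading too early produces a large difference between a fast reference sensor and a slower sensor to be calibrated.

import math

T_START, T_SET = 10.0, 150.0   # degC, as in the example above
TAU_REF, TAU_DUT = 1.5, 5.0    # minutes; made-up time constants

def temperature(t_min, tau):
    """First-order step response from T_START towards T_SET."""
    return T_SET + (T_START - T_SET) * math.exp(-t_min / tau)

for t in (2, 5, 10, 15):       # minutes after the set-point change
    ref = temperature(t, TAU_REF)
    dut = temperature(t, TAU_DUT)
    print(f"{t:2d} min after the change: ref {ref:6.1f} degC, "
          f"DUT {dut:6.1f} degC, difference {ref - dut:5.1f} degC")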

 

Summary

Making a temperature (sensor) calibration using a dry block seems like a pretty simple and straightforward thing to do. There are nevertheless many possible sources of uncertainty and error that should be taken into account.

Often the biggest uncertainties come from the procedure by which the calibration is done, not necessarily from the specifications of the components.

For example, you may have an accurate dry block with a combined total uncertainty of 0.05°C and a high-quality reference sensor with an uncertainty of 0.02°C. Yet a temperature sensor calibration made with these devices can still have an uncertainty of several degrees if it is not performed properly.
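To illustrate that point with a sketch, the Python example below combines a few components by root-sum-square (a common simplified approach). All values are invented; the message is simply that a procedure-related component, such as insufficient stabilization, can dwarf the device specifications.

import math

# Invented example uncertainty components, all expressed in degC at the same
# coverage level; the actual values depend entirely on your block, sensors
# and procedure.
components_c = {
    "dry block internal measurement":      0.05,
    "reference sensor (from certificate)": 0.02,
    "axial temperature homogeneity":       0.10,
    "difference between borings":          0.02,
    "loading effect":                      0.05,
    "stability over time":                 0.03,
    "insufficient stabilization time":     1.50,   # procedure-related!
}

combined = math.sqrt(sum(u ** 2 for u in components_c.values()))
print(f"Combined (RSS): {combined:.2f} degC")

# Without the procedural error, the same budget collapses to a fraction of that.
without_procedure = math.sqrt(sum(
    u ** 2 for name, u in components_c.items()
    if name != "insufficient stabilization time"))
print(f"Without the stabilization error: {without_procedure:.2f} degC")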

That is one reason I don’t like discussions based on TAR (Test Accuracy Ratio): it only uses accuracy specifications and does not take into account the uncertainties caused by the calibration procedure.

I hope the considerations listed in this article help you recognize the possible sources of uncertainty and how to minimize them.

 

If you want to download this article as a free pdf file, please click the button below:

Uncertainty components of a temperature dry block

 

Related blog posts

The main topics discussed in this article are temperature calibration and calibration uncertainty. Other blog posts on these topics that you might also be interested in include, for example, the following:

 

Beamex offering

The Beamex MC6-T is a revolutionary temperature calibration tool that provides excellent accuracy and uncertainty for temperature calibration. Please click the picture below to learn more:

Beamex MC6-T temperature calibrator

 

Beamex also offers various other temperature calibration products; please check our offering via the link below:

Beamex temperature calibration products

 

Please subscribe & suggest topics

If you like these blog articles, please subscribe to this blog by entering your email address in the "Subscribe" box on the upper right-hand side. You will be notified by email when new articles are available, normally about once a month.

Also, please feel free to suggest good and interesting topics for new articles!

 

  

Topics: Temperature calibration, calibration uncertainty

AMS2750E Heat Treatment Standard and Calibration

Posted by Heikki Laurila on Jun 20, 2018

AMS2750 Heat treatment furnace - Beamex blog post

Update July 2020: Please note that a new revision, AMS2750F, was released in June 2020.

 

In this blog post, I will take a short look at the AMS2750E standard, with a special focus on the requirements set for calibration, calibration accuracy and test/calibration equipment.

The AMS2750E is predominantly designed for heat treatment in the aerospace industries. Heat treatment is an essential process for many critical parts of an airplane, so it is understandable that tight regulations and audit processes are in place.

While the results and success of some other industrial processes can be relatively easily measured after the process, this is not the case in a heat treatment process. Therefore, very tight control and documentation of the heat treatment process is essential to assure the quality of the end products.

Download White Paper: AMS2750E Heat Treatment Standard and Calibration  

 

AMS2750 standard

As mentioned, AMS2750E is a standard for heat treatment. “AMS” is an abbreviation of “Aerospace Material Specifications”. The standard is published by SAE Aerospace, part of the SAE International Group. The first version of the AMS2750 standard was published in 1980, followed by revision A in 1987, B also in 1987, C in 1990 and D in 2005. The current revision, AMS2750E, was published in 2012.

The AMS2750 standard was initially developed to provide consistent specifications for heat treatment through the aerospace supply chain. The use of the standard is audited by PRI (Performance Review Institute) for the Nadcap (National Aerospace and Defense Contractors Accreditation Program). Prior to Nadcap, aerospace companies each audited their own suppliers, so there was a lot of redundancy and duplication of efforts. In 1990, the PRI was established to administer the Nadcap program.

 

AMS2750E scope

According to the standard itself, the scope of the AMS2750E standard is the following:

"This specification covers pyrometric (high temperature) requirements for thermal processing equipment used for heat treatment. It covers temperature sensors, instrumentation, thermal processing equipment, system accuracy tests, and temperature uniformity surveys. These are necessary to ensure that parts or raw materials are heat treated in accordance with the applicable specification(s)."

 

Why for heat treatment?

In some industrial processes, it is relatively easy to measure and check the quality of the final product and judge if the product fulfills the requirements after the process is complete. You may be able to simply measure the end product and see if it is good or not.

In other processes, where it is not possible, easy or practical to measure the quality of the final product, you need very tight control and documentation of the process conditions in order to be sure that the final product is made according to the requirements.

It is easy to understand that heat treatment is a process where you need to have a very good control of the process in order to assure that you get the required end product, especially since the products are mostly used by the aerospace industry.  

 

Who is it for?

The AMS2750E is predominantly designed for the aerospace industries. But the same standards and processing techniques can be used within any industry which requires control of the thermal processing of raw materials and manufactured components, such as automotive, rail and manufacturing.

 

But what is the CQI-9?

The CQI-9 is a similar set of requirements for heat treatment, mainly aimed at the automotive industry. The first edition of CQI-9 was published in 2006. The CQI-9 “Heat Treatment System Assessment” is a self-assessment of the heat treatment system, published by AIAG (Automotive Industry Action Group). More on CQI-9 perhaps in another post later on...

 

Test instruments and calibration

Let’s discuss Test Instruments (calibrators) and what AMS2750E says about them.

Traceable calibration of the different levels of measurement instruments is obviously required. The higher-level standards are typically calibrated in an external calibration laboratory, while the process measurements are calibrated internally using “field test instruments”.

Metrological traceability is often described as a traceability pyramid or as a traceability chain; see below:

 

Traceability pyramid:

Metrological traceability pyramid - Beamex     

 

Traceability chain:

Metrological traceability chain - Beamex

 

To learn more about the metrological traceability in calibration read the following blog post:

Metrological Traceability in Calibration – Are you traceable?

 

The magical "Table 3" 

In Table 3 of the AMS2750E standard, there are different specifications for the test standards and test equipment/calibrators. The different levels of instruments are classified as follows:

  • Reference standard
  • Primary standard
  • Secondary standard instruments                                                 
  • Secondary standard cell
  • Field test instrument
  • Controlling, monitoring or recording instruments

For each instrument class, there are specifications for the calibration period and calibration accuracy. Calibrators/calibration equipment are typically used as a “field test instrument” or sometimes as a “secondary standard instrument”, and the following is said about those:

Secondary standard instrument

  • Limited to laboratory calibration of field test instruments, system accuracy test sensors, temperature uniformity survey sensors, load sensors and controlling, monitoring or recording sensors 

Field test instrument

  • Calibration of controlling, monitoring, or recording instrument, performance of system accuracy tests, and temperature uniformity surveys

 

AMS2750E accuracy requirements

AMS2750E also specifies the calibration period and accuracy requirements for the different levels of instruments, below is what is said about the secondary standard instrument and field test instrument:

 AMS2750E heat treatment Table 3 - Beamex

 

Sometimes it is easier to look at a visual, so let's look at the required calibration accuracy graphically for the “field test instrument” and the “secondary standard instrument”. And as Centigrade and Fahrenheit scales differ, below is a graph of each for your convenience:

 

AMS2750 calibration accuracy - Beamex

 

AMS2750 calibration accuracy - Beamex

 

Contradiction with different thermocouples types and accuracy

The AMS2750E standard specifies different thermocouple types for different usage. Types B, R and S are included for more demanding use, while types J, E, K, N, T are also included in the standard.

However, the standard has the same accuracy specification regardless of the thermocouple type. This is a slightly strange requirement, as different thermocouple types have very different sensitivities.

In practice, this means that a field test instrument (calibrator) normally has a specification for millivoltage, and when this mV accuracy is converted to temperature, the calibrator ends up with different specifications for different thermocouple types. Some thermocouple types have very low sensitivity (the voltage changes very little as the temperature changes), especially at the lower end of their range.

For example, a calibrator can have an electrical specification of 4 microvolts at 0 V. For a K type, this 4 µV equals a temperature of 0.1 °C (0.2 °F), but for an S type it equals 0.7 °C (1.3 °F), and for a B type it equals almost 2 °C (3.6 °F). Therefore, calibrators normally have very different accuracy specifications for different thermocouple types.
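As a rough Python sketch of that conversion, you can divide the voltage specification by the thermocouple's sensitivity (Seebeck coefficient) near the temperature of interest. The sensitivity values below are approximate round numbers chosen to match the figures above; real values vary strongly with temperature and should be taken from the relevant thermocouple tables.

# Converting a voltage accuracy specification into a temperature error for
# different thermocouple types. The sensitivities (uV per degC) below are
# approximate illustrative values near the low end of each type's range.
voltage_spec_uv = 4.0

approx_sensitivity_uv_per_c = {
    "K": 40.0,   # high sensitivity
    "S": 5.7,    # low sensitivity
    "B": 2.0,    # very low sensitivity at low temperatures
}

for tc_type, sensitivity in approx_sensitivity_uv_per_c.items():
    error_c = voltage_spec_uv / sensitivity
    print(f"Type {tc_type}: {voltage_spec_uv:.0f} uV corresponds to about {error_c:.1f} degC")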

So, the standard having the same accuracy requirement regardless of the thermocouple type is a somewhat strange requirement.

To illustrate the different sensitivities of different thermocouple types, please see the graph below. It shows the thermovoltage (emf) generated at different temperatures by different thermocouple types:

 

Thermocouple emf voltage versus temperature - Beamex

 

To learn more about thermocouples, different thermocouple types and thermocouple cold junction compensation, please read this blog post:

Thermocouple Cold (Reference) Junction Compensation

 

AMS2750E contents in a nutshell

Let’s take a brief look at the contents of the AMS2750E standard and further discuss a few key points in the standard.

The AMS2750E standard starts with sections:

  • 1. Scope
  • 2. Applicable documents

Chapter 3, “Technical Requirements”, of AMS2750E includes the following key sections (these are discussed in more detail below):

  • 3.1  Temperature sensors
  • 3.2. Instrumentation
  • 3.3. Thermal processing equipment
  • 3.4. System Accuracy Tests (SAT)
  • 3.5. Furnace Temperature Uniformity Survey (TUS)
  • 3.6. Laboratory furnaces
  • 3.7. Records
  • 3.8. Rounding

The remaining sections are:

  • 4. Quality assurance provisions
  • 5. Preparation for delivery
  • 6. Acknowledgement
  • 7. Rejections
  • 8. Notes

 

3.1 Temperature sensors

Section 3.1 discusses temperature sensors. Some key bullets from that section:

  • The AMS2750E standard specifies the thermocouple sensors to be used, as well as the sensor wire types.
  • The voltage to temperature conversion standard to be used (ASTM E 230 or other national standards).
  • Correction factors may be used to compensate for the errors found in calibration.
  • The temperature range for the sensors used.
  • Allowance to use wireless transmitters.
  • Contents of a sensor calibration certificate.
  • The maximum length of sensor wire/cable.
  • The maximum number of times thermocouples may be used at different temperatures.
  • Types of thermocouple sensors to be used, the uses of thermocouples (primary calibration, secondary calibration, sensor calibration, TUS, SAT, installation, load sensing), the calibration period for thermocouples, and the maximum permitted error.

 

3.2 Instrumentation

Section 3.2 covers the instrumentation that the sensors are used with. This includes control, monitoring, recording, calibration, instrumentation, etc.

  • Instruments need to be traceably calibrated.
  • Minimum resolution/readability of test instruments (1 °F or 1 °C).
  • Specifications for electronic records.
  • Contents of calibration sticker:
    • Date, due date, performed by, any limitations
  • Contents of calibration record:
    • Instrument identification, make and model, standard(s) used, calibration method, required accuracy, as found and as left data of each calibration point, offset, found/left, sensitivity, statement of acceptance or rejection, any limitations or restrictions, calibration date, due date, performed by, calibration company, signature, quality, organization approval.

 

3.3 Thermal processing equipment

Section 3.3 discusses the furnace classification and the temperature uniformity requirements for each class, ranging from class 1 with a uniformity requirement of ±5 °F / ±3 °C to class 6 with ±50 °F / ±28 °C.

 

3.4 System accuracy test