
Calibration: Up-tests and Down-tests

Last updated: July 22, 2019 4:40 pm
Editorial Staff
Calibration

Instruments are calibrated to measure process variables over a fixed range of scale. An instrument is considered linear if the physical quantity (process variable) and the instrument's resulting output readings have a linear relationship.


The relation between input and output may be linear, square-root, angular, or defined by a custom algorithm. In every case, the instrument must display its readings in terms of the process variable's units.
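As an illustration of the linear and square-root relationships mentioned above, here is a minimal Python sketch that scales a 0-100 % input to a 4-20 mA output; the helper name and the 4-20 mA range are assumptions for the example, not from this article:

```python
def percent_to_output(percent, relation="linear"):
    """Convert a 0-100 % input signal to a hypothetical 4-20 mA output.

    Square-root extraction is typical of differential-pressure flow
    measurements; linear scaling is the common default.
    """
    if relation == "linear":
        fraction = percent / 100.0
    elif relation == "square root":
        fraction = (percent / 100.0) ** 0.5
    else:
        raise ValueError("unsupported relation: " + relation)
    return 4.0 + 16.0 * fraction

print(percent_to_output(50))                 # 12.0 mA (linear midpoint)
print(percent_to_output(25, "square root"))  # 12.0 mA, since sqrt(0.25) = 0.5
```

Note how a square-root instrument reaches mid-signal at only 25 % of input, which is why the two relationships must never be mixed up during calibration.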

The manufacturer calibrates the instrument by comparing its output against a standard input. Having obtained such an instrument, marked and calibrated by the manufacturer, the user must periodically recalibrate it to verify that it is still working within the prescribed limits.

In order to calibrate an instrument, we need a reference standard that is at least ten times more accurate than the instrument under calibration. The standard input is varied across the measurement range of the instrument being calibrated.

By comparing the known standard values with the readings obtained from the instrument, the instrument can be calibrated.
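A rough sketch of such a comparison, using made-up test points and a hypothetical ±0.5 % of span tolerance:

```python
# Hypothetical as-found check: compare instrument readings against a
# reference standard and flag points outside a ±0.5 % of span tolerance.
span = 100.0
tolerance_pct = 0.5

checks = [  # (standard input, instrument reading)
    (0.0, 0.1), (25.0, 25.2), (50.0, 50.7), (75.0, 75.3), (100.0, 100.2),
]

for standard, reading in checks:
    error_pct = (reading - standard) / span * 100.0
    status = "PASS" if abs(error_pct) <= tolerance_pct else "FAIL"
    print(f"{standard:6.1f} -> {reading:6.1f}  {error_pct:+.2f} %  {status}")
```

Here the 50 % test point fails, telling the technician that adjustment (or repair) is needed at mid-range.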

Purpose of instrument calibration

Calibration refers to the act of evaluating and adjusting the precision and accuracy of measurement equipment.

Instrument calibration is intended to eliminate or reduce bias in an instrument’s readings over a range for all continuous values.

  • Precision is the degree to which repeated measurements under unchanged conditions show the same result.
  • Accuracy is the degree of closeness of a measured quantity to its actual (true) value.
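The two concepts can be separated numerically: the spread of repeated readings measures precision, while the offset of their mean from the true value measures accuracy. A small sketch with hypothetical repeated readings of a known 100.0 reference value:

```python
import statistics

true_value = 100.0
readings = [100.8, 100.9, 101.0, 100.9, 100.8]  # hypothetical repeats

accuracy_error = statistics.mean(readings) - true_value  # closeness to truth
precision = statistics.stdev(readings)                   # repeatability

print(f"bias (accuracy error): {accuracy_error:+.2f}")
print(f"spread (precision):    {precision:.3f}")
```

This instrument is precise (small spread) but not accurate (consistent +0.88 bias), which is exactly the kind of error calibration is meant to remove.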

For this purpose, reference standards with known values for selected points covering the range of interest are measured with the instrument in question.

Then a functional relationship is established between the values of the standards and the corresponding measurements. There are two basic situations:

  • Instruments which require correction for bias: The instrument reads in the same units as the reference standards. The purpose of the calibration is to identify and eliminate any bias in the instrument relative to the defined unit of measurement.
  • Instruments whose measurements act as surrogates for other measurements: The instrument reads in different units than the reference standards.


When do instruments need to be calibrated?

  • Indicated by manufacturer
    • Every instrument needs periodic calibration to ensure it functions properly and safely. The manufacturer specifies how often the instrument should be calibrated.
  • Before major critical measurements
    • Before any measurement that requires highly accurate data, send the instrument out for calibration and keep it unused until the test.
  • After major critical measurements
    • Calibrating the instrument after the test helps the user decide whether the data obtained were reliable. Also, an instrument's condition will change when it is used over a long period.
  • After an event
    • An event here means anything that happens to the instrument, for example an impact or any kind of accident that might affect its accuracy. A safety check is also recommended.
  • When observations appear questionable
    • When you suspect the accuracy of the data due to instrument errors, send the instrument for calibration.
  • Per requirements
    • Some experiments require calibration certificates, or calibration may be mandated by plant requirements.

Basic steps for correcting the instrument for bias

The calibration method is the same for both situations stated above and requires the following basic steps:

  1. Selection of reference standards with known values to cover the range of interest.
  2. Measurements on the reference standards with the instrument to be calibrated.
  3. Establishment of a functional relationship between the measured and known values of the reference standards (usually a least-squares fit to the data), called a calibration curve.
  4. Correction of all measurements by the inverse of the calibration curve.
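The four steps above can be sketched in Python with a simple straight-line least-squares fit; the data below are hypothetical, and a real calibration would use traceable reference standards:

```python
# Steps 1-2: reference standard values and the instrument's readings of them.
known = [0.0, 25.0, 50.0, 75.0, 100.0]       # reference standard values
measured = [1.0, 26.2, 51.1, 76.3, 101.2]    # instrument readings

# Step 3: least-squares line through the data -> calibration curve y = a*x + b.
n = len(known)
mean_x = sum(known) / n
mean_y = sum(measured) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(known, measured))
         / sum((x - mean_x) ** 2 for x in known))
intercept = mean_y - slope * mean_x

def correct(reading):
    """Step 4: apply the inverse of the calibration curve to a raw reading."""
    return (reading - intercept) / slope

print(correct(51.1))  # close to the true value of 50.0
```

Once the curve is established, every subsequent raw reading is passed through `correct()` to remove the instrument's bias.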

Some people confuse a field check with calibration. In a field check, two instruments show the same reading; this does not mean they are calibrated, since both instruments may be wrong.

Take a thermometer as an example: if it always reads 0.25 degrees high, this error cannot be eliminated by averaging, because the error is constant.

The easiest way to check its accuracy and correct it is to send the thermometer to a calibration laboratory. Another way to reveal constant errors is to use one or more similar thermometers.

One thermometer is used and then replaced by another. If readings are divided among two or more thermometers, inconsistencies among them will eventually be revealed.
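A quick simulation illustrates why averaging cannot remove a constant bias; the 0.25-degree offset and the noise level are assumed for the example:

```python
import random

random.seed(1)
true_temp = 20.0
bias = 0.25   # constant error: the thermometer always reads 0.25 high
noise = 0.05  # random measurement scatter

# Many repeated readings of the same true temperature.
readings = [true_temp + bias + random.gauss(0, noise) for _ in range(10000)]
average = sum(readings) / len(readings)

# Averaging suppresses the random noise but leaves the constant bias intact.
print(round(average - true_temp, 2))
```

The printed offset stays near 0.25 no matter how many readings are averaged: random error shrinks with averaging, systematic error does not.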


Calibration: Up-tests and Down-tests

It is not uncommon for calibration tables to show multiple calibration points going up as well as going down, for the purpose of documenting hysteresis and deadband errors.

Note the following example, showing a transmitter with a maximum hysteresis of 0.313 % between its up-test and down-test readings:

[Calibration sheet: table of up-test and down-test points with errors in percent of span]

Note again how error is expressed as either a positive or a negative quantity depending on whether the instrument’s measured response is above or below what it should be under each condition. The values of error appearing in this calibration table, expressed in percent of span, are all calculated by the following formula:

Error (% of span) = (Measured reading − Ideal reading) / Span × 100 %

In the course of performing such a directional calibration test, it is important not to overshoot any of the test points. If you do happen to overshoot a test point in setting up one of the input conditions for the instrument, simply “back up” the test stimulus and re-approach the test point from the same direction as before.

Unless each test point’s value is approached from the proper direction, the data cannot be used to determine hysteresis/deadband error.
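Given up-test and down-test readings taken at the same input points, the hysteresis at each point is simply the difference between the two passes; the data here are hypothetical:

```python
# Hypothetical as-found data, in % of span: readings going up vs. coming down.
points        = [0, 25, 50, 75, 100]
up_readings   = [0.02, 25.05, 50.10, 75.08, 100.04]
down_readings = [0.05, 25.30, 50.35, 75.20, 100.04]

# Hysteresis at each point: difference between down-going and up-going
# readings at the same applied input.
hysteresis = [abs(d - u) for u, d in zip(up_readings, down_readings)]
worst = max(hysteresis)

for p, h in zip(points, hysteresis):
    print(f"{p:3d} % input: hysteresis {h:.3f} % of span")
print(f"maximum hysteresis: {worst:.3f} % of span")
```

In this made-up data set the worst hysteresis (0.25 % of span, at the 50 % point) is what would be reported on the calibration sheet and compared against the instrument's tolerance.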


Reference: chem.libretexts.org

Credits: Tony R. Kuphaldt, under Creative Commons Attribution 4.0 License
