Measurement Technologies: by Hassanain Ghani Hameed
Lecture Five
Calibration
5.1- Introduction
Every measuring instrument is subject to ageing as a
result of mechanical, chemical or thermal stress and
thus delivers measured values that change over time.
This cannot be prevented, but it can be detected in
good time by calibration.
Calibration is the process of adjusting an instrument, or marking its scale, so that its readings agree with an accepted, certified standard. The device against which the comparison is made is called the standard instrument; the instrument of unknown accuracy that is to be calibrated is called the test instrument. Thus, in calibration, the test instrument is compared with the standard instrument.
There are two fundamental methodologies of calibration:
• Direct comparison
• Indirect comparison
5.2 Direct Comparison:
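A direct comparison can be sketched numerically: the test instrument and the standard instrument measure the same quantity at several points across the range, and the differences between their readings are the calibration errors. All readings below are hypothetical.

```python
# Sketch of direct-comparison calibration: a test instrument's readings
# are compared against a standard instrument at several points, and the
# deviation at each point is recorded. All values are hypothetical.

standard_readings = [0.0, 25.0, 50.0, 75.0, 100.0]   # standard instrument
test_readings     = [0.2, 25.4, 50.5, 75.9, 101.1]   # test instrument

# Calibration error of the test instrument at each comparison point:
errors = [t - s for t, s in zip(test_readings, standard_readings)]
max_error = max(abs(e) for e in errors)

print(errors)      # deviation at each calibration point
print(max_error)   # worst-case deviation over the range
```

In practice the standard instrument's own uncertainty must be small compared with the tolerance being verified, which is why calibration chains lead back to certified standards.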
Figure (5-11) below illustrates the confidence interval for the regression line. The interval is represented by the curved lines on either side of the regression line and gives an indication of the range within which the 'true' line might lie. Note that the confidence interval is narrowest near the center of the data (the point (x̄, ȳ)) and widest near the extremes.
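The narrowing of the band at the centroid can be verified numerically. The sketch below fits a least-squares line and evaluates the half-width of the 95% confidence band at two positions; the data and the t-value are illustrative assumptions, not the lecture's own example.

```python
# Minimal sketch of the confidence interval for a fitted calibration line
# (the curved band in Figure 5-11). Data are illustrative assumptions.
import math

x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]      # standard values
y = [0.1, 2.1, 3.9, 6.1, 7.9, 10.1]     # instrument responses

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n
s_xx = sum((xi - x_bar) ** 2 for xi in x)
b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / s_xx  # slope
a = y_bar - b * x_bar                                                # intercept

# Residual standard deviation s_y/x (n - 2 degrees of freedom):
s_yx = math.sqrt(sum((yi - (a + b * xi)) ** 2
                     for xi, yi in zip(x, y)) / (n - 2))

t95 = 2.776  # two-tailed t for 4 degrees of freedom, 95% confidence

def half_width(x0):
    """Half-width of the 95% confidence band for the line at x0."""
    return t95 * s_yx * math.sqrt(1.0 / n + (x0 - x_bar) ** 2 / s_xx)

# The band is narrowest at the centroid (x_bar, y_bar) and widens
# towards the extremes of the calibration range:
print(round(half_width(x_bar), 3))
print(round(half_width(0.0), 3))
```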
In addition, it is possible to calculate a confidence
interval for values predicted using the calibration
function. This is sometimes referred to as the ‘standard
error of prediction’ and is illustrated in Figure (5-12).
The prediction interval gives an estimate of the
uncertainty associated with predicted values of x.
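In the usual linear-regression treatment, the standard error of prediction takes the following standard form (the symbols — b, the slope; s_y/x, the residual standard deviation; N, the number of replicate measurements of the test sample; n, the number of calibration points — are an assumption about the lecture's notation):

```latex
s_{x_0} \;=\; \frac{s_{y/x}}{b}
\sqrt{\frac{1}{N} + \frac{1}{n}
      + \frac{(y_0 - \bar{y})^{2}}{b^{2}\sum_{i}(x_i - \bar{x})^{2}}}
```

The prediction interval for a value x₀ read off the calibration function is then x₀ ± t·s_x0, with t taken at n − 2 degrees of freedom.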
5.4.1.8 Standard error of prediction worked example
The table and Figure below show a set of calibration
data which will be used to illustrate the calculation of a
prediction interval.
The data required to calculate a prediction interval are
shown as follows:
The residual standard deviation is calculated as
s_y/x = √( Σ(yᵢ − ŷᵢ)² / (n − 2) ) = 0.53%,
where ŷᵢ are the responses predicted by the fitted calibration line.
The uncertainty in predicted values can be reduced by increasing the number of replicate measurements (N) made on the test sample. The table below shows how Sx0 decreases as N is increased.
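The effect of N can be reproduced with a short calculation. The sketch below evaluates the standard error of prediction for several replicate counts; the slope, residual standard deviation, and calibration points are illustrative assumptions (the test response is assumed to sit at the centroid, so the (y₀ − ȳ) term vanishes).

```python
# Sketch of how the standard error of prediction s_x0 falls as the number
# of replicate measurements N on the test sample grows. All calibration
# parameters below are illustrative assumptions, not the lecture's data.
import math

n = 6            # number of calibration points
b = 1.989        # slope of the calibration line
s_yx = 0.113     # residual standard deviation
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
x_bar = sum(x) / n
s_xx = sum((xi - x_bar) ** 2 for xi in x)
y0_term = 0.0    # assume y0 = y_bar, so the (y0 - y_bar)^2 term is zero

def s_x0(N):
    """Standard error of a concentration predicted from N replicates."""
    return (s_yx / b) * math.sqrt(1.0 / N + 1.0 / n
                                  + y0_term / (b ** 2 * s_xx))

# Averaging replicates shrinks only the 1/N term, so the gain tails off:
for N in (1, 2, 5, 10):
    print(N, round(s_x0(N), 4))
```

Note the diminishing return: beyond a handful of replicates the 1/n term from the calibration itself dominates, so further replicates of the test sample buy little.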