What does calibration mean?
Calibration in the field of instrumentation means the detection of measurement deviations of an instrument. Calibration does not include any technical intervention on the instrument, such as correcting the zero point, span or linearity. For indicating instruments, the deviation between the indicated value and the correct value is determined.
What does adjust mean?
Adjustment means setting or trimming an instrument to reduce deviations as far as possible or to keep all deviations within a given error limit. Adjustment therefore involves an intervention that modifies the instrument permanently, e.g. by repositioning the indicator needle.
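The distinction can be illustrated with a short sketch (Python, with hypothetical pressure-gauge readings): calibration only records the deviations and leaves the instrument untouched, while adjustment changes the indicated value.

```python
# Hypothetical readings in bar: (reference value, indicated value).
readings = [(0.0, 0.02), (5.0, 5.06), (10.0, 10.09)]

def calibrate(readings):
    """Calibration: detect the deviations only -- no intervention on the instrument."""
    return [indicated - reference for reference, indicated in readings]

def adjust(indicated, zero_offset):
    """Adjustment: intervene to correct the indication (here, a zero-point shift)."""
    return indicated - zero_offset

deviations = calibrate(readings)          # deviations at each test point, approx. [0.02, 0.06, 0.09]
corrected = adjust(10.09, deviations[0])  # indication after removing the zero-point error
```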
What is maximum permissible error?
The extreme value of measurement error, with respect to a known reference quantity value, permitted by specifications or regulations for a given measurement, measuring instrument or measuring system.
What is accuracy class?
A class of measuring instruments or systems that satisfy specified metrological requirements intended to keep measurement errors or instrumental uncertainties within specified limits under given operating conditions.
An accuracy class is usually indicated by a number (e.g. 1 % F.S.) or by a symbol adopted by convention (e.g. Grade A), published in a standard or other technical document.
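The two definitions above can be tied together in a small sketch (the gauge range and readings are assumed values for illustration): for an accuracy class stated as a percentage of full scale (F.S.), the maximum permissible error is the same absolute value over the whole measuring range.

```python
def mpe_from_class(accuracy_class_pct: float, full_scale: float) -> float:
    """Maximum permissible error for a class given in % of full scale (F.S.)."""
    return accuracy_class_pct / 100.0 * full_scale

def within_class(reference: float, indicated: float, mpe: float) -> bool:
    """Check whether the measurement error stays within the permissible limit."""
    return abs(indicated - reference) <= mpe

# Hypothetical gauge: class 1% F.S., range 0..10 bar -> MPE of 0.1 bar everywhere.
mpe = mpe_from_class(1.0, 10.0)
print(within_class(5.0, 5.06, mpe))  # True: a 0.06 bar error is inside the limit
print(within_class(5.0, 5.12, mpe))  # False: a 0.12 bar error exceeds the limit
```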
Why should instruments be calibrated?
For years, the quality system according to DIN EN ISO 9000 has been applied to ensure consistent product quality. This standard requires the verification of all quality-relevant characteristics and the use of traceable calibration equipment. Calibration ensures internationally comparable results and reduces product-liability risks – an important prerequisite for competitiveness in the markets of the future.
How is a calibration interval determined?
To ensure that measurements remain correct over time, test equipment must be monitored and calibrated at regular intervals. The user of the calibration equipment is responsible for determining the calibration interval. Numerous factors must be considered, for example measurement uncertainty, regulations and directives, operating conditions and frequency of use.
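One widely used heuristic for adapting the interval to the instrument's history is the "staircase" method described in guides such as ILAC-G24: extend the interval after an in-tolerance result, shorten it sharply after an out-of-tolerance result. The factors and starting interval below are assumptions chosen for illustration, not prescribed values.

```python
def next_interval(current_months: int, in_tolerance: bool,
                  extend_factor: float = 1.25, shorten_factor: float = 0.5,
                  max_months: int = 24) -> int:
    """Staircase rule: lengthen the interval after a good calibration result,
    shorten it after a bad one, within sensible bounds."""
    if in_tolerance:
        return min(round(current_months * extend_factor), max_months)
    return max(round(current_months * shorten_factor), 1)

interval = 12
interval = next_interval(interval, in_tolerance=True)   # extended to 15 months
interval = next_interval(interval, in_tolerance=False)  # cut back to 8 months
```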
What is the difference between calibration and verification?
Verification is a comparison performed by a legally competent body, and for certain applications it is mandatory. Calibration, on the other hand, is the regular checking of measuring instruments: a traceable standard is used, i.e. one linked to the national standard, the check is carried out by an accredited competent body, and a measurement uncertainty is estimated. Both procedures describe only the checking of the indication, not its adjustment.
What is a standard?
A standard is a material measure or a measuring instrument intended to define, realize, conserve or reproduce a unit or one or more quantity values. A standard pressure gauge is a pressure gauge of high accuracy used to determine the deviation between the pressure values indicated by the instrument under test and those of the standard.
What is a working standard?
It is a standard that is used routinely to calibrate or check material measures, measuring instruments or reference materials. A working standard is normally calibrated against a reference standard. A working standard used routinely to ensure that measurements are carried out correctly is called a check standard.
What is a reference standard?
A reference standard is a standard, generally of the highest metrological quality available at a given location or within a given organization, from which the measurements made at that location are derived.