INTERVIEW QUESTIONS ON CALIBRATION …?

What is calibration..?

      Calibration is the documented comparison of the measurement device to be calibrated against a traceable reference device.

      The reference standard may also be referred to as a “calibrator”.

      Logically, the reference is more accurate than the device to be calibrated.

      The reference device should also be calibrated traceably.

What is Static calibration…?

      It is the process of measuring the static characteristics of an instrument.

      This involves applying a range of known values of static input to the instrument and recording the corresponding output.

      The output data is presented in tabular or graphical form.
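
      As a rough illustration (the input and output values below are invented), a static calibration record can be tabulated in Python as follows:

# Hypothetical static calibration data: known input values applied to the
# instrument and the corresponding recorded outputs (units are illustrative).
known_inputs = [0.0, 25.0, 50.0, 75.0, 100.0]     # e.g. applied pressure in kPa
recorded_outputs = [0.1, 25.3, 50.2, 74.8, 99.7]  # instrument readings in kPa

print(f"{'Input':>10} {'Output':>10} {'Deviation':>10}")
for x, y in zip(known_inputs, recorded_outputs):
    print(f"{x:>10.2f} {y:>10.2f} {y - x:>10.2f}")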

What is SI system…?

      The SI system is the international system of units that specifies the basic units used in measurement science.

      The SI system defines 7 base units (meter, kilogram, second, ampere, kelvin, mole and candela) and 22 named derived units.

      The base units are defined in terms of constants of nature.

What is Calibration certificate?

      The definition of calibration includes the word “documented”; this means that the calibration comparison must be recorded. This document is typically called a Calibration Certificate.

      A calibration certificate includes the result of the comparison and all other relevant information of the calibration, such as equipment used, environmental conditions, signatories, date of calibration, certificate number, uncertainty of the calibration, etc.

What is Traceability…?

      It was mentioned that the reference standard that is used in calibration must be traceable. This traceability means that the reference standard must have also been calibrated using an even higher-level standard.

      The traceability should be an unbroken chain of calibrations, with the highest-level calibration performed at a national calibration centre or equivalent.

      So, for example, you may calibrate your Vernier calliper with a calibrator. The calibrator you used should itself have been calibrated using a more accurate reference standard, and that reference standard should be calibrated with an even higher-level standard or sent out to an accredited or national calibration centre for calibration.

      If the traceability chain is broken at any point, any measurement below that cannot be considered reliable.

What is Uncertainty…?

      When you calibrate an instrument against a higher-level device, the process always includes some uncertainty.

      Uncertainty means the amount of “doubt” in the calibration process, so it tells how “good” the calibration process was.

      Uncertainty can be caused by various sources, such as the device under test, the reference standard, calibration method or environmental conditions.

      In the worst case, if the uncertainty of the calibration process is larger than the accuracy or tolerance level of the device under calibration, then calibration does not make much sense.

      The aim is that the total uncertainty of calibration should be small enough compared to the tolerance limit of the device under calibration.

      The total uncertainty of the calibration should always be documented in the calibration certificate.
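
      As a simple sketch (the component values are invented and the root-sum-square combination shown is only one common simplification), a combined calibration uncertainty might be estimated like this:

import math

# Hypothetical uncertainty components of a calibration, all expressed as
# standard uncertainties in the same unit as the measurement (e.g. kPa).
components = {
    "reference standard": 0.02,
    "device under test repeatability": 0.05,
    "environmental conditions": 0.01,
    "calibration method": 0.03,
}

# Combine independent components by root-sum-square.
combined = math.sqrt(sum(u ** 2 for u in components.values()))

# Expanded uncertainty with coverage factor k = 2 (roughly 95 % confidence).
expanded = 2 * combined
print(f"Combined standard uncertainty: {combined:.3f}")
print(f"Expanded uncertainty (k = 2):  {expanded:.3f}")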

What is Range in calibration…?

      Range is defined by the upper and lower limits of the values that the instrument can measure.

      The instrument is designed so that its characteristics do not change within this range during operation.

What is Sensitivity…?

      Sensitivity is defined as the change in output signal relative to the change in input signal at an operating point.

      Sensitivity may be constant over the range of the input signal or it can vary.

      Instruments having constant sensitivity over the range are called “linear”.
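
      As a minimal sketch (the readings are made up), sensitivity at an operating point is simply the ratio of the change in output to the change in input:

# Hypothetical readings around an operating point of a temperature sensor:
# the input change is in deg C and the output change in mV (illustrative values).
input_change = 10.0    # change in the measured quantity (deg C)
output_change = 4.1    # corresponding change in the instrument output (mV)

sensitivity = output_change / input_change   # mV per deg C
print(f"Sensitivity: {sensitivity:.2f} mV/degC")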

What is Resolution…?

      Resolution is defined as the smallest change in the input signal that produces a readable change in the output of the measuring system at its operating point.

What is Threshold…?

      The threshold of an instrument is the minimum input for which there is a detectable output.

      Below this minimum input, the instrument reads zero.

What is Zero of an instruments…?

      The zero of an instrument refers to a selected datum.

      The output of an instrument is adjusted to read zero at a predefined point in the measured range.

      For example, output of the Celsius thermometer at freezing point of water is zero or output of the pressure gauge at atmospheric pressure is zero.

What is Zero Drift…?

      Zero drift is the change in output from its set zero value over a specified period of time.

      Zero drift occurs due to changes in ambient conditions, changes in electrical conditions, aging of components or mechanical damage.

What is Creep…?

      Creep is the change in output occurring over a specified period of time while the measured value is held constant at a value other than zero and all environmental conditions are held constant.

What is Accuracy…?

      Accuracy is the maximum amount of difference between the measured value and the actual true value.

      It is usually expressed as a percentage.

      In other words, if the readings obtained from multiple measurements of the same quantity are all close to the true value, the instrument is said to be accurate.
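
      As a rough sketch (the readings and true value are invented), accuracy can be expressed as the maximum deviation from the true value, stated as a percentage:

# Hypothetical repeated readings of a quantity whose true value is known.
true_value = 100.0
readings = [99.2, 100.6, 99.8, 100.9, 99.5]

# Accuracy as the maximum deviation from the true value, in percent of reading.
max_error = max(abs(r - true_value) for r in readings)
accuracy_percent = 100.0 * max_error / true_value
print(f"Maximum error: {max_error:.2f} (~{accuracy_percent:.2f} % of reading)")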

What is Precision…?

      Precision is the difference between a measured value and the best estimate of the true value of the measured variable.

      It is a measure of repeatability.

      Precise results may lie away from the true value, but all measurements of a single variable are close to each other.
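
      A minimal sketch (with invented readings) that quantifies precision as the spread of repeated measurements about their mean:

import statistics

# Hypothetical repeated readings of the same quantity.
readings = [101.8, 101.9, 102.1, 101.7, 102.0]

mean = statistics.mean(readings)
spread = statistics.stdev(readings)   # sample standard deviation as a repeatability measure
print(f"Mean reading: {mean:.2f}, repeatability (std dev): {spread:.2f}")
# Note: these readings are precise (small spread) even if the mean lies far from the true value.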

What is Linearity…?

      Linearity describes the maximum deviation of the output of an instrument from a best-fitting straight line through the calibration data.

      Most instruments are designed so that the output is a linear function of the input.
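
      A minimal sketch (with made-up calibration data) of evaluating linearity as the maximum deviation of the output from a least-squares best-fit line:

# Hypothetical calibration data: known inputs and recorded outputs.
inputs = [0.0, 25.0, 50.0, 75.0, 100.0]
outputs = [0.2, 24.6, 50.5, 75.9, 99.8]

# Least-squares best-fit straight line through the calibration points.
n = len(inputs)
mean_x = sum(inputs) / n
mean_y = sum(outputs) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(inputs, outputs))
         / sum((x - mean_x) ** 2 for x in inputs))
intercept = mean_y - slope * mean_x

# Maximum deviation from the best-fit line, as a percentage of the output span.
max_dev = max(abs(y - (slope * x + intercept)) for x, y in zip(inputs, outputs))
output_span = max(outputs) - min(outputs)
print(f"Non-linearity: {100.0 * max_dev / output_span:.2f} % of span")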

What is Hysteresis…?

      Hysteresis is the maximum difference in output at any measured value within the specified range, when the value is approached first with increasing and then decreasing measurand.

      Hysteresis is typically caused by a lag in the action of the sensing element of the transducer.

      Hysteresis is expressed as a percentage of the output.
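
      A minimal sketch (readings invented) of estimating hysteresis as the maximum difference between upscale and downscale outputs at the same input values:

# Hypothetical outputs recorded at the same inputs, first with increasing and
# then with decreasing measurand.
inputs = [0, 25, 50, 75, 100]
upscale = [0.1, 24.8, 49.7, 74.9, 99.9]     # readings with increasing input
downscale = [0.4, 25.5, 50.6, 75.4, 100.0]  # readings with decreasing input

hysteresis = max(abs(u - d) for u, d in zip(upscale, downscale))
full_scale = 100.0
print(f"Hysteresis: {hysteresis:.2f} (~{100.0 * hysteresis / full_scale:.2f} % of full scale)")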

What is Error band…?

      It is the band of maximum deviation of output values from the specified reference line or curve.

      A static error band is obtained by static calibration.

      It is determined on the basis of the maximum deviation observed over at least two consecutive calibration cycles, so as to include repeatability.

      The error band accounts for deviations that may be due to nonlinearity, non-repeatability, hysteresis, zero shift, etc.
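
      A minimal sketch (data invented) of an error band taken as the maximum deviation of the output from a specified reference line over two calibration cycles:

# Hypothetical outputs from two consecutive calibration cycles at the same inputs.
inputs = [0, 25, 50, 75, 100]
cycle_1 = [0.3, 25.4, 49.6, 75.5, 99.6]
cycle_2 = [0.2, 24.7, 50.5, 74.8, 100.3]

def reference(x):
    # Specified reference line: an ideal 1:1 response is assumed here.
    return x

# Maximum deviation over both cycles, so repeatability is included.
deviations = [abs(y - reference(x))
              for cycle in (cycle_1, cycle_2)
              for x, y in zip(inputs, cycle)]
print(f"Error band: +/- {max(deviations):.2f} (in output units)")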

 

