Metrology and Measurement
Syllabus
General concepts: generalized measurement system; units and standards; measuring instruments; sensitivity, readability, range of accuracy, precision; static and dynamic response; repeatability; systematic and random errors; correction, calibration, interchangeability.
Definition
Metrology is the science of pure measurement. Engineering metrology is restricted to measurements of length and angle. Measurement is defined as the process of numerical evaluation of a dimension, or the process of comparison with standard measuring instruments.
Need of Measurement
- Establish standards
- Interchangeability
- Customer satisfaction
- Validate the design
- Convert a physical parameter into a meaningful number
- Obtain the true dimension
- Evaluate performance
Methods of Measurement
- Direct method
- Indirect method
- Comparative method
- Coincidence method
- Fundamental method
- Contact method
- Complementary method
- Deflection method
Direct method
Measurements are directly obtained
Ex: Vernier Caliper, Scales
Indirect method
Obtained by measuring other quantities
Ex: Mass = Length x Breadth x Height x Density (the mass is obtained by measuring the dimensions and using the known density)
Comparative Method
The measured value is compared with another known value
Ex: Comparators
Coincidence method: measurements coincide with certain lines and signals.
Fundamental method: measuring a quantity directly in accordance with the definition of that quantity.
Contact method: the sensor or measuring tip touches the surface being measured.
Complementary method: the value of the quantity to be measured is combined with a known value of the same quantity.
Ex:Volume determination by liquid displacement
Deflection method
The value to be measured is directly indicated by the deflection of a pointer
Ex: Pressure Measurement
[Block diagram of the generalized measurement system: data processing element → data presentation element → observer]
Fundamental units

Physical Quantity      Unit Name                  Symbol
length                 meter                      m
mass                   kilogram                   kg
time                   second                     s
electric current       ampere                     A
temperature            kelvin                     K
amount of substance    mole                       mol
luminous intensity     candela                    cd

Derived units

Physical Quantity      Unit Name                  Symbol
area                   square meter               m2
volume                 cubic meter                m3
speed                  meter per second           m/s
acceleration           meter per second squared   m/s2
weight, force          newton                     N
pressure               pascal                     Pa
energy, work           joule                      J
Supplementary units
Physical Quantity      Unit Name                  Symbol
plane angle            radian                     rad
solid angle            steradian                  sr
Standards
- International standards
- Primary standards
- Secondary standards
- Working standards
International
- International Organization of Legal Metrology (OIML), Paris
- International Bureau of Weights and Measures (BIPM), Sevres, France
India
National Physical Laboratory, Dr. K.S. Krishnan Marg, New Delhi - 110012, India. Phone: 91-11-45609212, Fax: 91-11-45609310, Email: [email protected] or [email protected]
Measuring Instruments
- Deflection and null type instruments
- Analog and digital instruments
- Active and passive instruments
- Automatic and manually operated instruments
- Contacting and non-contacting instruments
- Absolute and secondary instruments
- Intelligent instruments
Intelligent instruments
Microprocessors are incorporated into measuring instruments
Help topics
http://www.tresnainstrument.com/education.html
Definition
Sensitivity: the ratio of the magnitude of the response (output signal) to the magnitude of the quantity being measured (input signal).
Readability: the closeness with which the scale of an analog instrument can be read.
Definition
Range of accuracy: accuracy of a measuring instrument is the closeness of agreement between the measured value and the true value of the measured quantity.
Precision: the ability of the instrument to reproduce the same reading for repeated measurements of the same quantity.
Sensitivity
If the calibration curve is linear, the sensitivity of the instrument is the slope of the calibration curve. If the calibration curve is not linear, the sensitivity varies with the input.
Sensitivity
This is the relationship between a change in the output reading for a given change of the input. (This relationship may be linear or non-linear.)
Sensitivity is often known as scale factor or instrument magnification and an instrument with a large sensitivity (scale factor) will indicate a large movement of the indicator for a small input change.
Example: a load cell whose output Vo (V) increases linearly with the input force Fi (kN); the slope of the calibration curve is 5 V/kN.
Block diagram: Input F (kN) → [K] → Output Vo (V), with sensitivity K = 5 V/kN.
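The load-cell sensitivity above can be estimated numerically as the slope of the calibration curve. The sketch below is not from the slides: the calibration points are illustrative values chosen to be consistent with K = 5 V/kN, and the slope is found by least squares.

```python
def sensitivity(inputs, outputs):
    """Least-squares slope of output vs. input (assumes a linear calibration curve)."""
    n = len(inputs)
    mean_x = sum(inputs) / n
    mean_y = sum(outputs) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(inputs, outputs))
    den = sum((x - mean_x) ** 2 for x in inputs)
    return num / den

# Illustrative calibration data for the 5 V/kN load cell
force_kN = [0.0, 1.0, 2.0, 3.0, 4.0]     # input Fi
volts    = [0.0, 5.0, 10.0, 15.0, 20.0]  # output Vo
K = sensitivity(force_kN, volts)         # slope = 5.0 V/kN
```

For a nonlinear calibration curve the same fit would give only an average sensitivity; the local slope would have to be evaluated at each operating point.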
Example
(1) A 0.01 Ω/A meter with 5 A fsd: Rm = 0.01 Ω/A x 5 A = 0.05 Ω, so the maximum voltage across the meter will be 5 A x 0.05 Ω = 0.25 V at fsd. (2) A 0.1 Ω/A meter with 5 A fsd will drop 2.5 V (i.e., it is 10 times less sensitive), which may bias the results.
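The arithmetic in the example above can be checked with a short sketch. The helper name `meter_drop` is ours, not from the slides; it simply applies Rm = (Ω/A rating) x fsd and V = I x R.

```python
def meter_drop(ohms_per_amp, fsd_amps):
    """Voltage dropped across an ammeter of given ohm/A rating at full-scale deflection."""
    r_m = ohms_per_amp * fsd_amps  # meter resistance, ohms
    return fsd_amps * r_m          # V = I * R at fsd

v1 = meter_drop(0.01, 5)  # 0.25 V
v2 = meter_drop(0.1, 5)   # 2.5 V, ten times the drop
```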
Readability
Readability is defined as the ease with which readings may be taken from an instrument.
Accuracy
Accuracy is the extent to which a measured value agrees with the true value. The difference between the measured value and the true value is known as the error of measurement. Accuracy is the quality of conformity.
Example: Accuracy
Who is more accurate when measuring a book that has a true length of 17.0 cm?
A: 17.0 cm, 16.0 cm, 18.0 cm, 15.0 cm
B: 15.5 cm, 15.0 cm, 15.2 cm, 15.3 cm
Precision
The precision of a measurement depends on the instrument used to measure it.
Example: Precision
Who is more precise when measuring the same 17.0 cm book?
A: 17.0 cm, 16.0 cm, 18.0 cm, 15.0 cm
B: 15.5 cm, 15.0 cm, 15.2 cm, 15.3 cm
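The two examples above can be made quantitative: accuracy relates to how close the mean of the readings is to the true value, while precision relates to the spread of the readings. A minimal sketch using the slide data:

```python
from statistics import mean, pstdev

true_length = 17.0
A = [17.0, 16.0, 18.0, 15.0]
B = [15.5, 15.0, 15.2, 15.3]

bias_A = abs(mean(A) - true_length)  # 0.5 cm  -> A is more accurate
bias_B = abs(mean(B) - true_length)  # 1.75 cm
spread_A = pstdev(A)                 # about 1.12 cm
spread_B = pstdev(B)                 # about 0.18 cm -> B is more precise
```

This reproduces the intended answer: observer A is more accurate (smaller bias), observer B is more precise (smaller spread).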
Uncertainty
The word uncertainty casts doubt on the exactness of the measurement result. True value = Estimated value ± Uncertainty
Reading a Meterstick
[Figure: reading a section of a meterstick between the 2 cm and 4 cm marks.]
First digit (known) = 2; second digit (known) = 0.7; the third digit must be estimated, so the length is reported as, e.g., 2.76 cm or 2.78 cm.
Performance of Instruments
All instrumentation systems are characterized by the system characteristics or system response. There are two basic characteristics of measuring instruments:
- Static characteristics
- Dynamic characteristics
Static Characteristics
The set of criteria defined for instruments that are used to measure quantities which vary slowly with time, or are mostly constant, is called the static characteristics.
- Precision
- Sensitivity
- Resolution
- Threshold
- Drift
- Backlash
- True value
- Hysteresis
- Linearity
- Range or span
- Error
- Repeatability
- Bias
- Tolerance
- Reproducibility
- Stability
Resolution
This is defined as the smallest input increment change that gives some small but definite numerical change in the output.
Threshold
The minimum value of input below which no output appears is known as the threshold of the instrument.
Drift
Drift or Zero drift is variation in the output of an instrument which is not caused by any change in the input; it is commonly caused by internal temperature changes and component instability.
Sensitivity drift defines the amount by which an instrument's sensitivity varies as ambient conditions change.
Error: the deviation of the measured value from the true value.
Repeatability: the closeness of agreement among repeated outputs for the same input under the same operating conditions.
Reproducibility: the closeness of agreement among outputs for the same input when the conditions of measurement change (e.g., different observer, instrument, or time).
Range
The range is the total span of values which an instrument is capable of measuring.
Hysteresis
This is the algebraic difference between the average errors at corresponding points of measurement when approached from opposite directions, i.e. increasing as opposed to decreasing values of the input.
Zero stability
The ability of the instrument to return to a zero reading after the measurand has returned to zero
Dead band
This is the range of different input values over which there is no change in output value.
Linearity: the closeness of the instrument's calibration curve to a straight line, often expressed as the maximum deviation of the curve from the ideal straight line as a percentage of full-scale reading.
Dynamic Characteristics
The set of criteria defined for instruments that measure quantities which change rapidly with time is called the dynamic characteristics.
Dynamic Characteristics
- Steady-state periodic response
- Transient response
- Speed of response
- Measuring lag
- Fidelity
- Dynamic error
Measuring lag
Retardation type: the response begins immediately after the change in the measured quantity.
Time delay type: the response begins after a dead time following the application of the input.
Fidelity: the degree to which a measurement system indicates changes in the measured quantity without error.
Dynamic error: the difference between the true value of a quantity changing with time and the value indicated by the measurement system.
Errors in Instruments
Error = True value - Measured value, or Error = Measured value - True value (depending on the sign convention adopted)
Types of Errors
- Error of measurement
- Instrumental error
- Error of observation
- Based on the nature of errors
- Based on control
Error of Measurement
Systematic error: varies in a predictable way as conditions change.
Random error: varies in an unpredictable manner.
Parasitic error: caused by incorrect execution of the measurement.
Instrumental error
- Error of a physical measure
- Error of a measuring mechanism
- Error of indication of a measuring instrument
- Error due to temperature
Error of observation
- Reading error
- Parallax error
- Interpolation error
Nature of Errors
Systematic error Random error
Based on control
Controllable errors
- Calibration errors
- Environmental (ambient/atmospheric condition) errors
- Stylus pressure errors
- Avoidable errors
Correction
Correction is defined as a value which is added algebraically to the uncorrected result of the measurement to compensate for an assumed systematic error. Ex: zero-error correction in a vernier caliper or micrometer.
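The correction rule above can be sketched as follows. The zero-error value and the reading are illustrative numbers, not from the slides; the correction is the negative of the assumed systematic (zero) error, added algebraically to the uncorrected reading.

```python
def corrected(reading, correction):
    """Corrected result = uncorrected reading + correction (algebraic addition)."""
    return reading + correction

zero_error = +0.02        # mm, illustrative: instrument reads 0.02 mm high
correction = -zero_error  # correction is the negative of the systematic error
value = corrected(25.46, correction)  # 25.44 mm
```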
Calibration
Calibration is the process of determining and adjusting an instrument's accuracy to make sure it is within the manufacturer's specifications.
Interchangeability
A part which can be substituted for a component manufactured to the same shape and dimensions is known as an interchangeable part. The operation of substituting a part for similar manufactured components of the same shape and dimensions is known as interchangeability.
Compiled by
D.Vasanth Kumar Assistant Professor Department of Mechanical Engineering Jansons Institute of Technology