Measurement Errors: Systematic Errors (Controllable Errors) and Random Errors

This document discusses types of errors in measurement and calibration procedures. It describes two categories of measurement error: 1) systematic errors, which are controllable and include calibration errors, errors due to ambient conditions, and stylus pressure; and 2) random errors, which occur randomly and whose magnitude cannot be predicted. The document then provides details on the calibration process, sources of error, and calibration procedures for vernier calipers and micrometers.


Errors in measurements:

Measurement error:
Measurement error is the difference between the indicated value and the true value of the measurement. Several types of errors may arise, and they can be classified into two categories:
1. Systematic errors (controllable errors)
2. Random errors
1. Systematic errors
These are controllable in both their magnitude and sense. They can be determined and reduced if attempts are made to analyse them, and for this reason they are also known as controllable errors. They can be due to:
1. Calibration errors
2. Errors due to ambient conditions
3. Stylus pressure
4. Avoidable errors
1. Calibration errors
The actual length of standards such as slip gauges and engraved scales will vary from the nominal value by a small amount.

2. Ambient conditions
Variations in the ambient conditions from the internationally agreed standard values of temperature (20 °C), barometric pressure (760 mm of mercury), etc. can give rise to errors in the measured size of the component. Temperature is by far the most significant of these ambient conditions, and due correction is needed to obtain error-free results.
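As a rough illustration of such a correction, the sketch below scales a reading back to 20 °C using the coefficient of linear expansion; the coefficient value and the readings are assumptions for illustration, not figures from this document.

```python
# A minimal sketch of a temperature correction back to the 20 degC reference.
# The expansion coefficient and readings below are illustrative assumptions.

def correct_to_20c(measured_mm, temp_c, alpha_per_c=11.5e-6):
    """Estimate the length the workpiece would show at 20 degC.

    measured_mm -- length indicated at the ambient temperature temp_c
    alpha_per_c -- coefficient of linear expansion (a typical value for steel)
    """
    return measured_mm / (1 + alpha_per_c * (temp_c - 20.0))

# A 100 mm steel part measured in a 27 degC room reads slightly long.
reading = 100.008  # mm, indicated value at 27 degC (assumed)
print(f"corrected length at 20 degC: {correct_to_20c(reading, 27.0):.4f} mm")
```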
3. Stylus pressure
The error induced by stylus pressure is also appreciable. Whenever a component is measured under a definite stylus pressure, both deformation of the workpiece surface and deflection of the workpiece shape will occur.
4. Avoidable errors
These errors include the error due to parallax and the effect of misalignment of the workpiece centers. Instrument location errors, such as placing a thermometer in sunlight when attempting to measure air temperature, also belong to this category.
Random Errors
These occur randomly, and their specific causes cannot be pinned down, e.g. errors due to slight displacement of lever joints, fluctuations in friction in the instrument, and operator errors in reading the scale. Random errors are incidental, and their magnitude and sign cannot be predicted. From the above, it is clear that systematic errors are those which are repeated consistently with repetition of the experiment, whereas random errors are accidental and their magnitude and sign cannot be predicted from knowledge of the measuring system and the conditions of measurement.
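The distinction can be illustrated with a small simulation: a constant bias plays the role of a systematic error and repeats identically in every trial, while added noise plays the role of the random error whose sign and magnitude cannot be predicted. All numbers are assumed for illustration.

```python
import random

TRUE_VALUE = 25.000   # mm, assumed true size of the part
BIAS = 0.012          # mm, systematic error: same sign and size in every trial
NOISE_SD = 0.004      # mm, standard deviation of the random error

random.seed(1)
readings = [TRUE_VALUE + BIAS + random.gauss(0, NOISE_SD) for _ in range(10)]

mean_reading = sum(readings) / len(readings)
print(f"mean reading    : {mean_reading:.4f} mm")
print(f"systematic part : {mean_reading - TRUE_VALUE:+.4f} mm (close to the 0.012 mm bias)")
print(f"random scatter  : {max(readings) - min(readings):.4f} mm (varies from run to run)")
```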
COMPARISON BETWEEN SYSTEMATIC ERRORS & RANDOM ERRORS

Systematic errors:
1. These errors are repetitive in nature and of constant, similar form.
2. They result from improper conditions or procedures that are consistent in action.
3. Except for personal errors, all other systematic errors can be controlled in magnitude and sense.
4. If properly analysed, they can be determined and reduced or eliminated.
5. They include calibration errors, variation in atmospheric conditions, parallax errors, misalignment errors, etc.

Random errors:
1. These are non-consistent; the sources giving rise to such errors are random.
2. Such errors are inherent in the measuring system or measuring instruments.
3. The specific causes, magnitudes and sense of these errors cannot be determined from knowledge of the measuring system or conditions.
4. These errors cannot be eliminated, but the result obtained can be corrected.
5. They include errors caused by variation in the position of the setting standard and workpiece, displacement of lever joints of instruments, backlash, friction, etc.
CALIBRATION
Every measuring instrument must be provable, i.e. it must be possible to prove that it measures reliably. The procedure for this is calibration. In order to maintain the precision and accuracy of a measuring device, its periodic calibration is essential, because the moment an instrument is put into use it begins to deteriorate in accuracy. It is also important that the calibration standard for the system should be at least one order of magnitude more accurate than the desired measurement system accuracy, i.e. a ratio of 1:10.
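A quick check of that 1:10 rule might look like the sketch below; the figures used are assumptions for illustration.

```python
def meets_accuracy_ratio(standard_uncertainty_mm, required_accuracy_mm, ratio=10):
    """True if the calibration standard is at least `ratio` times more
    accurate than the accuracy demanded of the instrument under test."""
    return standard_uncertainty_mm * ratio <= required_accuracy_mm

# Example: an instrument that must be good to 0.01 mm, checked against
# slip gauges certified to 0.0005 mm (figures assumed for illustration).
print(meets_accuracy_ratio(0.0005, 0.01))   # True: 0.0005 * 10 = 0.005 <= 0.01
```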

Calibration is the process of determining the performance parameters of an instrument or system by comparing it with measurement standards. Adjustment may be a part of a calibration, but not necessarily. A calibration assures that a device or system will produce results which meet or exceed some defined criteria with a specified degree of confidence.
BASIC CALIBRATION PROCESS
The calibration process begins with the design of the measuring instrument that
needs to be calibrated. The design has to be able to "hold a calibration"
through its calibration interval.
In other words, the design has to be capable of measurements that are "within
engineering tolerance" when used within the stated environmental conditions
over some reasonable period of time. Having a design with these characteristics
increases the likelihood of the actual measuring instruments performing as
expected.
The exact mechanism for assigning tolerance values varies by country and
industry type. The measuring equipment manufacturer generally assigns the
measurement tolerance, suggests a calibration interval and specifies the
environmental range of use and storage. The using organization generally
assigns the actual calibration interval, which is dependent on this specific
measuring equipment's likely usage level. A very common interval for 8–12 hours of use, 5 days per week, is six months; the same instrument in 24/7 usage would generally get a shorter interval, as in the sketch below. The assignment of calibration intervals can be a formal process based on the results of previous calibrations.
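One simple way an organization might scale the interval with usage is sketched below. The six-month baseline is taken from the text above; the proportional scaling rule itself is only an assumed policy for illustration, not a prescribed method.

```python
def calibration_interval_months(hours_per_week, baseline_hours=50, baseline_months=6):
    """Scale the calibration interval down as usage goes up.

    Baseline: roughly 8-12 hours a day, 5 days a week (~50 h/week) -> 6 months,
    as quoted in the text. The proportional scaling is an assumed policy.
    """
    interval = baseline_months * baseline_hours / hours_per_week
    return max(1, round(interval))  # never shorter than one month

print(calibration_interval_months(50))    # 6 months for single-shift use
print(calibration_interval_months(168))   # 2 months for 24/7 use
```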
INSTRUMENT CALIBRATION
Calibration can be called for:
with a new instrument
when a specified time period has elapsed
when a specified usage (operating hours) has elapsed
when an instrument has had a shock or vibration which may potentially have put it out of calibration
whenever observations appear questionable
CALIBRATION PROCEDURE
In order to maintain the accuracy of the measuring
instrument the following procedures should be followed
1. Each instrument should be numbered
2. A card record should be established for each instrument
3. Checking interval should be established
4. A system should be adopted to ensure adherence to the checking schedule
5. A record of the findings of each check should be maintained
6. The records of the checks should be further studied and analysed to improve the system (a record-card sketch follows below)
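A minimal record-card structure covering these six points might look like the following sketch; the field names and example entries are assumptions for illustration.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class InstrumentRecord:
    """Card record for one numbered instrument (points 1-5 above)."""
    number: str                     # 1. instrument number
    description: str
    interval_days: int              # 3. checking interval
    last_checked: date = None
    findings: list = field(default_factory=list)   # 5. findings of each check

    def next_check_due(self):       # 4. helps enforce the checking schedule
        return self.last_checked + timedelta(days=self.interval_days)

    def log_check(self, when, result):
        self.last_checked = when
        self.findings.append((when, result))        # kept for later analysis (6)

card = InstrumentRecord("VC-014", "Vernier caliper 0-150 mm, LC 0.02 mm", 180)
card.log_check(date(2024, 1, 10), "zero error nil; all parameters within limits")
print(card.next_check_due())    # 2024-07-08
```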
CALIBRATION OF VERNIER CALIPER

Parameter                        Permissible error (mm)
                                 LC 0.02 mm     LC 0.05 mm
Zero error                       0.02           0.05
Flatness of measuring jaws       0.003          0.004
Parallelism of measuring jaws    0.01           0.015
Error in reading                 0.02           0.05
The zero error is checked by bringing the jaws into contact and observing the shift of the zero of the main scale with respect to the zero of the vernier scale.
The flatness of the measuring jaws is checked using a straight edge of class 1 accuracy. The straight edge is placed over the surfaces and the light gap is observed.
The parallelism is checked by inserting a slip gauge of any value between the jaws at various positions and determining the out-of-parallelism using slip gauges.
The error in readings along the entire range is also found using slip gauges, as in the sketch below.
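That reading check can be expressed as a short sketch: the gauge sizes and observed readings below are assumed values, and the 0.02 mm limit is the error-in-reading figure for a 0.02 mm least-count caliper from the table above.

```python
PERMISSIBLE_ERROR_MM = 0.02   # error-in-reading limit for a 0.02 mm least-count caliper

# (slip gauge size, caliper reading) pairs covering the range -- assumed values
checks = [(10.000, 10.01), (25.000, 25.00), (50.000, 49.99), (100.000, 100.04)]

for nominal, reading in checks:
    error = reading - nominal
    verdict = "OK" if abs(error) <= PERMISSIBLE_ERROR_MM else "OUT OF LIMIT"
    print(f"{nominal:7.3f} mm gauge: error {error:+.3f} mm -> {verdict}")
```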
CALIBRATION OF MICROMETER
In case of Micrometers the following are the main points to be checked
1. General appearance & relative movement of moving parts
2. Checking initial zero setting for micrometer
3. Flatness of measuring surfaces
4. Parallelism of measuring surfaces
5. Error at different positions
In checking the general appearance, the micrometer is examined thoroughly for scratches, dents, etc. on the measuring jaws, for corrosion marks, scratches and dents on the surfaces of the measuring drums, and for proper working of the ratchet system.
The relative movement of the moving parts is also checked for smoothness.
The working of the lock system is also checked.
The zero error of the micrometer is checked and, if found wrong, it is adjusted; this is easily done for micrometers of size 25-50 mm and above. The size of the setting piece is checked on an interferometer or any other comparator reading to 0.0001 mm. The permissible error allowed in its size is 0.001 mm for micrometers up to a size of 100 mm.

The flatness error is checked by placing an optical flat on each jaw. The maximum permissible error is 0.0009 mm.

The parallelism error is also checked using an optical flat. The permissible error in parallelism is 0.002 mm for micrometers up to 100 mm size and 0.004 mm for micrometers above 100 mm and up to 200 mm size.

The error in readings is checked using slip gauges so as to cover the entire range. The maximum permissible error is 0.004 mm for class 1 micrometers and 0.008 mm for class 2 micrometers.
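The same kind of check applies to the micrometer; the sketch below compares assumed readings against the class limits quoted above.

```python
CLASS_LIMITS_MM = {1: 0.004, 2: 0.008}   # max permissible error in reading, by class

def within_limits(readings, micrometer_class):
    """readings: list of (slip gauge size, micrometer reading) pairs in mm."""
    limit = CLASS_LIMITS_MM[micrometer_class]
    return all(abs(reading - nominal) <= limit for nominal, reading in readings)

# Assumed readings taken across the range of a 0-25 mm micrometer.
sample = [(5.000, 5.002), (12.500, 12.497), (20.000, 20.003), (25.000, 25.005)]
print(within_limits(sample, 1))   # False: the 25 mm point is 0.005 mm off, over 0.004
print(within_limits(sample, 2))   # True: every error is within 0.008 mm
```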
