MMM Unit 1
Definition:
Measurement is the comparison of an unknown quantity with a known, fixed quantity of the same kind. The value obtained on measuring a quantity is called its magnitude, and it is expressed as a number together with its unit. Measurement helps us make quantitative statements about how big or how small things are; without measurement, the final product will be full of errors. Weight, temperature, length and even time are measured quantities, and each plays a very important role in mechanical measurement.
[Block diagram of the generalized measurement system: the true value of the input variable enters the sensor; the signal processor conditions the signal; and the output stage displays, records, or transmits the measured value of the input variable.]
First, or Sensor-Transducer Stage: The prime function of the first stage is to detect or sense the input value while remaining insensitive to every other possible input. This element of the system is effectively in contact with the process whose variable is being measured; it gives an output that depends in some way on the value of that variable and that can be used by the rest of the measurement system to assign a value to it.
Second, or Signal-Conditioning Stage: The purpose of the second stage is to modify the transduced information so that it is acceptable to the third, or terminating, stage. It may also perform one or more basic operations, such as selective filtering, integration, differentiation, or telemetering, as required. Probably the most common function of the second stage is to increase the amplitude or the power of the signal, or both, to the level required to drive the final terminating device. In addition, it must be designed for proper matching of characteristics between the first and second stages and between the second and third stages.
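To make the three stages concrete, here is a minimal Python sketch of such a system; the 10 mV/degC sensor model, the amplifier gain of 100, and the input value are illustrative assumptions, not properties of any particular instrument.

def sensor(true_temperature_c):
    """First stage: transduce temperature (degC) into a small voltage (V).
    Assumes a hypothetical sensor producing 10 mV per degC."""
    return 0.010 * true_temperature_c

def signal_conditioner(raw_volts, gain=100.0):
    """Second stage: amplify the weak sensor signal to a usable level."""
    return gain * raw_volts

def terminating_stage(conditioned_volts):
    """Third stage: present the measured value (display, record or transmit).
    Here it is converted back to degC and printed."""
    measured_c = conditioned_volts / (0.010 * 100.0)  # undo sensor scale and gain
    print(f"Measured value: {measured_c:.2f} degC")
    return measured_c

true_value = 25.3   # true value of the input variable
terminating_stage(signal_conditioner(sensor(true_value)))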
Static Characteristics of Instruments:
1. Accuracy
Accuracy is perhaps the most important fundamental static characteristic of an instrument. It refers to how closely the instrument's measurement corresponds to the true or actual value of the quantity being measured. In other words, accuracy tells us how "correct" the instrument's readings are.
2. Precision
Precision measures the repeatability of an instrument's measurements. An instrument can be precise without being accurate: it consistently produces the same result, even if that result is not accurate. Precision is crucial when consistency is more critical than accuracy.
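The distinction can be made concrete in a few lines of Python; the reference value and the readings below are made-up numbers, chosen so that the instrument is precise (small spread) but not accurate (large bias).

from statistics import mean, stdev

true_value = 10.0                           # known reference value
readings = [9.81, 9.79, 9.80, 9.82, 9.80]   # highly repeatable readings

bias = mean(readings) - true_value          # accuracy: closeness to the truth
spread = stdev(readings)                    # precision: repeatability

print(f"mean reading       = {mean(readings):.3f}")
print(f"bias (inaccuracy)  = {bias:+.3f}")   # about -0.20: not accurate
print(f"standard deviation = {spread:.4f}")  # small: precise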
3. Sensitivity
Sensitivity is a measure of an instrument's ability to detect small changes in the quantity being measured. High-sensitivity instruments can detect even the slightest variations, while low-sensitivity instruments may require substantial changes to register a difference.
4. Range
The range of an instrument defines the minimum and maximum values it can effectively measure. Understanding an instrument's range is vital to ensure that it is suitable for a particular application; using an instrument outside its range can lead to inaccurate readings.
5. Linearity
Linearity refers to how closely an instrument's response follows a straight line when plotted against the quantity being measured. In ideal cases, the relationship between input and output is linear, making calibration and interpretation straightforward.
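One common way to check linearity is to fit a straight line to calibration data and report the worst departure from it. The calibration points below are assumed for illustration.

inputs  = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]       # applied input quantity
outputs = [0.02, 2.05, 4.01, 6.10, 7.95, 9.98]  # instrument response

# Least-squares straight line through the data
mx = sum(inputs) / len(inputs)
my = sum(outputs) / len(outputs)
slope = (sum((x - mx) * (y - my) for x, y in zip(inputs, outputs))
         / sum((x - mx) ** 2 for x in inputs))
intercept = my - slope * mx

# Non-linearity: the largest deviation of the response from the fitted line
max_dev = max(abs(y - (slope * x + intercept)) for x, y in zip(inputs, outputs))
print(f"best-fit line: output = {slope:.4f} * input + {intercept:.4f}")
print(f"maximum deviation from the line = {max_dev:.4f}")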
6. Hysteresis
Hysteresis is a phenomenon where an instrument's output varies depending on the previous values of the input. It can introduce errors when measuring dynamic or changing quantities and needs to be accounted for during calibration and use.
7. Drift
Drift is a gradual change in an instrument's output over time when exposed to constant
conditions. It can be caused by factors like temperature, aging components, or environmental
changes. Regular calibration helps mitigate the effects of drift.
8. Dead Time
Dead time is the duration an instrument takes to respond to a change in the input signal.
Instruments with shorter dead times are more suitable for dynamic measurements, as they can
capture rapid changes accurately.
9. Noise
Noise in instrumentation refers to unwanted random variations in the output signal that can
mask the true measurement. Reducing noise is essential to improve the accuracy and reliability
of measurements.
10. Resolution
Resolution defines the smallest increment of input quantity that an instrument can distinguish or display. Higher-resolution instruments can provide more detailed and precise measurements.
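A short sketch ties sensitivity and resolution together; both figures below are assumed purely for illustration.

sensitivity = 0.010   # assumed output change per unit of input (e.g. V per degC)
resolution = 0.05     # assumed smallest input increment the readout resolves

def displayed(true_input):
    """Quantize a reading to the instrument's resolution."""
    return round(true_input / resolution) * resolution

print(f"{displayed(10.02):.2f}")  # 10.00 -> a 0.02 change goes unregistered
print(f"{displayed(10.07):.2f}")  # 10.05 -> resolved to the nearest 0.05
print(f"output change for a 2.5-unit input change: {sensitivity * 2.5:.3f}")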
Inaccuracy of Measurements
Every measurement carries a level of uncertainty, which is known as an error. This error may arise during the measurement process due to human mistakes or other causes, so 100% accurate measurement is not possible with any method. An error may be defined as the difference between the measured value and the actual value. For example, if in a laboratory the measured weight of a given substance is 9.8 kg but the actual or known weight is 10 kg, then the measurement is not accurate: it is not close to the known value. This is called the inaccuracy of the instrument.
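Worked through numerically in Python, the 9.8 kg example gives:

measured = 9.8    # kg, laboratory reading
actual = 10.0     # kg, known weight

error = measured - actual               # error = measured value - actual value
relative_error = abs(error) / actual    # error as a fraction of the true value

print(f"absolute error = {error:+.1f} kg")       # -0.2 kg
print(f"relative error = {relative_error:.1%}")  # 2.0%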
The term standard is used to denote universally accepted specifications for devices, components, or processes which ensure conformity and interchangeability throughout a particular industry. A standard provides a reference for assigning a numerical value to a measured quantity. Each basic measurable quantity has an ultimate standard associated with it; working standards are those used in conjunction with the various measurement-making instruments. The National Institute of Standards and Technology (NIST), formerly called the National Bureau of Standards (NBS), was established by an act of Congress in 1901, though the need for such a body had been noted by the founders of the Constitution. In order to maintain accuracy, standards in a vast industrial complex must be traceable to a single source, which may be the national standards.
The following is the generalized level of standards in the national measurement system.
1. Calibration standards.
2. Metrology standards.
3. National standards.
Classification of standards: To maintain accuracy and interchangeability, it is necessary that standards be traceable to a single source, usually the national standards of the country, which are in turn linked to international standards. The accuracy of national standards is transferred to working standards through a chain of intermediate standards, in the manner given below.
National Standards.
National Reference Standards.
Working Standards.
Plant Laboratory Reference Standards.
Plant Laboratory Working Standards.
Laboratory Standards.
Line and End Standards:
When the length being measured is expressed as the distance between two lines, it is referred to as a line standard.
When the length being measured is expressed as the distance between two surfaces, it is referred to as an end standard.
Line standards are not as accurate as end standards and cannot be used for close-tolerance measurement.
l + a + c = 36 + d3
l + a + d = 36 + d4
In the above equations it may be noted that the errors due to the possible misplacing of the lines between the end faces of the half-inch blocks are eliminated. Thus the 36-inch end standard has been calibrated, and by this method the unknown errors in the 35½-inch bar and the half-inch blocks are eliminated.
Calibration of end bars:
The following procedure may be adopted for calibrating two end bars, each of 500 mm basic length. A calibrated one-metre (1000 mm) bar is wrung to a surface plate, and the two 500 mm bars A and B are wrung together to form a basic length of one metre, which is wrung to the surface plate adjacent to the metre bar, as shown in figure (a). The difference in height X1 is noted. A comparison is then made between the two 500 mm bars A and B to determine the difference in their lengths, as shown in figure (b).

Let LA = length of 500 mm bar A, and LB = length of 500 mm bar B;
X1 = difference in length between the one-metre bar and the combination of bars A and B;
X2 = difference in length between bars A and B.

From figure (a): L - X1 = LA + LB ... equation (1)
From figure (b): LB = LA + X2 ... equation (2)

Substituting equation (2) in (1):
L - X1 = LA + LA + X2
L - X1 = 2LA + X2
2LA = L - X1 - X2, therefore LA = (L - X1 - X2) / 2
and LB = LA + X2.
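Putting the result to work, the short sketch below computes LA and LB from the two measured differences; the numerical values of L, X1 and X2 are illustrative assumptions, not measurements from the text.

L = 1000.0002    # calibrated length of the one-metre bar (mm), assumed
X1 = 0.0004      # metre bar minus the wrung pair A+B, figure (a) (mm), assumed
X2 = 0.0001      # bar B minus bar A, figure (b) (mm), assumed

LA = (L - X1 - X2) / 2   # from 2*LA = L - X1 - X2
LB = LA + X2             # from equation (2)

print(f"LA = {LA:.5f} mm")   # length of 500 mm bar A
print(f"LB = {LB:.5f} mm")   # length of 500 mm bar B
print(f"check: LA + LB = {LA + LB:.5f} mm, L - X1 = {L - X1:.5f} mm")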
Slip Gauges:
These may be used as reference standards for transferring the dimension of the unit of length from the primary standard to gauge blocks of lower accuracy, and for the verification and graduation of measuring apparatus. They are high-carbon-steel rectangular blocks, hardened, ground and lapped, with a cross-sectional area of 30 mm × 10 mm. Their opposite faces are flat, parallel, and accurately the stated distance apart. These faces have such a high degree of surface finish that when the blocks are pressed together with a slight twist by hand, they wring together and remain firmly attached to each other. They are supplied in sets ranging from 112 pieces down to 32 pieces. Owing to these properties, slip gauges are built up by wringing into combinations that give sizes varying in steps of 0.01 mm, with an overall accuracy of the order of 0.00025 mm. Slip gauges are commonly found in three basic forms: rectangular, square with a centre hole, and square without a centre hole.
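The usual manual procedure for building up a required size, clearing the finest decimal place first, can be sketched in Python. The set assumed here is a simplified one (1.01-1.49 mm gauges in 0.01 mm steps, 0.5-9.5 mm gauges in 0.5 mm steps, and 10-100 mm blocks); a real 112-piece set differs in its exact contents.

def build(target_mm):
    """Choose slip gauges for target_mm (assumed >= 1.5 mm and expressible
    in 0.01 mm steps), eliminating the finest digits first."""
    units = round(target_mm * 100)   # work in integer hundredths of a mm
    chosen = []
    if units % 50:                   # hundredths digit present
        g = 100 + units % 50         # one 1.01-1.49 mm gauge clears it
        chosen.append(g)
        units -= g
    if units % 1000:                 # remainder below 10 mm
        chosen.append(units % 1000)  # one 0.5-9.5 mm gauge
        units -= units % 1000
    for g in (10000, 5000, 4000, 3000, 2000, 1000):  # 100, 50, 40, 30, 20, 10 mm
        while units >= g:
            chosen.append(g)
            units -= g
    return [c / 100 for c in chosen]

print(build(41.13))   # -> [1.13, 40.0]
print(build(57.87))   # -> [1.37, 6.5, 50.0]

Using as few gauges as possible keeps the number of wrung joints, and hence the accumulated wringing error, to a minimum.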
Wringing Phenomenon: Wringing is nothing but combining the faces of slip gauges one over the other. Due to the adhesion property of slip gauges, they stick to each other; this is because of the very high degree of surface finish of the measuring faces.
[Figure: slip gauges being wrung together under hand pressure with a slight twisting motion.]