
BASIC MEASUREMENT THEORY
CPE368
Intended Learning Outcome
At the end of the module, you should be able to:
1. Discuss a measurement system
2. Explain the block diagram of a generalized measurement
system
3. Explain the 3 major sources of noise in a generalized
measurement system
4. Discuss the measurement as part of overall control
system
Measurement Theory
• A branch of mathematics that is useful in measurement and
data analysis. The fundamental idea of measurement
theory is that measurements are not the same as the
attribute being measured.
• Mathematical statistics is concerned with the connection
between inference and data. Measurement theory is
concerned with the connection between the data and
reality. Both statistical theory and measurement theory are
necessary to make inferences about reality.
Basic Principles of Measurements
• Definition of Measurement
• Measurement is the acquisition of information about a
state or phenomenon (the object of measurement) in the
world around us.
• A measurement must be:
• Descriptive: it must describe the state or object we are measuring.
There must be a relationship between the object of measurement
and the measurement result.
• Selective: it should provide information only about what we wish
to measure (the measurand) and not about any of the many
other states or phenomena around us.
• Objective: the outcome of the measurement must be
independent of an arbitrary observer.
Definition of Instrumentation
• The field of measurement instruments and systems is
called instrumentation.
• An instrumentation system must guarantee the required
descriptiveness, selectivity, and objectivity of the
measurement.
Measurement Systems
• Traditionally used to measure physical and electrical
quantities, such as mass, temperature, pressure,
capacitance, and voltage.
• Can be designed to locate things or events, such as the
epicenter of an earthquake, employees in a building, partial
discharges in a high voltage power cable, or a land mine.
• Called upon to discriminate and count objects, such as red
blood cells, or fish of a certain size swimming past a
checkpoint.
• Part of a control system.
Block Diagram of a Generalized
Measurement System
Overview of Signal Conditioning
• The voltages or currents obtained directly as the output of a
sensor are generally low level and contain additive noise
and coherent interference, picked up from the
environment of the QUM (quantity under measurement)
and from the sensor itself.
• Sometimes the measurement process or the sensor
introduces a nonlinear distortion of the QUM, which must
be linearized.
• The analog signal conditioning module following the
sensor must therefore amplify the sensor output voltage, as
well as perform linear filtering on it in order to improve
the SNR.
Overview of Signal Conditioning
• Such amplification and filtering are usually performed by a
low-noise instrumentation amplifier, followed by op-amp
active filters.
• Compensation for inherent nonlinearities is most easily
done digitally using computers. Digital signal conditioning
can also be used to effectively remove the coherent
interference accompanying the QUM and to calculate
functions of the sampled signal.
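As a rough illustration of the analog chain described above, the sketch below amplifies a low-level sensor signal and then low-pass filters it with a simple moving average; the gain, window size, and signal values are illustrative assumptions, not from the module.

```python
import random

def condition(samples, gain=100.0, window=5):
    """Amplify a low-level sensor signal, then apply a simple
    moving-average low-pass filter to improve the SNR.
    (An illustrative stand-in for an instrumentation amplifier
    followed by an op-amp active filter.)"""
    amplified = [gain * s for s in samples]
    filtered = []
    for i in range(len(amplified)):
        lo = max(0, i - window + 1)       # trailing window of recent samples
        chunk = amplified[lo:i + 1]
        filtered.append(sum(chunk) / len(chunk))
    return filtered

# A 10 mV sensor output corrupted by additive random noise:
random.seed(0)
raw = [0.010 + random.gauss(0, 0.002) for _ in range(200)]
clean = condition(raw)                    # amplified to ~1 V, noise smoothed
```

The moving average is only a crude low-pass filter; a real signal conditioning stage would use an active filter designed for the noise spectrum at hand.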
3 Major Sources of Noise
Errors in Measurements
Gross Errors – errors in measurement can arise from human
mistakes.
• Reading the instrument before it has reached its steady
state, which produces a dynamic error.
• Not eliminating parallax when reading an analog meter
scale, incorrect interpolation between analog meter scale
markings.
• Mistakes in recording measured data and in calculating a
derived measurand.
• Misuse of the instrument.
Errors in Measurements
System Errors – errors in measurement can arise from such
factors as:
• The instrument is not calibrated and has an offset – loss of
calibration and zero error can occur because of long-term
component value changes due to aging, or changes
associated with temperature rise.
• Reading uncertainty due to the presence of random noise –
can come from the signal conditioning electronics in the
system.
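One standard way to reduce reading uncertainty due to random noise is to average repeated readings: the standard error of the mean falls as 1/√N. A minimal sketch, with illustrative values not taken from the module:

```python
import random
import statistics

def mean_reading(true_value, noise_sd, n, rng):
    """Average n noisy readings of the same quantity.
    The uncertainty of the mean shrinks as noise_sd / sqrt(n)."""
    readings = [true_value + rng.gauss(0, noise_sd) for _ in range(n)]
    return statistics.mean(readings)

rng = random.Random(42)
# A single reading has sd = 0.05; the mean of 100 has sd ~ 0.005.
averaged = mean_reading(5.0, 0.05, 100, rng)
```

Note that averaging only reduces random noise; it does nothing for systematic offsets such as a calibration error.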
Errors in Measurements
System Errors
• Internally generated random noise coming from the first stage
of an instrument’s signal conditioning amplifier – Johnson
(thermal) noise from resistors and noise from active devices –
and some from the quantization or rounding off that is
inherent in the operation of ADCs.
• Environmental noise can be reduced by appropriate electric and
magnetic shielding, proper grounding, and guarding practices.
• Slow, or long-term, drift in the system can destroy the certainty
of static measurements.
• Drifts – can cause slow changes in system sensitivity and/or zero and
can arise as the result of a slow temperature change.
Standards Used in Measurements
• Calibration
• Implies observing the instrument’s performance
when measuring a standard of some sort.
• Necessary, along with precision, to enable
accurate measurements to be made.
Standards
A standard is a physical representation of the QUM whose
true value is known with great accuracy.
• Standards can be classified as:
• International Standards
• Primary Standards
• Secondary Standards
• Working Standards
Standards
1. International Standards
• Defined by international agreements and are kept
at the International Bureau of Weights and
Measures in Sèvres, France.
• Example: Kilogram
• Not available on a daily basis for calibration or
comparison.
Standards
2. Primary Standards
• Are maintained in national standards laboratories
in countries around the world.
• Represent some of the fundamental physical
and electrical units as well as some derived
quantities.
• Independently measured and calibrated at the
various national laboratories and compared
against each other.
Standards
3. Secondary Standards
• Reference standards which are initially calibrated
from primary standards and then used in industry
and research labs on a daily basis to calibrate
their working standards.
4. Working Standards
• Calibrated against secondary standards and are
used on a daily basis to check and calibrate
working laboratory instruments.
Physical Standards
• Unit – a specified and defined amount of the particular
quantity being described.
• SI (Système International) system of units – a system of
definitions and standards agreed by an international
committee to describe fundamental physical
quantities.
Physical Standards
1. Length
• King Henry I of England (A.D. 1120) – Yard
• King Louis XIV (France) – Foot
• 1799 – Meter became the legal standard length in France
• Meter (SI unit of Length)
• One ten-millionth of the distance from the equator to the North Pole
(1799).
• Distance between two lines on a specific bar of platinum-iridium
alloy stored under controlled conditions (1889).
• Modified to be equal to 1,650,763.73 wavelengths of orange-red light
emitted from a krypton-86 lamp (1960).
• Refined to be the distance travelled by light in vacuum during a time
interval of 1/299,792,458 of a second (1983).
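The 1983 definition fixes the speed of light exactly, so the metre follows by arithmetic; the check below is just that definition restated in code.

```python
# Speed of light in vacuum, in m/s (exact by the 1983 definition)
c = 299_792_458

# One metre is the distance light travels in 1/299,792,458 of a second:
metre = c * (1 / 299_792_458)   # ~1.0 m, up to floating-point rounding
```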
Physical Standards
2. Mass
• Represents a measure of resistance of an object to
changes in its motion.
• Kilogram (SI unit of Mass)
• Mass of a specific platinum-iridium alloy cylinder kept at the
International Bureau of Weights and Measures.
Physical Standards
3. Temperature
• Measure of the thermal energy in a body, which is the relative
hotness or coldness of a medium, and is normally measured in
degrees using one of the following scales: Fahrenheit (F), Celsius (C),
Rankine (R), or Kelvin (K).
• The SI unit of temperature is the kelvin.
• Absolute Zero – the temperature at which all molecular motion
ceases, or the energy of the molecules is zero.
Physical Standards
• Fahrenheit Scale (F)
• First temperature scale
• Proposed in the early 1700s by Fahrenheit (Dutch).
• The two reference points chosen for 0 and 100 degrees were the
freezing point of a concentrated salt solution and the internal
temperature of oxen.
Physical Standards
• Celsius or Centigrade Scale (C)
• Proposed in the mid-1700s by Celsius (Sweden).
• He proposed temperature readings of 0 and 100 degrees for
the freezing and boiling points of water at 1 atm.
Physical Standards
• Rankine Scale (R)
• Proposed in the mid-1800s by Rankine.
• A temperature scale referenced to absolute zero that was based on
the Fahrenheit scale (a change of 1 degree F = a change of 1
degree R).
• The freezing and boiling points of pure water are 491.6 degrees R
and 671.6 degrees R, respectively, at 1 atm.
Physical Standards
• Kelvin Scale (K)
• Named after Lord Kelvin and proposed in the late 1800s.
• It is referenced to absolute zero but based on the Celsius scale (a
change of 1 degree C = a change of 1 K).
• The freezing and boiling points of water are 273.15 K and 373.15 K,
respectively, at 1 atm.
• The degree symbol is dropped when using the Kelvin scale.
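The four scales above are related by fixed offsets and a 9/5 degree-size ratio. A small sketch of the conversions (constants are the standard defining values; the module rounds some of them to one decimal):

```python
def c_to_k(t_c):
    """Celsius to Kelvin: same degree size, offset by 273.15."""
    return t_c + 273.15

def c_to_f(t_c):
    """Celsius to Fahrenheit: 1 degree C = 1.8 degrees F, offset by 32."""
    return t_c * 9 / 5 + 32

def f_to_r(t_f):
    """Fahrenheit to Rankine: same degree size, offset by 459.67."""
    return t_f + 459.67

def k_to_r(t_k):
    """Kelvin to Rankine: both start at absolute zero; 1 K = 1.8 degrees R."""
    return t_k * 9 / 5
```

For example, the freezing point of water at 1 atm is 0 degrees C = 273.15 K = 32 degrees F = 491.67 degrees R (which the text rounds to 491.6).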
SI Base Units
Measurements as Part of Overall
Control System
