
EE 43/L – INSTRUMENTATION AND CONTROL
Chapter 1 – Introduction

Engr. Jonathan C. Pacaldo
Instructor
Chapter Objectives:

This chapter will introduce you to instrumentation, the various measurement units used, and the
reason why process control relies extensively on instrumentation. It will help you become
familiar with instrument terminology and standards.

This chapter discusses:

The basics of a process control loop
The elements in a control loop
The difference between the various types of variables
Considerations in a process facility
Units, standards, and prefixes used in parameter measurements
Comparison of the English and the SI units of measurement
Instrument accuracy and parameters that affect an instrument’s performance
Introduction

Instrumentation is the basis for process control in industry. However, it comes in many forms, from domestic water heaters and HVAC systems, where the variable temperature is measured and used to control the flow of gas, oil, or electricity to the water heater or heating system, or the electricity to the compressor for refrigeration, to complex industrial process-control applications such as those used in the petroleum or chemical industry.
Instrumentation is a collection of instruments and their application for the purpose of Observation, Measurement, and Control.
Reference: ISA standard S51.1 (Instrument Society of America).

Instrumentation is defined as the measurement and control of process variables within a production or manufacturing area.

Ex. Observation – Agriculture, Weather Forecasting…

Ex. Measurement – Pressure, Flow, Temperature…

Ex. Control – Industry, Power Plants, Distribution…


Process control is a system used in modern manufacturing that applies the principles of control theory and physical industrial control systems to monitor, control, and optimize continuous industrial production processes using control algorithms.

It ensures that industrial machines run smoothly and safely and use energy efficiently to transform raw materials into high-quality finished products with reliable consistency, while reducing energy waste and economic cost. This level of performance could not be achieved by manual human control alone.
Process control is the automatic control of an output variable by
sensing the amplitude of the output parameter from the process and
comparing it to the desired or set level and feeding an error signal
back to control an input variable.
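To make the loop concrete, the short Python sketch below simulates a single controlled variable (temperature) held near a set point by a proportional controller: the error signal drives a correction to the manipulated variable (heater power). The plant model, gain, and numbers are illustrative assumptions, not values from the text.

```python
# Minimal sketch of a feedback control loop, assuming a simple first-order
# thermal process; all constants below are illustrative, not from the chapter.

SET_POINT = 60.0    # desired value of the controlled variable (degrees C)
KP = 0.8            # proportional gain of the controller (assumed)

temperature = 20.0  # controlled (measured) variable, starting at ambient
for _ in range(50):
    error = SET_POINT - temperature       # error signal = set point - measured variable
    heater_power = max(0.0, KP * error)   # correction signal sent to the actuator
    # The manipulated variable (heater power) changes the process output;
    # this one-line model stands in for the real process dynamics.
    temperature += 0.1 * heater_power - 0.02 * (temperature - 20.0)

print(f"temperature after 50 steps: {temperature:.1f} degrees C (set point {SET_POINT})")
```

Note that a proportional-only loop of this kind settles with a small residual offset below the set point; removing that offset is one of the motivations for the more complete controllers discussed in control theory.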
Block Diagram of a process control loop
Definitions of the Elements in a Control Loop

Block diagram of the elements that make up the feedback path in a process-control loop.
Feedback loop is the signal path from the output back to the input to correct for any deviation of the output level from the set level. In other words, the output of a process is continually monitored, the error between the set point and the output parameter is determined, and a correction signal is then sent back to one of the process inputs to correct for changes in the measured output parameter.
Controlled or measured variable is the monitored output variable
from a process. The value of the monitored output parameter is
normally held within tight given limits.

Manipulated variable is the input variable or parameter to a process that is varied by a control signal from the processor to an actuator. By changing the input variable the value of the measured variable can be controlled.
Set point is the desired value of the output parameter or variable
being monitored by a sensor. Any deviation from this value will
generate an error signal.

Instrument is the name of any of the various device types for indicating or measuring physical quantities or conditions, performance, position, direction, and the like.
Sensors are devices that can detect physical variables, such as temperature,
light intensity, or motion, and have the ability to give a measurable output that
varies in relation to the amplitude of the physical variable.

Transducers are devices that can change one form of energy to another, e.g., a
resistance thermometer converts temperature into electrical resistance, or a
thermocouple converts temperature into voltage. Both of these devices give an
output that is proportional to the temperature.

Converters are devices that are used to change the format of a signal without
changing the energy form, i.e., a change from a voltage to a current signal.
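As a small illustration of a format change, the helper below linearly rescales a 1-5 V analog voltage signal onto a 4-20 mA current signal; the 1-5 V and 4-20 mA spans are common industry ranges assumed here for the example, not figures taken from the text.

```python
def volts_to_milliamps(v, v_min=1.0, v_max=5.0, i_min=4.0, i_max=20.0):
    """Rescale a voltage signal to a current signal (format change, same information)."""
    fraction = (v - v_min) / (v_max - v_min)   # position within the voltage span
    return i_min + fraction * (i_max - i_min)  # same position within the current span

print(volts_to_milliamps(3.0))  # mid-scale input -> 12.0 mA
```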
Actuators are devices that are used to control an input variable in response to a signal from a controller. A typical actuator is a flow-control valve that can control the rate of flow of a fluid in proportion to the amplitude of an electrical signal from the controller. Other types of actuators are magnetic relays that turn electrical power on and off. Examples are actuators that control power to the fans and compressor in an air-conditioning system in response to signals from room temperature sensors.

Controllers are devices that monitor signals from transducers and take the
necessary action to keep the process within specified limits according to a
predefined program by activating and controlling the necessary actuators.
Programmable logic controllers (PLC) are used in process-control
applications, and are microprocessor-based systems.

An Error signal is the difference between the set point and the
amplitude of the measured variable.

A Correction signal is the signal used to control power to the actuator to set the level of the input variable.
Transmitters are devices used to amplify and format signals so that they are suitable for transmission over long distances with zero or minimal loss of information. The transmitted signal can be in one of several formats, i.e., pneumatic, digital, analog voltage, analog current, or a radio frequency (RF) modulated signal. Digital transmission is preferred in newer systems because the controller is a digital system; since analog signals can be accurately digitized, digital signals can be transmitted without loss of information.
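As a sketch of why digital transmission preserves information, the snippet below quantizes a sensor voltage to an integer count (the value a digital transmitter would send) and reconstructs it at the receiving end; the 12-bit resolution and 5 V full scale are assumptions made for illustration.

```python
# Assumed 12-bit converter over a 0-5 V full scale (illustrative values only).
FULL_SCALE_VOLTS = 5.0
COUNTS = 2 ** 12 - 1   # 4095 discrete levels

def digitize(volts):
    """Quantize an analog voltage to an integer count (the value transmitted)."""
    return round(volts / FULL_SCALE_VOLTS * COUNTS)

def reconstruct(count):
    """Convert the received count back to volts at the controller end."""
    return count / COUNTS * FULL_SCALE_VOLTS

sent = digitize(3.217)
print(sent, round(reconstruct(sent), 4))  # reconstructed value agrees to within one count
```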
Example 1.1. The block diagram of a closed-loop flow control system is shown. Identify the following elements: (a) the sensor, (b) the transducer, (c) the actuator, (d) the transmitter, (e) the controller, (f) the manipulated variable, and (g) the measured variable.
(a) The sensor is labeled pressure cell in the diagram. (b) The transducer is labeled converter.
There are two transducers—one for converting pressure to current and the other for
converting current to pressure to operate the actuator. (c) The actuator in this case is the
pneumatic valve. (d) The transmitter is the line driver. (e) The controller is labeled PLC. (f)
The manipulated variable is the differential pressure developed by the fluid flowing through
the orifice plate constriction. (g) The controlled variable is the flow rate of the liquid.
Process Facility Considerations

The process facility has a number of basic requirements, including safety precautions and well-regulated, reliable electrical, water, and air supplies.

An electrical supply is required for all control systems and must meet all
standards in force at the plant. Power failure can mean plant shutdown and the
loss of complete production runs.

Grounding is a very important consideration in a facility for safety reasons. Any variations in the ground potential between electronic equipment can cause large errors in signal levels.
An air supply is required to drive pneumatic actuators in most facilities. Instrument air in pneumatic equipment must meet quality standards; the air must be free of dirt, oil, contaminants, and moisture.

Water supply is required in many cleaning and cooling operations, and for
steam generation. Domestic water supplies contain large quantities of
particulates and impurities, and may be satisfactory for cooling, but are not
suitable for most cleaning operations.

Installation and maintenance must be considered when locating instruments, valves, and so on. Each device must be easily accessible for maintenance and inspection.
Safety is a top priority in a facility. The correct material must be used in
container construction, plumbing, seals, and gaskets to prevent corrosion and
failure leading to leakage and spills of hazardous materials. All electrical
equipment must be properly installed to code with breakers. Electrical systems
must have the correct fire retardant for use in case of electrical fires. More
information can be found in ANSI/ISA-12.01.01-1999, Definitions and
Information Pertaining to Electrical Instruments in Hazardous Locations.
Instrument Parameters

The accuracy of an instrument or device is the difference between the indicated value and the actual value.

Accuracy depends on linearity, hysteresis, offset, drift, and sensitivity. The resulting discrepancy is stated as a ± deviation from the true value, and is normally specified as a percentage of full-scale reading or deflection (%FSD). Accuracy can also be expressed as the percentage of span, percentage of reading, or an absolute value.
The range of an instrument specifies the lowest and highest readings it can
measure, i.e., a thermometer whose scale goes from −40°C to 100°C has a range
from −40°C to 100°C.

The span of an instrument is its range from the minimum to maximum scale
value, i.e., a thermometer whose scale goes from −40°C to 100°C has a span of
140°C. When the accuracy is expressed as the percentage of span, it is the
deviation from true expressed as a percentage of the span.

Reading accuracy is the deviation from true at the point the reading is being
taken and is expressed as a percentage, i.e., if a deviation of ±4.35 psi in
Example 1.6 was measured at 28.5 psi, the reading accuracy would be
(4.35/28.5) × 100 = ±15.26% of reading.
The absolute accuracy of an instrument is the deviation from true expressed as a number, not as a percentage, i.e., if a voltmeter has an absolute accuracy of ±3 V in the 100-volt range, the deviation is ±3 V at all the scale readings, e.g., 10 ± 3 V, 70 ± 3 V, and so on.
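As a quick numerical illustration of these four ways of stating accuracy, the snippet below reuses the ±4.35 psi deviation and 28.5 psi reading quoted in the reading-accuracy example above; the 10-150 psi scale limits are assumed for the example.

```python
# Four ways of expressing the same +/-4.35 psi deviation. The deviation and the
# 28.5 psi reading come from the reading-accuracy example above; the 10-150 psi
# scale limits are assumed here for illustration.

deviation = 4.35                    # psi
reading = 28.5                      # psi
scale_min, scale_max = 10.0, 150.0  # assumed instrument scale (psi)

full_scale = scale_max
span = scale_max - scale_min        # 140 psi

print(f"percent of full scale: +/-{deviation / full_scale * 100:.2f} %FSD")
print(f"percent of span:       +/-{deviation / span * 100:.2f} %")
print(f"percent of reading:    +/-{deviation / reading * 100:.2f} %")
print(f"absolute:              +/-{deviation} psi")
```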

Precision refers to the limits within which a signal can be read and may be somewhat
subjective. In the analog instrument shown in Fig. 1.6a, the scale is graduated in divisions of
0.2 psi, the position of the needle could be estimated to within 0.02 psi, and hence, the
precision of the instrument is 0.02 psi. With a digital scale the last digit may change in steps of
0.01 psi so that the precision is 0.01 psi.

Reproducibility is the ability of an instrument to repeatedly read the same signal over time and give the same output under the same conditions. An instrument may not be accurate but can have good reproducibility, i.e., an instrument could consistently read a 20 psi signal as being between 17.5 and 17.6 psi over 20 readings.
Sensitivity is a measure of the change in the output of an instrument for a change in the measured variable, and is known as the transfer function, i.e., when the output of a pressure transducer changes by 3.2 mV for a change in pressure of 1 psi, the sensitivity is 3.2 mV/psi. High sensitivity in an instrument is preferred as this gives higher output amplitudes, but it may have to be weighed against linearity, range, and accuracy.
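Because sensitivity acts as the transfer function of the instrument, it converts directly between the measured variable and the electrical output, as in the small sketch below using the 3.2 mV/psi figure quoted above.

```python
SENSITIVITY = 3.2  # mV per psi, the transfer function quoted above

def output_mv(pressure_change_psi):
    """Expected change in transducer output for a given pressure change."""
    return SENSITIVITY * pressure_change_psi

def pressure_psi(output_change_mv):
    """Pressure change inferred from a measured change in output."""
    return output_change_mv / SENSITIVITY

print(output_mv(10.0))     # 32.0 mV for a 10 psi change
print(pressure_psi(16.0))  # 5.0 psi for a 16 mV change
```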

Offset is the reading of an instrument with zero input.

Drift is the change in the reading of an instrument of a fixed variable with time.

Hysteresis is the difference in readings obtained when an instrument approaches a signal from opposite directions, i.e., if an instrument reads a midscale value going up from zero, it can give a different reading than when returning after a full-scale reading.
Resolution is the smallest amount of a variable that an instrument can resolve, i.e., the
smallest change in a variable to which the instrument will respond.

Repeatability is a measure of the closeness of agreement between a number of readings (10 to 12) taken consecutively of a variable, before the variable has time to change. The average reading is calculated and the spread in the value of the readings is determined.
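A minimal sketch of the repeatability calculation described above, using an assumed set of consecutive readings of a nominally constant variable:

```python
# Consecutive readings of a nominally constant variable (illustrative values).
readings = [20.1, 20.3, 20.2, 20.2, 20.0, 20.3, 20.1, 20.2, 20.3, 20.1]

average = sum(readings) / len(readings)  # average reading
spread = max(readings) - min(readings)   # spread in the values

print(f"average reading: {average:.2f}")
print(f"spread:          {spread:.2f}")
```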

Linearity is a measure of the proportionality between the actual value of a variable being
measured and the output of the instrument over its operating range.
