
Unit 5: Manufacturing

Quality Assurance
Outline
• Engineering Metrology and Instrumentation

• Quality Management and Standards

• Testing and Inspection


5.1 Engineering Metrology and Instrumentation
Metrology
Metrology is the science of measurement. Measurement is normally carried out using measuring instruments to determine the numerical values of the product features being measured.
Measurement standard

A standard is defined as “something that is set up and established by an authority as a rule for the measure of quantity, weight, extent, value or quality”. The role of standards is to achieve uniform, consistent and repeatable measurements throughout the world. Standards can be classified as international standards, primary standards, secondary standards and working standards.
Hierarchy of standards
International standards are devices designed and constructed
to the specifications of an international forum. They represent
the units of measurements of various physical quantities to the
highest possible accuracy that is attainable by the use of
advanced techniques of production and measurement
technology. These standards are maintained by the
International Bureau of Weights and Measures at Sevres,
France.
Primary standards are devices maintained by standards
organisations/ national laboratories in different parts of the
world. These devices represent the fundamental and derived
quantities and are calibrated independently by absolute
measurements. One of the main functions of maintaining
primary standards is to calibrate/check and certify secondary
reference standards. Like international standards, these
standards are also not available to ordinary users of
instruments for verification/calibration of working standards.
Secondary standards are freely available to the ordinary users of instruments for checking and calibration of working standards. Working standards are high-accuracy devices that are commercially available and are duly checked and certified against either the primary or secondary standards.
Working standards are used for calibrating general laboratory instruments, for carrying out comparison measurements, and for checking the quality of industrial products.
Geometric features of parts
The most common quantities and geometric features measured in products made by manufacturing processes are as follows:
a. Length – including all linear dimensions of parts
b. Diameter – outside and inside, including parts with different
outside and inside diameters
c. Roundness – including out-of-roundness, concentricity and
eccentricity
d. Depth – such as drilled holes and cavities in dies and moulds
e. Straightness – such as shafts, bars and tubing
f. Flatness – machined and ground surfaces
g. Parallelism – such as two shafts or slideways in machines
h. Perpendicularity – such as a threaded bar inserted into a flat plate
i. Angles – including internal and external angles
j. Profile – such as curvature in castings, forgings and on car bodies
Traditional measuring methods and instruments
Line graduated instruments

a. Steel rule
The steel rule is the simplest and most common linear measuring instrument. The principle behind the steel rule, commonly called the engineer’s steel rule, is the comparison of an unknown length to one previously calibrated. The rule must be graduated uniformly throughout its length. Rules are made in 150, 300, 500 and 1000 mm lengths with an accuracy of 0.5 mm.
b. Vernier caliper
A vernier caliper consists of a sliding scale which is divided such that the distance between two marks on this scale is smaller than the distance between two marks on the main scale. Typically, vernier calipers can measure lengths to an accuracy of 0.1 or 0.05 mm. Most vernier calipers are equipped with a set of smaller jaws for measuring internal diameters and a depth probe for measuring depths.
Parts of a vernier caliper:
1 – Outside jaw  2 – Inside jaw  3 – Depth probe  4 – Main scale (mm)
5 – Main scale (in)  6 – Vernier scale (mm)  7 – Vernier scale (in)  8 – Retainer
To measure outer dimensions of an object, the object is placed
between the jaws, which are then moved together until they
secure the object. The first significant figures are read
immediately to the left of the “zero” of the vernier scale. The
remaining digits are taken from the vernier scale and placed
after the decimal point of the main reading. This remaining
reading corresponds to the division that lines up with any main
scale division. Only one division on the vernier scale coincides
with one on the main scale.
Vernier caliper measurement of 37.46 mm
Vernier caliper measurement of 34.60 mm
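The two example readings above can be reproduced numerically. The helper below is an illustrative sketch, not from the source: it assumes a metric vernier with a least count of 0.02 mm (a 50-division vernier scale), and the function name is hypothetical.

```python
def vernier_reading(main_scale_mm, coinciding_division, least_count_mm=0.02):
    """Combine the main-scale value immediately left of the vernier zero
    with the vernier division that lines up with a main-scale mark."""
    return main_scale_mm + coinciding_division * least_count_mm

# Main scale shows 37 mm and the 23rd vernier division coincides,
# giving 37 + 23 * 0.02 = 37.46 mm, as in the first figure.
reading = vernier_reading(37, 23)
```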


c. Micrometer
The micrometer is used to measure even smaller dimensions than the vernier caliper. The micrometer screw gauge also uses an auxiliary scale (measuring hundredths of a millimetre) which is marked on a rotary thimble. The thimble passes through a sleeve that carries a millimetre scale graduated to 0.5 mm. The jaws can be adjusted by rotating the thimble using the small ratchet knob. The thimble must be rotated through two revolutions to open the jaws by 1 mm.
The object to be measured is placed between the anvil and
spindle and the thimble is rotated using the ratchet until the
object is secured. The first significant figure is taken from the
last graduation showing on the sleeve directly to the left of the
revolving thimble. Note that an additional half scale division
(0.5 mm) must be included if the mark below the main scale is
visible between the thimble and the main scale division on the
sleeve. The remaining two significant figures (hundredths of a millimetre, or 0.01 mm) are taken directly from the thimble opposite the main scale.
Micrometer measurement of 7.38 mm
Micrometer measurement of 7.72 mm
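The reading rule just described can be sketched as a small helper (illustrative, not from the source): whole millimetres from the sleeve, plus 0.5 mm if the half-millimetre mark is visible, plus hundredths from the thimble.

```python
def micrometer_reading(sleeve_mm, half_mark_visible, thimble_division):
    """Sleeve reading in whole millimetres, plus 0.5 mm if the half-mm
    mark below the main scale is visible, plus hundredths of a
    millimetre read from the thimble."""
    reading = sleeve_mm + (0.5 if half_mark_visible else 0.0)
    return reading + thimble_division * 0.01

# Sleeve at 7 mm, no half mark, thimble at 38: 7.38 mm (first figure)
# Sleeve at 7 mm, half mark visible, thimble at 22: 7.72 mm (second figure)
```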


d. Height gauge
This is also a sort of vernier caliper, equipped with a special base block and other attachments which make the instrument suitable for height measurements. Along with the sliding jaw assembly, an arrangement is provided to carry a removable clamp. The upper and lower surfaces of the measuring jaws are parallel to the base, so the instrument can be used for measurements over or under a surface. With a scribing attachment in place of the measuring jaw, it can also be used to scribe lines at a set distance above a surface. A dial indicator can likewise be attached in the clamp. For all these measurements, the use of a surface plate as the datum surface is essential.
Measuring geometric features
The straightness of a surface is defined as the variation or departure from a predefined straight line or true mean line. A surface is said to be straight if the deviation of the distance of two points from two planes perpendicular to each other and parallel to the general direction of the line remains within a specified tolerance. The straightness of a surface can be checked using a straight-edge or knife-edge rule. An accurate and quick check can be carried out by placing the rule on the test surface and visually inspecting the gap between the rule and the surface against an illuminated background. Where the surface is not straight, it is possible to see a gap of light as small as 0.008 mm.
Straightness of a horizontal or vertical surface can also be checked using a spirit level. A level has two parts: a frame with broad, precisely machined flat edges, and transparent tubes set into the frame, filled with dyed alcohol and a small bubble of air. When the tube is exactly perpendicular to gravity, the air bubble floats at the exact centre of the tube, marked with a pair of fine lines. This indicates that the surface is straight.
An autocollimator can also be used to inspect the straightness of a surface, as this device can accurately measure small angular deviations on a flat surface. The autocollimator projects a beam of collimated light. An external reflector placed on the test surface reflects all or part of the beam back into the instrument, where the beam is focused and detected by a photodetector. The autocollimator measures the deviation between the emitted beam and the reflected beam. Ideally, a straight surface will show an angular deviation of zero.
Because the autocollimator uses light to measure angles, it
never comes into contact with the test surface.
Flatness of a surface is defined as the deviation of the surface from the best-fit plane. Flatness can be measured by mechanical means using a surface plate and a dial indicator. A surface plate is a solid, flat plate used as the main horizontal reference plane for precision inspection and tooling setup. A dial indicator, also known as a dial gauge or probe indicator, is an instrument used to accurately measure small linear distances.
The dial indicator is mounted to the surface plate and is set to zero at the level of the surface plate using a gauge block. The part to be tested is then placed on the surface plate and in contact with the contact point of the dial indicator. As the part is moved randomly on the surface plate, the dial indicator traces a path across the part surface and measures the distance between the surface plate and the low points of the part surface. If the indicator reading is larger than the flatness tolerance value, the surface is not within its flatness specification.
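The acceptance check described above reduces to a simple comparison. A minimal sketch follows; the function name and the sample readings are hypothetical.

```python
def passes_flatness(indicator_readings_mm, tolerance_mm):
    """The dial indicator is zeroed on the surface plate; the surface is
    out of its flatness specification if any reading exceeds the
    flatness tolerance."""
    return all(abs(r) <= tolerance_mm for r in indicator_readings_mm)

ok = passes_flatness([0.002, 0.004, 0.003], 0.005)   # all within 0.005 mm
bad = passes_flatness([0.002, 0.007, 0.003], 0.005)  # 0.007 mm exceeds it
```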
Interferometry using an optical flat is another method of inspecting flatness. The surface flatness of the reference flat must be known and should be frequently calibrated by the supplier. Both the reference flat and the test surface are placed in the light box and centrally located under the monochromatic (single-colour) light source. A piece of lens tissue is placed on the surface of the test piece and the reference flat is laid carefully on one edge.
The flat is lowered gradually as the tissue is removed. When the tissue is finally removed, a thin air wedge remains between the two pieces. This air wedge gives rise to fringes, which should be straight, parallel and equally spaced if the surface of the test piece (workpiece) is flat.
Roundness is defined as how closely the shape of an object approaches that of a circle; a perfect circle represents true roundness. The round part is placed on a V-block or between the centres of a lathe and is rotated while the point of a dial indicator is in contact with the part surface. This is known as the three-point method; if the part is truly round, with negligible irregularity, the pointer of the gauge will not move. In the case of irregularity, the difference between the maximum and minimum readings on the dial after a full rotation of the workpiece is noted. This difference is called the total indicator reading (TIR).
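The TIR computation can be sketched as follows; the dial readings used here are hypothetical.

```python
def total_indicator_reading(dial_readings_mm):
    """TIR over one full rotation: maximum minus minimum dial reading."""
    return max(dial_readings_mm) - min(dial_readings_mm)

# Hypothetical readings over one rotation; the TIR here is
# 0.03 - (-0.01) = 0.04 mm.
tir = total_indicator_reading([0.00, 0.01, 0.03, 0.02, -0.01])
```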
Modern measuring instruments and machines
The photoelectric digital length measuring system is designed to provide exceptional measuring accuracy, easy operation and a clear digital readout. Resolution settings can range from 5 µm down to 0.01 µm.
Laser micrometers use a laser beam to scan the workpiece at a rate of 350 times per second and are capable of resolutions as high as 0.125 µm. Unlike traditional micrometers, there is no physical contact, so they can also measure parts that are at elevated temperatures or too flexible to measure by other means.
The digital caliper is a precision instrument that can be used to measure internal and external distances extremely accurately. Digital calipers are easier to use, as the measurement is clearly displayed on an LCD (liquid crystal display) and, by pressing the inch/mm button, the distance can be read in metric or imperial units.
Coordinate measuring machines (CMMs) allow the coordinates of points on three-dimensional objects (X, Y and Z, or length, width and height) to be located simultaneously. They allow the integration of both dimensions and orthogonal relationships. When linked to a computer, a CMM eliminates difficult and time-consuming measurements with traditional single-axis devices such as micrometers and height gauges. Cumbersome mathematics is eliminated, complex objects can be measured quantitatively, and data can be stored for later use.
General characteristics and selection of measuring instruments
The characteristics commonly involved in the selection of a measuring instrument are as follows:
a. Accuracy – how close the measurement is to the true value. It is most easily quantified as the percentage error, where:
Percentage error = (indicated value – true value) / true value × 100%
b. Linearity – the maximum deviation from a linear relation between input and output, normally expressed as a percentage of full scale (% fs).
c. Precision – the reproducibility with which repeated measurements of the same variable can be made under identical conditions. An instrument can be precise but inaccurate and, likewise, it is possible to have an accurate but imprecise instrument.
d. Resolution – smallest dimension that can be read on an
instrument.
e. Range – the maximum and minimum values of the inputs or the outputs for which the instrument is recommended for use. For a standard thermometer this is 0 to 100 °C. This is the same as the full scale.
f. Sensitivity – is defined as the ratio of change in output
(response) towards the change in input at a steady state
condition.
g. Stability – an instrument’s capability to maintain its
calibration over a period of time.
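Two of the characteristics above are simple ratios and can be sketched directly; the function names and the example values are illustrative.

```python
def percentage_error(indicated, true_value):
    """Accuracy expressed as percentage error:
    (indicated value - true value) / true value * 100%."""
    return (indicated - true_value) / true_value * 100.0

def sensitivity(change_in_output, change_in_input):
    """Ratio of change in output (response) to change in input
    at a steady-state condition."""
    return change_in_output / change_in_input

# A gauge indicating 10.05 mm for a 10.00 mm gauge block has a
# percentage error of about 0.5 %.
err = percentage_error(10.05, 10.00)
```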
The selection of an appropriate measuring instrument for a particular application also depends on:
a. size and type of part to be measured;
b. the environment (temperature, humidity, dust, etc.);
c. operator skill required;
d. cost of equipment.
Geometric dimensioning and tolerancing
A dimension is a numerical value expressed in appropriate
units of measurement and used to define the size, location,
orientation, form or other geometric characteristics of a part.
Tolerance is the amount a particular dimension is allowed to
vary. Hence the tolerance is the difference between the
maximum and minimum limits. Tolerances are unavoidable in
manufacturing because it is virtually impossible to
manufacture two parts that have precisely the same
dimensions.
Tolerance is used to produce a range of acceptable diameters for purposes such as quality assurance and the proper functioning of the rod when it is assembled with other parts. Furthermore, tolerances play an important role in the assembly of parts. The fit of one part to another depends on the tolerances each part has.
The length of the rectangle comprises a basic dimension of 100 mm with an upper tolerance limit of +0.07 mm and a lower tolerance limit of −0.05 mm. Hence the length of the rectangle may vary from 99.95 mm to 100.07 mm. The tolerances for the width of 50 mm and the hole of 20 mm diameter are known as bilateral tolerances and can also be given as 50 ± 0.05 mm and Ф 20 ± 0.03 mm respectively.
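The limit arithmetic above can be sketched as follows; the helper names are illustrative.

```python
def limits(basic, upper_tol, lower_tol):
    """Maximum and minimum acceptable sizes from a basic dimension and
    separate upper/lower tolerance limits."""
    return basic + upper_tol, basic - lower_tol

def bilateral_limits(basic, tol):
    """Limits for a bilateral tolerance such as 50 +/- 0.05 mm."""
    return basic + tol, basic - tol

# The 100 mm length with +0.07 / -0.05 mm gives 100.07 and 99.95 mm.
hi, lo = limits(100.0, 0.07, 0.05)
```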
Geometric dimensioning and tolerancing (GD&T) is a system for defining and communicating engineering tolerances. It uses a symbolic language on engineering drawings and computer-generated three-dimensional solid models that explicitly describes nominal geometry and its allowable variation. It tells the manufacturing staff and machines what degree of accuracy and precision is needed on each controlled feature of the part.
There are several standards available worldwide that describe the symbols and define the rules used in GD&T. One such standard is American Society of Mechanical Engineers (ASME) Y14.5-2009. A datum is a virtual ideal plane, line, point, or axis. A datum feature is a physical feature of a part identified by a datum feature symbol and a corresponding datum feature triangle.
Geometric tolerancing reference chart as per ASME Y14.5-2009
The figure below illustrates the use of a feature control frame in GD&T. The first box in the control frame gives the geometric symbol. The second box contains the tolerance, and the remaining boxes give information on the datums. For example, in the figure below, the perpendicularity of the surfaces must be within a 0.005 tolerance zone relative to datum A.
The figures below show the relationship between an engineering drawing incorporating GD&T, the manufactured part, and the inspection method used to check conformance of the part features to the drawing.

Straightness
Flatness
Roundness (Circularity)
5.2 Quality Management and Standards
Product quality
Quality can be defined as any of the following:
• Fitness for use or purpose.
• Doing the right thing the first time.
• Doing the right thing at the right time.
• Features that meet consumer needs and give customer satisfaction.
• Freedom from deficiencies or defects.
• Conformance to standards.
• Value or worthiness for money, etc.
Product quality means incorporating features that have the capacity to meet consumer needs (wants) and give customer satisfaction by improving products (goods) and making them free from any deficiencies or defects. Product quality mainly depends on important factors like:
• the type of raw materials used for making a product;
• how well the various production technologies are implemented;
• the skill and experience of the manpower involved in the production process;
• the availability of production-related overheads like power
Several characteristics of a product upon which quality is judged include performance, durability, reliability, robustness, serviceability, safety, aesthetics and subjective perception.
A company must focus on product quality before, during and after production. Before production, the company must find out the needs of consumers. These needs must be included in the product design specifications, so the company must design its product according to the needs of the consumers.
During production, the company must have quality control at all stages of the production process. There must be quality control for raw materials, plant and machinery, selection and training of manpower, finished products, packaging of products, etc.
After production, the finished product must conform (match) to the product design specifications in all aspects, especially quality.
Product quality is very important to both the company and consumers. It is important for the company because bad-quality products will affect consumer confidence, the image of the company and its sales. It may even affect the survival of the company. Product quality is also very important for consumers: they are ready to pay high prices, but in return they expect best-quality products. If they are not satisfied with the quality of a company's product, they will purchase from its competitors.
Quality assurance
Quality assurance is any systematic process of checking to
see whether a product or service being developed is meeting
specified requirements. A quality assurance system is said to
increase customer confidence and a company's credibility, to
improve work processes and efficiency, and to enable a
company to better compete with others. ISO 9000 is an
international standard that many companies use to ensure
that their quality assurance system is in place and effective.
Total Quality Management
Total Quality Management (TQM) is a system that emphasizes the concept that quality must be designed and built into a product. To make sure that products and services have the quality they have been designed for, a commitment to quality throughout the organization is required. ‘Total’ stresses the idea that all employees at different levels of the organisation understand the concepts and work towards achieving quality. ‘Quality’ means excellence in all aspects of the organisation, and ‘Management’ refers to the commitment towards quality results through the optimum use of resources.
Total quality management represents a set of management principles that focus on quality improvement as the driving force in all functional areas and at all levels in a company. These principles are:
1. The customer defines quality, and customer satisfaction is the top priority.
2. Top management provides the leadership for quality.
3. Quality is a strategic issue and requires a strategic plan.
4. Quality is the responsibility of every employee at every level of the organisation.
5. All functions of the company must focus on continuous quality improvement to achieve strategic goals.
6. Quality problems are solved through cooperation among employees and management.
7. Problem solving and continuous quality improvement use statistical quality control methods.
8. Training and education of all employees is the basis for continuous quality improvement.
The foundation for modern TQM programs is the Plan-Do-Check-Act (PDCA) cycle: Plan → Do → Check → Act, repeated continuously.
In the planning phase, define the problem to be addressed, collect relevant data, and ascertain the problem's root cause. In the doing phase, develop and implement a solution, and decide upon a measurement to gauge its effectiveness. In the checking phase, confirm the results through before-and-after data comparison. In the acting phase, document the results, inform others about process changes, and make recommendations for the problem to be addressed in the next PDCA cycle.
Deming methods
Deming's overall philosophy for achieving quality improvement is embodied in his 14 points, as follows:
1. Create constancy of purpose for improving products and services.
2. Adopt the new philosophy.
3. Cease dependence on inspection to achieve quality.
4. End the practice of awarding business on price alone; instead, minimize total cost by working with a single supplier.
5. Improve constantly and forever every process for planning, production and service.
6. Institute training on the job.
7. Adopt and institute leadership.
8. Drive out fear.
9. Break down barriers between staff areas.
10. Eliminate slogans, exhortations and targets for the workforce.
11. Eliminate numerical quotas for the workforce and numerical goals for management.
12. Remove barriers that rob people of pride of workmanship, and eliminate the annual rating or merit system.
13. Institute a vigorous program of education and self-improvement for everyone.
14. Put everybody in the company to work accomplishing the transformation.
Juran methods
Juran's 10 steps to quality improvement cover three main areas of management decision-making:
a. Quality planning involves building an awareness of the need to improve, setting goals and planning the ways for the goals to be reached. This begins with management's commitment to plan changes. It also requires a highly trained and qualified staff.
b. Quality control means developing ways to test products and services for quality. Any deviation from the standard will require changes and improvements.
c. Quality improvement is a continuous pursuit toward perfection. Management analyses processes and systems and reports back with praise and recognition when things are done right.
Taguchi Method
Some of the major contributions that Taguchi made to quality improvement are as follows:
a. The Loss Function – Taguchi devised an equation to quantify the decline of a customer's perceived value of a product as its quality declines. Essentially, it tells managers how much revenue they are losing because of variability in their production process. It is a powerful tool for projecting the benefits of a quality improvement program. Taguchi was the first person to equate quality with cost.
b. Orthogonal Arrays and Linear Graphs – When evaluating a production process, analysis will undoubtedly identify outside factors, or noise, which cause deviations from the mean. Isolating these factors to determine their individual effects can be a very costly and time-consuming process. Taguchi devised a way to use orthogonal arrays to isolate these noise factors from all others in a cost-effective manner.
c. Robustness – Some noise factors can be identified, isolated and even eliminated, but others cannot. For instance, it is too difficult to predict and prepare for every possible weather condition. Taguchi therefore referred to the ability of a process or product to work as intended regardless of uncontrollable outside influences as robustness. He was pivotal in many companies' development of products and processes which perform uniformly regardless of uncontrollable forces.
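The loss function in item (a) is commonly written as the quadratic L(y) = k(y − m)², with m the target value and k a cost constant. The slides do not give the formula, so this specific form is an assumption based on the standard formulation.

```python
def taguchi_loss(y, target, k):
    """Quadratic quality loss L(y) = k * (y - target)**2: loss is zero
    only exactly on target and grows with the square of the deviation."""
    return k * (y - target) ** 2

# Hypothetical values: target 10.0 mm, k = 500 (cost units per mm^2).
loss_on_target = taguchi_loss(10.0, 10.0, 500)   # zero loss on target
loss_off_target = taguchi_loss(10.2, 10.0, 500)  # loss grows off target
```

Note that, unlike a simple pass/fail tolerance check, this loss is nonzero for any deviation from target, even inside the tolerance band.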
ISO and QS standards
With increasing international trade, global manufacturing and price-sensitive competition, a wide choice of industrial and consumer products has been developed. Customers increasingly demand high-quality products and services at low prices and are looking for suppliers that can respond to this demand consistently and reliably. In turn, this trend has created the need for international conformity and consensus regarding the establishment of methods for quality control, reliability and safety of products.
In addition to these considerations, equally important concerns regarding the environment and the quality of life needed to be addressed. These concerns have led to the emergence of quality standards such as the ISO and QS standards.
ISO 9000 standard
The ISO 9000 series is a set of international standards for
quality management and quality assurance. The ISO 9000
standard was originally published in 1987 by the International
Organization for Standardization (ISO). ISO 9000 is widely
recognized in the world. The goal of ISO 9000 is to embed a
quality management system within an organization, increasing
productivity, reducing unnecessary costs, ensuring quality of
processes and products and increasing customer satisfaction.
The ISO 9000 series includes the following standards:
a. ISO 9001 – Quality systems: model for quality assurance in design/development, production, installation and servicing.
b. ISO 9002 – Quality systems: model for quality assurance in production and installation (obsolete since 2000).
c. ISO 9003 – Quality systems: model for quality assurance in final inspection and test (obsolete since 2000).
Companies voluntarily register to be ISO 9001 certified. For
certification, a company’s plants are visited and audited by
accredited and independent third party teams to certify that
the quality management system (QMS) is in place and
functioning properly.
The ISO principles are:
1. Customer focus
Customer is the primary focus of a business. By understanding
and responding to the needs of customers, an organization
can increase revenue by delivering the products and services
that the customer is looking for. With knowledge of customer
needs, resources can be allocated appropriately and
efficiently. Business’s dedication will be recognized by the
customer, creating customer loyalty.

2. Good Leadership
A team of good leaders will establish unity and direction quickly
in a business environment. Their goal is to motivate everyone
working on the project, and successful leaders will minimize
miscommunication within and between departments. Their role
is intimately intertwined with the next ISO 9000 principle.

3. Involvement of people
The inclusion of everyone on a business team is critical to its
success. Involvement of substance will lead to a personal
investment in a project and in turn create motivated,
committed workers. These people will tend towards
innovation and creativity, and utilize their full abilities to
complete a project. If people have a vested interest in
performance, they will be eager to participate in the
continual improvement that ISO 9000 facilitates.
4. Process approach to quality management
The best results are achieved when activities and resources
are managed together. This process approach to quality
management can lower costs through the effective use of
resources, personnel, and time. If a process is controlled as
a whole, management can focus on goals that are important
to the big picture, and prioritize objectives to maximize
effectiveness.
5. Management system approach
Combining management groups if done correctly can result in
an efficient and effective management system. If leaders are
dedicated to the goals of an organization, they will aid each
other to achieve improved productivity. Some results include
integration and alignment of key processes. Additionally,
interested parties will recognize the consistency,
effectiveness, and efficiency that come with a management
system. Both suppliers and customers will gain confidence in
a business’s abilities.
6. Continual Improvement
The importance of this principle is paramount, and should be a
permanent objective of every organization. Through increased
performance, a company can increase profits and gain an
advantage over competitors. If a whole business is dedicated
to continual improvement, improvement activities will be
aligned, leading to faster and more efficient development.
Ready for improvement and change, businesses will have the
flexibility to react quickly to new opportunities.
7. Factual approach to decision making
Effective decisions are based on the analysis and
interpretation of information and data. By making informed
decisions, an organization will be more likely to make the
right decision. As companies make this a habit, they will be
able to demonstrate the effectiveness of past decisions.
This will put confidence in current and future decisions.
8. Supplier relationships
It is important to establish a mutually beneficial supplier
relationship; such a relationship creates value for both
parties. A supplier that recognizes a mutually beneficial
relationship will be quick to react when a business needs to
respond to customer needs or market changes. Through
close contact and interaction with a supplier, both
organizations will be able to optimize resources and costs.
QS 9000 standard
The automotive Quality System Requirements, QS 9000, is a
harmonized Quality System Standard that was developed by
Chrysler, Ford and General Motors Supplier Quality
Requirement Task Force. To conform with QS 9000
requirements, an automotive supplier must have designed and
implemented a quality system that makes effective use of a
wide variety of concepts and tools. Even though the QS 9000
standard incorporates ISO 9001 plus the automotive
requirements, QS 9000 and ISO 9000 are fundamentally
different in philosophy and approach.
QS 9000 standard
While ISO 9000 defines quality assurance system requirements
to control product quality, QS 9000's approach is preventive,
defining requirements to control and improve the processes
that produce products, using a wide variety of statistical
concepts and tools. ISO/TS 16949:2002 replaced QS 9000 in
2006. TS 16949 is more process driven and is expected to help
streamline quality systems further.
ISO 14000 standard
ISO 14000 is a group of voluntary international standards
addressing environmental management systems,
environmental auditing, environmental labelling, environmental
performance evaluation, and life cycle assessment. The ISO
14000 series provides an organization with a systematic
approach to environmental management, and aims to address
environmental issues such as pollution, waste generation and
disposal, noise, depletion of natural resources and energy use.
5.3 Testing and Inspection
Statistical process control
Statistical Process Control (SPC) is an industry-standard
methodology for measuring and controlling quality during the
manufacturing process. Statistics is the study of the collection,
analysis, interpretation, presentation, and organization of data.
Process is a combination of machines, equipment, people, raw
materials, methods and environment that produces a product.
Control refers to the action of directing or regulating a process
so that it behaves the way it is meant to behave. Therefore,
SPC is a method of quality control that uses statistical
methods to monitor and control a process.
Statistical process control
SPC has become the new standard for quality control
because it:
• Increases customer satisfaction.
• Decreases scrap, rework, and inspection costs.
• Decreases operating costs.
• Improves productivity.
• Sets a predictable and consistent level of quality.
• Eliminates or reduces the need for inspection by the
customer.
Shewhart's control charts
Understanding variation is the key to SPC. As variation is
always present in manufacturing processes and cannot be
prevented, an operator must learn to recognize the types of
variation and control the variables that can be controlled.
There are two types of variation: inherent and assignable.
Inherent variation is the kind that cannot be prevented; it is
always present and occurs randomly, without a set pattern.
Inherent variation cannot be eliminated or corrected.
Examples of inherent variation are machine vibration and
inherent material variation. Assignable variation, by contrast,
has an identifiable cause and can be eliminated once that
cause is found and corrected.
Shewhart's control charts
[Control chart figures not reproduced.]
The standard deviation of the process is estimated from the
average sample range as σ̂ = R̄ / d₂, where d₂ is a constant
that depends on the sample size (see the table below).
X-bar and R charts

Sample size (n)    A2       D3       D4       d2
      2          1.880     0       3.267    1.128
      3          1.023     0       2.574    1.693
      4          0.729     0       2.282    2.059
      5          0.577     0       2.114    2.326
      6          0.483     0       2.004    2.534
      7          0.419    0.076    1.924    2.704
      8          0.373    0.136    1.864    2.847
      9          0.337    0.184    1.816    2.970
     10          0.308    0.223    1.777    3.078
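As a concrete illustration, the control limits for the X-bar and R charts can be computed from the tabulated constants (UCLx̄ = X̿ + A2·R̄, LCLx̄ = X̿ − A2·R̄, UCLR = D4·R̄, LCLR = D3·R̄, and σ̂ = R̄/d2). The Python sketch below uses hypothetical measurement data; the function and variable names are illustrative, not from any particular SPC library.

```python
# Control limits for X-bar and R charts from the tabulated constants.
# Only a subset of the table (n = 2..5) is included here.
CONSTANTS = {  # n: (A2, D3, D4, d2)
    2: (1.880, 0, 3.267, 1.128),
    3: (1.023, 0, 2.574, 1.693),
    4: (0.729, 0, 2.282, 2.059),
    5: (0.577, 0, 2.114, 2.326),
}

def control_limits(subgroups):
    n = len(subgroups[0])
    a2, d3, d4, d2 = CONSTANTS[n]
    xbars = [sum(s) / n for s in subgroups]          # subgroup means
    ranges = [max(s) - min(s) for s in subgroups]    # subgroup ranges
    xbarbar = sum(xbars) / len(xbars)                # grand mean
    rbar = sum(ranges) / len(ranges)                 # mean range
    return {
        "xbar_ucl": xbarbar + a2 * rbar,
        "xbar_lcl": xbarbar - a2 * rbar,
        "r_ucl": d4 * rbar,
        "r_lcl": d3 * rbar,
        "sigma_hat": rbar / d2,                      # sigma estimated from R-bar
    }

# Hypothetical data: 4 subgroups of 5 measurements each
data = [[10.1, 10.0, 9.9, 10.2, 10.0],
        [10.0, 9.8, 10.1, 10.0, 10.1],
        [9.9, 10.2, 10.0, 10.1, 9.9],
        [10.0, 10.0, 10.1, 9.9, 10.2]]
limits = control_limits(data)
```

Any subgroup mean falling outside `xbar_ucl`/`xbar_lcl`, or any range outside `r_ucl`/`r_lcl`, would signal an out-of-control condition as described below.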
Out of Control Conditions:
1. If one or more points fall outside the upper control
limit (UCL) or lower control limit (LCL). The UCL and LCL
are three standard deviations on either side of the mean -
see section A.
2. If two out of three successive points fall in the area
beyond two standard deviations from the mean, either above
or below - see section B.
3. If four out of five successive points fall in the area
beyond one standard deviation from the mean, either above
or below - see section C.
Out of Control Conditions:
4. If there is a run of six or more points that are all either
successively higher or successively lower - see section D.
5. If eight or more successive points fall on the same side of
the mean (some organizations use 7 points, some 9) - see
section E.
6. If 15 points in a row fall within one standard deviation of
the mean, on either side - see section F.
When an out-of-control condition occurs, the process is
stopped. The root cause of the problem is investigated and
corrective actions are taken before the process is resumed.
SPC monitoring then continues to confirm that the process
remains in control.
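Two of the rules above can be sketched in code: rule 1 (a point beyond three standard deviations) and rule 4 (a run of six successively rising or falling points). The function names are illustrative, and the remaining rules follow the same pattern.

```python
# Sketches of out-of-control rule checks on a series of subgroup means.

def beyond_3_sigma(points, mean, sigma):
    """Rule 1: any point outside mean +/- 3 sigma."""
    return any(abs(p - mean) > 3 * sigma for p in points)

def run_of_six(points):
    """Rule 4: six or more points, all successively rising or falling."""
    rising = falling = 1  # length of the current monotone run, in points
    for prev, cur in zip(points, points[1:]):
        rising = rising + 1 if cur > prev else 1
        falling = falling + 1 if cur < prev else 1
        if rising >= 6 or falling >= 6:
            return True
    return False
```

In practice such checks run on each new subgroup mean as it is plotted, and any `True` result triggers the stop-and-investigate procedure described above.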
Process capability
Process capability is the measure of process performance.
Capability refers to the ability of a process to make parts that
are well within engineering specifications. Process capability is
only carried out once the process is considered statistically
stable and in control with the use of control charts. A process
is considered highly capable when the entire normal
distribution of the process falls within the upper limit and lower
limit of the design specification and is centred between those
limits.
Process capability
The indices used to study the capability of a process are Cp
and Cpk. Cp assesses whether the natural tolerance of the
process, 6σ, is within the design specification limits.
Process capability

Cp = (USL - LSL) / 6σ
Process capability

Cpk relates the scaled distance between the process mean
and the nearest specification limit:

Cpu = (USL - µ) / 3σ
Cpl = (µ - LSL) / 3σ
Cpk = min { Cpu , Cpl }
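A minimal sketch of these formulas in Python; the specification limits and process statistics in the example are hypothetical values, chosen only to show the arithmetic.

```python
# Cp and Cpk from the formulas above.

def process_capability(usl, lsl, mean, sigma):
    cp = (usl - lsl) / (6 * sigma)     # natural tolerance vs. spec width
    cpu = (usl - mean) / (3 * sigma)   # distance to upper spec limit
    cpl = (mean - lsl) / (3 * sigma)   # distance to lower spec limit
    cpk = min(cpu, cpl)                # nearest limit governs
    return cp, cpk

# Hypothetical example: spec 10.0 +/- 0.3, process centred slightly high
cp, cpk = process_capability(usl=10.3, lsl=9.7, mean=10.1, sigma=0.05)
```

Note that Cpk < Cp whenever the process mean is off-centre; the two are equal only for a perfectly centred process.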
Non-destructive testing
Non-destructive testing (NDT) is carried out in such a manner
that product integrity and surface texture remain unchanged.
NDT is mainly used for evaluating materials, components or
assemblies for discontinuities, or differences in characteristics
without destroying the serviceability of the part or system. In
other words, when the inspection or test is completed the part
can still be used.
Non-destructive testing
Modern non-destructive tests are used in manufacturing,
fabrication and in-service inspections to ensure product
integrity and reliability, to control manufacturing processes,
to lower production costs and to maintain a uniform quality
level.
Non-destructive testing
The NDT methods are as below:
a. Acoustic Emission Testing (AE);
b. Electromagnetic Testing (ET);
c. Guided Wave Testing (GW);
d. Ground Penetrating Radar (GPR);
e. Laser Testing Methods (LM);
f. Leak Testing (LT);
g. Magnetic Flux Leakage (MFL);
h. Microwave Testing;
i. Liquid Penetrant Testing (PT);
j. Magnetic Particle Testing (MT);
k. Neutron Radiographic Testing (NR);
l. Radiographic Testing (RT);
m. Thermal/Infrared Testing (IR);
n. Ultrasonic Testing (UT);
o. Vibration Analysis (VA); and
p. Visual Testing (VT).
Destructive testing
A part tested using destructive testing (DT) no longer
maintains its integrity, original shape or surface
characteristics. Destructive tests are often used to determine
the mechanical properties of materials such as impact
resistance, ductility, yield and ultimate tensile strength,
fracture toughness and fatigue strength. In other words, when
the inspection or test is completed the part cannot be used
further.
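As an illustration of the data reduction a destructive tensile test involves, the sketch below computes engineering stress, strain, and ultimate tensile strength from raw load-extension readings. All numbers and names are hypothetical, assuming a cylindrical specimen; it is not a substitute for a standardized test procedure.

```python
# Sketch: engineering stress/strain and UTS from hypothetical tensile data.
import math

def tensile_summary(loads_n, extensions_mm, diameter_mm, gauge_length_mm):
    area = math.pi * (diameter_mm / 2) ** 2        # original cross-section, mm^2
    stresses = [f / area for f in loads_n]         # engineering stress, MPa (N/mm^2)
    strains = [e / gauge_length_mm for e in extensions_mm]
    return {
        "uts_mpa": max(stresses),                  # ultimate tensile strength
        "strain_at_fracture": strains[-1],
    }

# Hypothetical readings for a 10 mm diameter, 50 mm gauge-length specimen
summary = tensile_summary(
    loads_n=[0, 5000, 9000, 12000, 11000],
    extensions_mm=[0.0, 0.1, 0.3, 1.5, 2.5],
    diameter_mm=10.0,
    gauge_length_mm=50.0,
)
```

The peak of the engineering stress curve gives the ultimate tensile strength; yield strength and toughness would require further analysis of the full curve.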
Destructive testing
Destructive testing is generally most suitable and economical
for mass-produced parts, as the cost of destroying a small
number of parts is negligible. Destructive tests are best when
used together with non-destructive methods: this combination
gives the best information on materials and welds.
Non-destructive tests show if cracks, corrosion or other faults
exist. Destructive tests in turn indicate how and when the
objects are in danger of breaking down or failing.
Destructive testing
The DT methods that find application in manufacturing are
as below:
a. Tensile Test;
b. Notch Toughness Test;
c. Bend Test; and
d. Nick Break Test.
Inspection
Inspection has become an essential part of any
manufacturing system. It is the means of rejecting
nonconformities and assuring good-quality products.
Inspection can be divided into manual inspection and
automated inspection.
Inspection
Manual inspection, performed by human operators, is the
traditional inspection method. It is also known as post-process
or off-line inspection, as individual parts and assemblies, once
manufactured in batches, are sent to the quality control room
for inspection by the quality inspectors. If the parts do not
pass the quality inspection, they are either scrapped or
reworked. Manual inspection does not allow 100% inspection
of all the parts that are manufactured, as it would be costly
and time consuming.
Inspection
Visual inspection can be considered a form of manual
inspection. Manual inspection is obviously prone to human
error (both Type I error, which occurs when a good lot is
rejected and is called producer's risk, and Type II error, which
occurs when a bad lot is accepted and is called consumer's
risk), but it is a fast and cheap inspection method.
Inspection
In the past few decades, massive growth has taken place in
sensor and computer technology, and this has resulted in the
wide-ranging acceptance of automated inspection systems for
maintaining strict quality standards. Naturally, the emergence
of automated inspection systems has pushed manual
inspection into the back seat, thanks to advantages in
accuracy and time saving.
Inspection
For mass production, automated systems are used for 100%
inspection. Automated inspection is also known as in-process
or on-line inspection, as inspection is mainly done as parts are
manufactured. Automated inspection may also be used for
post-process inspection; for example, a coordinate measuring
machine (CMM) may be used to check the dimensions of an
object.
Inspection
Even though manual inspection has largely been replaced by
automated inspection, since automation reduces errors to a
great extent, the economic justification of an automated
inspection system depends on whether the savings in labour
cost and the improvement in accuracy exceed the investment
and/or development costs of the system. The use of machine
vision for detecting defects in integrated circuit chips or
printed circuit boards (PCB inspection tasks are complex and
difficult for human workers) is an example of an automated
inspection system.