Advanced Manufacturing Process Analysis (Course 4)-Key Takeaways
Takeaways
Below you will find a number of key points from this course.
We are in a phase in which the integration of the digital and physical worlds has begun. Connections are enabled by placing sensors on products and machines, connecting them to the internet, and analyzing the resulting flow of data. This convergence of smart machines, machines with sensors, the internet, and data is referred to as the Internet of Things, or IoT.
• Sensors on an IoT platform generate data. Expressed a different way, IoT sensors leave digital threads.
When advanced analysis is performed, raw data is collected, processed, and transformed into information. That information supports an informed decision or action we can take.
Analytics connects the digital and physical worlds, and links the product design phase to the manufacturing, product use, and end-of-life phases. The exciting benefits of analytics include better designs, better machines, and enhanced manufacturing.
A great example of the use of advanced analysis is predictive maintenance: predictions based on analytics tell us when a machine needs maintenance, and this plays a major role in improved asset utilization.
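As a rough sketch of the idea (assuming NumPy is available), the snippet below fits a straight-line trend to hypothetical vibration readings and estimates how long until the trend crosses an alarm threshold; the readings, units, and threshold are invented for illustration and do not come from the course.

    import numpy as np

    readings = np.array([2.1, 2.2, 2.4, 2.5, 2.7, 2.9, 3.0, 3.2])  # mm/s, one per hour
    hours = np.arange(len(readings))
    alarm_threshold = 4.0  # hypothetical vibration limit for this machine

    # Fit a straight-line trend to the readings and extrapolate it forward.
    slope, intercept = np.polyfit(hours, readings, 1)
    if slope > 0:
        hours_until_alarm = (alarm_threshold - readings[-1]) / slope
        print(f"Schedule maintenance within roughly {hours_until_alarm:.0f} hours")
    else:
        print("No upward trend detected; no maintenance predicted yet")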
Discrete manufacturing refers to the manufacturing process in which the output is an individual
unit.
In continuous manufacturing, the output typically flows continuously and cannot be counted as individual units. The output is typically measured in weights, volumes, and percentages.
Data analysis, whether it takes place in a discrete setting or a continuous setting, occurs in four stages (a minimal sketch follows the list):
• Stage one: data collection
• Stage two: data storage
• Stage three: data pre-processing. In this stage the data is refined so that it can be
processed further.
• Stage four: data analysis. In this stage, pre-processed data is converted into actionable
intelligence.
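As a minimal sketch of these four stages, the Python snippet below collects a handful of hypothetical temperature readings, stores them to a file, filters obviously impossible values, and reports an average; the file name, readings, and cut-off value are assumptions made purely for illustration.

    import csv
    import statistics

    def collect():
        # Stage one: data collection (hypothetical sensor readings in degrees C)
        return [21.5, 21.7, 22.0, 99.9, 22.1]

    def store(readings, path="readings.csv"):
        # Stage two: data storage (write the raw readings to a CSV file)
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows([[r] for r in readings])

    def preprocess(readings):
        # Stage three: pre-processing (drop obviously impossible values)
        return [r for r in readings if r < 50]

    def analyze(readings):
        # Stage four: analysis (turn the cleaned data into a usable number)
        return statistics.mean(readings)

    raw = collect()
    store(raw)
    print("Average temperature:", analyze(preprocess(raw)))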
Data is of limited use if it is not converted into actionable information. You can be data rich and information poor.
There are two different types of data sets: traditional data sets and big data.
• Big data is a term used to describe very large data sets that are very complex to analyze.
• Traditional analysis methods cannot be applied to big data sets in order to generate
actionable intelligence.
A traditional data set is one in which the data has been stored in a centralized location, at a single point, perhaps in a single database. Additionally, there is an assumption that the data is in a structured format. These two qualities make traditional data sets very easy to query. Traditional data sets are also typically small, with the largest ranging from gigabytes to terabytes.
Data collection is the process of systematically gathering data from various sources suited to
your purpose.
Some common errors that can exist in data sets, and that can be eliminated in the pre-processing step, are incomplete data sets, inconsistent data sets, and noisy data sets.
• When we say a data set is incomplete, it means that we do not have all the required data entries in the overall data set.
• Inconsistency refers to discrepancies in the data.
• Noisy data sets typically contain outlier data points that can lead to wrong information. As a precaution, always search for abnormal data points before analyzing a noisy data set. Checking for noisy data points is one of the most important steps in data pre-processing, because it helps confirm that no mistakes were made during the data collection process. A short pre-processing sketch follows this list.
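The sketch below shows one way these three problems might be handled; it assumes the pandas library is available, and the column names and example values are hypothetical rather than taken from the course.

    import pandas as pd

    df = pd.DataFrame({
        "temperature": [21.5, None, 22.0, 95.0, 21.8],  # None -> incomplete entry
        "unit":        ["C", "C", "c", "C", "C"],       # "c" vs "C" -> inconsistency
    })

    df["unit"] = df["unit"].str.upper()        # resolve the inconsistent unit labels
    df = df.dropna(subset=["temperature"])     # drop the incomplete row

    # Flag noisy points: anything far from the median is treated as an outlier here.
    median = df["temperature"].median()
    df = df[(df["temperature"] - median).abs() < 10]

    print(df)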
A common expression, garbage in, garbage out (GIGO), is a reminder of the importance of having the right data for generating actionable intelligence. If you do not have the right data, it is unlikely that you will generate the right information for your purpose.
Sensitivity analysis (sometimes called what-if analysis) is used to determine why, and how, uncertainty in different scenarios affects the overall outcome.
• If some of my inputs change, what is the impact on my output? Techniques such as sensitivity analysis aid in identifying and weighing the sources of these uncertainties.
• It also provides the user with a probable glimpse of the future. By using sensitivity analysis, we can explore various scenarios and make better decisions; a small what-if sketch follows this list.
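The sketch below varies one input of a made-up unit-cost model and prints how the output responds. The model, the scrap-rate values, and the baseline numbers are all assumptions chosen for illustration.

    def unit_cost(material_price, scrap_rate, labor=4.0):
        # A made-up cost model: material plus labor, inflated by the scrap rate.
        return (material_price + labor) / (1.0 - scrap_rate)

    baseline = unit_cost(material_price=10.0, scrap_rate=0.05)
    for scrap in (0.02, 0.05, 0.10, 0.15):   # what if the scrap rate changes?
        cost = unit_cost(material_price=10.0, scrap_rate=scrap)
        change = (cost - baseline) / baseline
        print(f"scrap rate {scrap:.0%}: unit cost {cost:.2f} ({change:+.1%} vs baseline)")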
An anomaly can be described as a pattern in the data that does not conform to any expected
normal behavior.
• In data analysis, anomaly detection is also known as outlier detection. It is used to identify abnormal data points that do not conform to any normal pattern or behavior, as in the sketch below.
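The sketch below applies one simple outlier rule: flag any reading that falls more than two standard deviations from the mean. The readings and the two-standard-deviation threshold are assumptions for illustration; real anomaly-detection methods can be far more sophisticated.

    import statistics

    readings = [20.1, 20.3, 19.9, 20.2, 35.7, 20.0, 20.4]
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)

    # Flag points far from the mean as possible anomalies (outliers).
    anomalies = [r for r in readings if abs(r - mean) > 2 * stdev]
    print("Possible anomalies:", anomalies)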
High performance computing (HPC) refers to the practice of aggregating multiple computers and applying parallel processing techniques (a small sketch of the parallel-processing idea follows the bullet below).
• HPC can be used to analyze and convert large amounts of data quickly, efficiently, and
with high reliability.
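As a rough illustration of the parallel-processing idea, on a single machine rather than a true HPC cluster, the sketch below splits hypothetical batches of readings across several worker processes; the batch contents and worker count are assumptions made for illustration.

    from multiprocessing import Pool
    import statistics

    def analyze_batch(batch):
        # Stand-in for an expensive analysis of one chunk of readings.
        return statistics.mean(batch)

    if __name__ == "__main__":
        # Eight hypothetical batches of readings, analyzed by four workers in parallel.
        batches = [[20.0 + j * 0.1 + i * 0.001 for i in range(1000)] for j in range(8)]
        with Pool(processes=4) as pool:
            results = pool.map(analyze_batch, batches)
        print("Per-batch averages:", results)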
The cloud excels at collecting and analyzing large volumes of data in a wide variety of applications at incredible velocity; all of this can be done without significant investment in infrastructure.