ControlTalk Oct-2023

CONTROL TALK

Modeling and control opportunities, Part 1

How dynamic modeling and digital twins can accelerate industrial operations

by Greg McMillan

Greg: Dynamic modeling has been the most important technology in my 50-year career. It’s the source
of all my deeper process control knowledge. My work with compressor surge modeling in 1976 opened
the door for me to go from being an instrument design and construction engineer to a process modeling
and control specialist in Monsanto’s Engineering Technology (ET), where I worked with brilliant experts,
who were proficient in steady-state and dynamic modeling. The steady-state modeling software we
developed was donated to the Federal Aspen Research Institute, which eventually led to AspenTech.

With ET, I used models to develop, prototype and test innovations that improved process performance
by employing better control strategies, valves, PID algorithms and measurements. This was particularly
important for the more challenging applications such as bioreactor control, centrifuge control,
compressor control, dryer control, exothermic reactor control, fermenter control, furnace pressure
control and pH control. Also, modeling the fundamentally different dynamics of self-regulating,
integrating and runaway processes enabled me to extrapolate principles to address the possibilities for
nearly all applications.

Greg Shinskey, whose publications are my greatest resource, and many of the people I’ve written about
in these Control Talk columns use models to explore the capabilities of PID control. Some of my closest
associates, who are experts in distillation control, use steady-state models to define relative gain
matrices, and find the best tray for column temperature control. Many of my publications and the
sections I wrote for the ISA-TR5.9-2023 PID Algorithms and Performance technical report are a
reflection of what I learned via modeling.

It's disappointing that process control engineers aren't given time by management to invest in using
modeling to improve process performance. I personally regret not making the extra effort five years
ago to help engineers interested in modeling actively seek out and start potential applications.
Engineers are increasingly stressed to the limit by tightened project schedules and budgets. How can
we motivate and train engineers to turn this problem around, and let them promote the value of
models?

To broaden our horizons and better understand how to make the most of these opportunities, I asked
José María Ferrer, who has more than 25 years of experience in the dynamic simulation and control of
hydrocarbon processes. He began his career as a process control engineer at Dow Chemical in 1995, and
joined Hyprotech as EMEA Operator Training System (OTS) business development leader in 2001. In
2004, as AspenTech's senior consultant, he executed several dynamic simulation projects applied in new
areas such as emergency shutdown (ESD) verification and advanced process control (APC). He then
developed the APC business in Europe, executing several dynamic simulation projects to support
APC implementations and new AspenTech OTS offerings. In 2010, he joined Inprocess to offer dynamic
simulation services and launch the OTS business. Since 2014, he’s been developing and teaching a new
simulation training course specially tailored to process control engineers. Since 2018, he’s also been
analyzing using online simulations to support operations in anomaly detection, and in 2019, he began
leading projects to exploit simulation for offline and online applications.

José, what can we do to educate and motivate process control engineers to take advantage of
modeling?

José: We can make them aware of the value they'll get from placing variable economic compensation
based on controller performance. I remember my first day at Hyprotech: I was given a laptop and a
two-hour tour of the simulation tool, building a dynamic model. I'm still amazed at the capabilities
of dynamic and steady-state process simulators. Also, there’s still a lot of thinking in silos, where process
simulation is only for process engineers (designs/revamps mainly in steady state), and
automation/control departments should not touch that simulation territory. This leaves the “dynamic
simulation” in no-man’s land. Building high-quality dynamic models is a time-consuming task and
requires some experience. You must choose the right simulation tools and understand the limitations.

We need to explain that it's one thing to build a dynamic model and another to use it. I don't see all
process control engineers at a company as model builders; maybe it's only a small group. The rest can
be true users of the built models. If you don't have that small group, you can contract experts to make
them for you. We're constantly doing simulation projects for advanced and basic process control
departments when things become complex.

Greg: What are the differences between steady-state and dynamic models, and what are the uses and
cases?

José: Steady-state simulation (where time doesn't exist, and there are no inventories or accumulation)
started in 1950, using machine code developed for limited-scope, single-use applications. Nowadays,
it's large-scope and multi-use, but mainly for process design.

Dynamic simulation (where you have time, inventories and controllers, as in a real plant) followed the
same path, but 10 years later due to greater computational requirements. I'm truly amazed by the
capabilities of today's personal computers (PCs): how large, how fast and how detailed a dynamic
model you can build. They're the same PCs that run flight simulators. Compare Microsoft's Flight
Simulator software from the 1990s to today's versions. The last 30 years have been amazing for both
flight and process simulators.

Beyond design, steady-state models are valuable in control areas, especially for developing inferentials,
column composition profile dynamics, optimal sensor location, APC, deep reinforcement learning (DRL)
benefit estimation, column feed locations and optimal targets. There’s also value for addressing the
impact of controlled variables (CV), manipulated variables (MV) and disturbance variables (DV) via
CV/MV and CV/DV gains for the whole operational envelope.
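As a minimal illustration of screening CV/MV gains from a steady-state model, a relative gain array (RGA) can be computed directly from a 2x2 gain matrix. This is a sketch in pure Python; the gain values are hypothetical, not taken from any real column:

```python
def rga_2x2(K):
    """Relative gain array for a 2x2 steady-state gain matrix K.

    For the 2x2 case, lambda_11 = 1 / (1 - (K12*K21)/(K11*K22)),
    and each row and column of the RGA sums to one.
    """
    k11, k12 = K[0]
    k21, k22 = K[1]
    lam = 1.0 / (1.0 - (k12 * k21) / (k11 * k22))
    return [[lam, 1.0 - lam],
            [1.0 - lam, lam]]

# Hypothetical distillation gains: rows are top/bottom compositions,
# columns are reflux and boilup moves
K = [[0.8, -0.5],
     [0.6, -0.9]]
print(rga_2x2(K))
```

A relative gain near one on the diagonal supports the diagonal CV/MV pairing; values far from one, or negative, warn of strong interaction over the operating envelope.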

Beyond OTSs and integrated control and safety system (ICSS) checkout, dynamic models are valuable in
control areas for benchmarking alternative control schemes, tuning PIDs in complex processes (for
example, slug flows), obtaining plant curve responses for all plant states for APC/DRL multivariable
controllers, tuning and testing APC/DRL, and getting online models for inferentials.

When I was at AspenTech, I tried to create a three-day course to teach steady-state and dynamic
modeling to process control engineers, who hadn’t previously seen simulations. My managers asked for
a business case for such a course. In 2010, at Inprocess, it took me only one minute to convince my boss
to create it.

Greg: I’ve used the term “virtual plant,” but was recently made aware of the preferred “digital twin”
term that’s aligned with the fervor over “digitalization.” For me, the key feature is the ability to use the
actual control system configuration and operator interface, eliminating the difficult and perilous task of
recreating these by interfacing or downloading the actual software. This inherently makes empowering
new PID features, data analytics tools and model-predictive control readily available. The cost of a digital Formatted: Font colour: Auto
twin is an order of magnitude less than the older modeling technology for OTS that required programing
the algorithms and interfaces besides the simulation. These systems could rarely be used for process
control improvement because the control capability programmed was very limited or wrong. Today’s
digital twins offer incredible opportunities to find and quantify opportunities noted in my earlier Control
Talk column “Simulation breeds innovation” and my Control articles “Virtual plant virtuosity” and Formatted: Font colour: Auto
“Bioreactor control breakthroughs: Biopharma industry turns to advanced measurements, simulation Formatted: Font colour: Auto
and modeling.” For my take on building and using key performance indicators (KPI) and inferential Formatted: Font colour: Auto

measurements, see “Control Talk: Top of the bottom line.” Formatted: Font colour: Auto
Formatted: Font colour: Auto

José, what do you see as the value of digital twins?

José: I don't like the phrase "digital twin." Nowadays, people use it to name almost everything from a
three-dimensional model to a statistical model to a mechanical model, even an OTS. It's applied not
only to a processing plant, but also to an airplane, airport, building, wind turbine or human heart.
To avoid confusion, and talk about the process industries we work in (mainly oil and gas, refining and
chemicals), I prefer to call this topic real-time simulation (RTS). This is a detailed, dynamic-simulation
model running online and in synchrony with the real plant. Usually, this is the second life of the so-called
multi-purpose dynamic simulator (MPDS), which covers dynamic studies, early-OTS, control narrative
verification, operating procedures development, ICSS checkout and direct-connect OTS.

I remember the first time I ran a dynamic model of a C3 splitter against one week of historical process
data sampled at one-minute intervals using a simple Excel macro, and obtained a very good match with
the bottom online analyzer. I saw the potential of using it in real-time, and shared my thoughts within
the company, but surprisingly my managers were not impressed.

If we talk about the value of digital twins, the first thoughts that come to mind include ensuring your
plant is running as it should every second and, if it's not, detecting this immediately. This enables early
detection and diagnosis of small anomalies, which normally grow with time. Then, there's extra value
gained from virtual instrumentation (pressure, temperature, flow), equipment KPIs, emissions KPIs,
what-if analyses at the present time for any (blue) parameter, and historical model repositories. The list
goes on and on.
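The early-detection idea boils down to a residual check between a plant measurement and its real-time simulation counterpart. The sketch below is illustrative only; the threshold and values are assumptions, and real RTS deployments use per-signal limits and filtering:

```python
def detect_anomaly(measured, simulated, rel_tol=0.02):
    """Flag a plant measurement that deviates from its real-time
    simulation value by more than rel_tol (fractional deviation).

    The 2% default threshold is an illustrative assumption.
    """
    deviation = abs(measured - simulated) / max(abs(simulated), 1e-12)
    return deviation > rel_tol

# Hypothetical column bottom temperature, degrees C
print(detect_anomaly(measured=103.0, simulated=100.0))  # True: 3% deviation
print(detect_anomaly(measured=100.5, simulated=100.0))  # False: 0.5% deviation
```

Run every scan against the synchronized model, a check like this surfaces small anomalies long before they grow into alarms.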

Greg: Some closing thoughts from this discussion include asking how process engineers can achieve
dynamic simulation speedup for slow-continuous and batch processes. For large distillation columns,
the time to steady state can be 12 hours or more. For bioreactors producing modern biologics, the
batch cycle time can be 12 days or more. The controller should have the same speedup; otherwise,
special care must be taken to reduce dead time and modify scales and tuning.

For speedups of five to 10 times real-time, models that exist in the virtual plant controller can have the
same speedup factor as the control system and operator interface. This should suffice for slow-
continuous operations. For the much faster speedups needed by bioreactors, media speedup factors can
be 20 times real-time, and kinetic speedup factors can be 10 or more. The total speedup for media is the
product of the two factors, and is 200 or more.

The capacity of final control elements (control valves and variable speed drives) and flow-measurement
spans must be increased by the kinetic speedup factor, so the flow controller tuning doesn’t change
much. However, the primary time constant will decrease and the integrating process gain will increase
by the total speedup factor. The total loop dead time must be decreased proportionally to the speedup
factor for composition, dissolved oxygen, pH and temperature loops to avoid a large disruption to the
tuning needed.
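These scaling rules can be summarized in a short sketch. The baseline span, time constant, dead time and integrating gain below are hypothetical values for illustration, not from a real loop:

```python
def scale_for_speedup(kinetic_factor, media_factor,
                      flow_span, time_constant, dead_time,
                      integrating_gain):
    """Apply the speedup scaling rules described above: flow spans
    (and final-element capacity) scale up by the kinetic factor, while
    the primary time constant and total loop dead time shrink, and the
    integrating process gain grows, by the total speedup factor."""
    total = kinetic_factor * media_factor
    return {
        "total_speedup": total,
        "flow_span": flow_span * kinetic_factor,
        "time_constant": time_constant / total,
        "integrating_gain": integrating_gain * total,
        "dead_time": dead_time / total,
    }

# Example: kinetic factor 10, media factor 20 -> total speedup 200
scaled = scale_for_speedup(10, 20, flow_span=100.0,
                           time_constant=3600.0, dead_time=60.0,
                           integrating_gain=0.001)
print(scaled["total_speedup"])  # 200
```

Because the flow span scales with the kinetic factor, the flow controller tuning stays roughly unchanged, while the composition, dissolved oxygen, pH and temperature loops see their dynamics compressed by the total factor.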

For more on bioreactor virtual plant speedup, see the ISA book, New Directions in Bioprocess Modeling
and Control, Second Edition.


SIDEBAR:

Top 10 things you don’t want to hear during a new plant startup

10. We just did a copy of another plant’s configuration and interface.


9. The vintage process and instrument diagrams told us everything we needed.
8. The mechanical engineers will ensure the equipment and piping work well.
7. We use tight shutoff rotary valves, so we have the greatest capacity and least leakage.
6. All transmitters are at ground level.
5. All analyzers are in an analyzer house.
4. All alarms available were used and set based on vintage process flow diagrams.
3. Advanced operator graphics eliminated the need for an OTS.
2. We used tieback simulations with the gain and time constant equal to one.
1. Controllers all have tuning settings with gain and reset in repeats per minute equal to one.
