Processes need to be controlled; we all know that. But how good is the process automation, and is it helping us to optimize production?
The answer to the first part of that question invites a simple OK/not OK response. The answer to the second part requires some thought. Throw into the mix digitalization, digital twins and Industry 4.0, and you may find the prospect of optimization through automation bewildering.
I am going to look at both parts of that opening question, shed some light on what control and optimization should be doing for your process, and cut through these themes.
Let’s start with “how good is the process automation?” When we ask this question, we are most likely thinking about the basics — e.g., does the batch recipe manager give me the correct set-points for my control loops? How easily can I add/modify recipes? Do my sequences step through OK? Do the control loops regulate OK?
Start with the basics
It won’t come as a surprise that we need to have the basics in good shape before optimizing process operation. There tend to be four culprits when there are issues with the regulatory control loops: (1) poor control loop tuning, (2) valve sizing/stiction, (3) process disturbances or (4) the process variable is simply not controllable in the first instance.
Identifying and remedying those problems is a whole discussion for another day! In the interest of brevity, I will move on on the assumption that challenges at the basic level have been addressed and resolved.
Optimize the process
We can now turn our attention to optimizing the process with the control system. There is an essential “leap” to make here: considering multiple process variables and thinking about this as a multivariable optimization problem. Don’t be put off if that sounds like a math problem (although it is!). The concepts of optimization are easy to grasp. Even better, there are tried-and-tested solutions that have been in real-world use for decades.
In some process industries — mainly petrochemicals — multivariable optimization is so commonplace, it’s taken for granted. Dairy processing is some way from this, yet the number of installed applications is growing, mainly on evaporators and spray dryers.
So, let’s get to grips with the concepts. The most fundamental of all is that we have a digital model of the process, relating process inputs to process outputs. This model exists in software and is derived from the actual plant’s real-world operation.
Second, we have constraints on the variables. These are a mix of equipment limitations, process limitations and product quality specifications.
Third (and it’s stating the obvious), we know the objective of the optimization. That is, we know what we want to do.
In plain English, these three elements put together are a statement of intent. For example: “Push powder moisture to within 2 std. dev. of specification and maximize throughput, making sure that the exhaust humidity stays below 42 grams/kilogram, the static fluid bed powder temperature stays above 65 degrees C, and the evaporator feed pump does not go above 95%.” The nice thing about optimization is that the requirement can be simply and explicitly stated.
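To make that statement of intent concrete, it can be written down as a small linear program: an objective, a model and a set of constraints. Everything in this sketch is invented for illustration — the two decision variables, the linear models relating them to humidity, bed temperature and pump loading, and every coefficient. A real optimizer would use a model identified from the actual plant.

```python
from scipy.optimize import linprog

# Decision variables: x = [feed_rate (t/h), inlet_air_temp (deg C)].
# Objective: maximize feed rate, i.e. minimize -feed_rate.
c = [-1.0, 0.0]

# Inequalities A_ub @ x <= b_ub encode the "statement of intent"
# (all coefficients are made up for the sketch):
A_ub = [
    [0.08, -0.05],  # exhaust humidity: 30 + 0.08*feed - 0.05*temp <= 42
    [0.02, -0.30],  # SFB temperature: 20 + 0.30*temp - 0.02*feed >= 65
    [1.10,  0.00],  # feed pump:       1.10*feed <= 95 (% of pump range)
]
b_ub = [12.0, -45.0, 95.0]
bounds = [(0, 100), (150, 220)]  # physical ranges of the two inputs

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
# res.x holds the recommended operating point; in this toy problem the
# pump constraint is the one that becomes active (binding) at the optimum.
```

The point is not the numbers but the shape of the problem: once model, constraints and objective are stated, the solver finds the best feasible operating point.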
How it works
OK, so how does the optimization actually work? Perhaps this is most easily answered by looking into the world of advanced process control. Up until this point, my focus has been on optimization and how to find an optimal operating point for the process.
Without losing sight of this objective, there is another consideration that we can’t afford to ignore: process variability. This is the hunting ground of model predictive control (MPC). MPC is also a model-based technology but with a different remit: to reduce variability and stabilize the process. Whereas an optimizer is considering steady-state scenarios, MPC operates dynamically, making adjustments to multiple process inputs as fast as every few seconds.
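To make the receding-horizon idea concrete, here is a toy MPC step for a single-variable plant. The plant model, horizon and weights are all invented; real MPC packages handle many variables and hard constraints, but the mechanics are the same: predict over a horizon, optimize the whole move plan, apply only the first move, then repeat at the next sample.

```python
import numpy as np

def mpc_step(x0, a=0.9, b=0.5, ref=1.0, N=10, lam=0.01):
    """One receding-horizon step for the toy plant x[k+1] = a*x[k] + b*u[k].
    Minimizes sum (x - ref)^2 + lam * sum u^2 over the next N moves."""
    F = np.array([a ** (i + 1) for i in range(N)])       # free response
    G = np.zeros((N, N))                                  # forced response
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = a ** (i - j) * b
    # Stacked least squares: track the reference, lightly penalize moves.
    A = np.vstack([G, np.sqrt(lam) * np.eye(N)])
    rhs = np.concatenate([np.full(N, ref) - F * x0, np.zeros(N)])
    u_plan = np.linalg.lstsq(A, rhs, rcond=None)[0]
    return u_plan[0]  # apply only the first move (receding horizon)

# Closed loop: re-solve every sample, apply the first move each time.
x = 0.0
for _ in range(40):
    u = mpc_step(x)
    x = 0.9 * x + 0.5 * u
```

In a real installation the loop above would run on the controller’s sample time, and the “plant” line would be replaced by the actual process responding.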
As you may have guessed by now, the model is at the heart of an optimization scheme. It is in this area that developments in digitalization, digital twins, machine learning and Industry 4.0 have a lot to offer. Optimization models are often simple empirical regression models computed from data.
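As a sketch of what such an empirical model looks like, here is an ordinary least-squares fit of powder moisture against two inputs. The “historian” data is synthetic, generated inside the example, and the coefficients and noise level are invented purely for illustration.

```python
import numpy as np

# Synthetic historian data (invented for the sketch): each row is one
# steady-state operating point [feed_rate, air_temp]; target is moisture.
rng = np.random.default_rng(0)
X = rng.uniform([60, 160], [90, 210], size=(200, 2))
true_coef = np.array([0.05, -0.04])                  # assumed plant behaviour
y = 8.0 + X @ true_coef + rng.normal(0, 0.05, 200)   # moisture + sensor noise

# Fit moisture ~ b0 + b1*feed + b2*air_temp by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
# coef[0] is the intercept; coef[1:] are the input gains the optimizer uses.
```

A regression like this, refreshed as operating data accumulates, is often all the model an optimizer needs to start delivering value.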
A digital twin opens up a world of higher-fidelity mechanistic/hybrid models. The benefits are many: a model can be obtained directly, without plant testing or data gathering, and models can be parameterized by product formulation or equipment scale. Naturally, there are drawbacks too: high-fidelity models are more costly and still require validation against plant data.
With an optimization model obtained, implementation is at the plant automation level. A third-party software package executes the MPC and optimizer, interfacing to the PLC/SCADA to adjust the process control loops automatically. Operator interaction can be as simple as an on/off button on the SCADA or a more informative dashboard.
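A minimal sketch of that supervisory layer, with a plain dict standing in for the PLC tags: the tag names, the rate limit and the dict interface are all invented for illustration; a real deployment would read and write tags over OPC UA or the SCADA package’s own interface.

```python
# Hypothetical tag table -- in a real installation these reads/writes
# would go through the PLC/SCADA interface, not a Python dict.
plc = {"OPT_ENABLE": 1, "FEED_SP": 70.0, "FEED_SP_TARGET": 86.0}

def supervisory_step(plc, max_move=2.0):
    """Move the feed-rate set-point toward the optimizer's target,
    rate-limited so the regulatory loop is never stepped abruptly."""
    if not plc["OPT_ENABLE"]:          # the operator's on/off button
        return
    delta = plc["FEED_SP_TARGET"] - plc["FEED_SP"]
    delta = max(-max_move, min(max_move, delta))
    plc["FEED_SP"] += delta

# In practice this would run on a timer, e.g. every 30 seconds.
for _ in range(10):
    supervisory_step(plc)
```

Note the two design points the article mentions: the optimizer writes set-points to the existing control loops rather than driving actuators directly, and the operator keeps a simple enable/disable switch.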