The Role of Dynamic Simulation in Procedural Automation/State Based Control and Modern Controls
By Dustin Beebe - ProSys
Product: Mimic Simulation


It is a long-standing joke in the controls community that the rest of the industrial world views alarm management software as just that, exclaiming, "It's only software!" The phrase is often invoked when a controls engineer is working on a particularly complex set of controls or automation that will solve a process issue and deliver value to the business. Its irony provides a break in the tension during the serious effort of “programming” the control system to manage the second-to-second operation of the process. An experienced controls engineer remembers the quirk in initialization that has been present for 20 years; a newer engineer might only know about the bug in the analog input channel discovered after last week's control system upgrade.

The Value of Necessary Complexity

The control system and its automation have become key elements in the profitability, safety and reliability of our process operations. Whether those control strategies stand alone or are wrapped in a larger framework, such as Procedural Automation or State Based Control, there is a necessary complexity, or intricacy, within the design and operation of the process and its associated controls. Internally, we refer to complexity that is required to meet business objectives as “necessary complexity.”

In certain areas, standard management systems have been developed for managing this complexity and standardizing some of the terminology; S88 batch and recipe management is one example. For continuous processes, a solution remains more elusive. There are “best practices” and naming conventions, but there has been no overarching philosophy to manage complexity and provide standard terminology. The work process for the automation of continuous processes is poorly defined, from the selection of opportunities all the way through implementation. Some organizations have applied good practices from general project management principles. However, beyond commissioning activities such as Factory Acceptance Testing (FAT), Site Acceptance Testing (SAT) and loop check, the controls community has little to no guidance. As a result, we see the costs of maintaining and upgrading the control system go up while business goals fail to be met.

Managing Controls Complexity

Procedural Automation seeks to provide a framework for managing the complexities of process automation in continuous processes. So far, the ISA 106 committee has produced a technical report on the model, which establishes much of the terminology; a technical report on work processes will be issued in the near future. (ProSys is a member of the ISA 106 committee.) Continuous process control evolved from single-loop controls, adding functionality as business needs demanded and as the control system provided the ability. Single-loop controls were simple single-input, single-output (SISO) controls. As we automate further, we have multiple inputs and multiple outputs in a large control strategy that may vary based on process objectives. With single-loop controls, a loop check with startup coverage was adequate to verify a control strategy and ensure proper operation. With large control schemes, the same cannot be said. Depending on the level of automation, there are multiple cross-linkages and both regulatory and discrete logic to deal with. The loop check is no longer adequate to reduce the risk to the business on startup.
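The contrast can be made concrete. A single-loop (SISO) strategy is simple enough that a loop check exercises essentially all of its behavior. The sketch below, a minimal hypothetical example (the function, gains and time constant are all assumptions, not from any real control system), simulates one PI loop driving a first-order process; this is roughly the entire behavior a loop check must cover, whereas a state-based scheme multiplies such loops and adds discrete cross-links between them.

```python
# Minimal sketch (hypothetical example): a single PI loop controlling a
# first-order process. A loop check covers this strategy almost completely;
# a multi-loop, state-based scheme has far more interactions to verify.

def simulate_siso_loop(sp=50.0, pv=20.0, kp=0.8, ki=0.1, dt=1.0, steps=200):
    """Simulate a PI controller driving a first-order process toward setpoint."""
    integral = 0.0
    tau = 10.0  # assumed process time constant
    for _ in range(steps):
        error = sp - pv
        integral += error * dt
        output = kp * error + ki * integral   # PI controller output
        pv += (output - pv) * dt / tau        # first-order process response
    return pv

final_pv = simulate_siso_loop()  # settles near the 50.0 setpoint
```

Verifying this loop is one test; verifying a scheme with dozens of interacting loops and discrete logic requires exercising combinations of states, which is exactly where the simple loop check breaks down.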

We are not the first industry to see or experience this problem. Software development began confronting the tension between complexity and value in the early 1980s. There was increasing demand for more functionality from the software products we used on our desktops and laptops, and speed of innovation and implementation was key to winning in the software world. The result was code developed at a rapid pace with growing complexity and user requirements. It soon became apparent in the marketplace that smaller, more agile companies had a decided advantage. One of the key factors was the way these companies made decisions and developed software: instead of a large project with major gates, they employed an iterative process with many smaller gates. These methodologies coalesced into numerous software development processes collectively referred to as Agile software development, the most popular of which is Scrum.

Agile software development makes the customer the highest priority, meeting requirements rather than just coding to a specification. Some of the core tenets of agile software development are:

  • harnessing change for competitive advantage,
  • delivering working software frequently,
  • working together as a team so that technical resources can develop effectively,
  • involving customers and decision makers in reviews, building only what is necessary to meet the requirements, and
  • allowing the team to adjust to meet requirements.

In procedural automation and modern controls, we can learn from these same principles.

Process is King

The fundamental basis for controls and automation is the physical process. We cannot bend the rules of physics or thermodynamics. On the other side of the equation, we have computers that perform exactly the tasks they are given, nothing more and nothing less (hopefully). These are two very rigid customers, but we also have operations and business goals: we need to provide an interface that operations can use effectively to achieve the desired business goal. To be effective, we have to bring all of our “customers” (process, control system, board operator and process engineer) to the table. The process engineer and board operator are easy, as we can schedule time and meet. Bringing the process and control system to the table takes a bit more, and a dynamic process simulation is the way to do it.

Many vendors offer a simulation package that can be loaded to simulate all of the functions within the control system. For the process, we need a virtual analog of our existing process: a dynamic process simulation. In developing and maintaining the process simulation, we need to apply the rule of necessary complexity, continually asking what is necessary to meet our business goals. This is where most process simulations go wrong; for many, the philosophy has been high fidelity or bust. When weighing this option, consider not only the cost of creation but also the cost of maintenance. Keep in mind, too, that the needs of automation testing differ from those of a traditional operator training simulator. We normally determine fidelity on a variable-by-variable basis, starting low and promoting to medium or high as necessary.
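One way to read “fidelity on a variable-by-variable basis” is sketched below. This is a hypothetical structure for illustration, not Mimic's actual API: each simulated variable carries its own fidelity level, starts as a simple first-order lag toward its driving signal, and is promoted to a more rigorous model only where testing demands it.

```python
# Hypothetical sketch of per-variable fidelity. Each simulated tag starts at
# "low" fidelity (a first-order lag) and can be promoted independently.
# Illustration only -- not an actual Mimic Simulation API.

class SimVariable:
    def __init__(self, name, fidelity="low", tau=5.0):
        self.name = name
        self.fidelity = fidelity   # "low", "medium", or "high"
        self.tau = tau             # time constant for the low-fidelity lag
        self.value = 0.0

    def step(self, driver, dt=1.0):
        if self.fidelity == "low":
            # Low fidelity: simple first-order lag toward the driving signal.
            self.value += (driver - self.value) * dt / self.tau
        else:
            # Medium/high fidelity would swap in a more rigorous model here.
            raise NotImplementedError("promote only where the tests require it")
        return self.value

    def promote(self, level):
        self.fidelity = level

# A low-fidelity flow response is often enough to exercise sequence logic.
flow = SimVariable("FIC-101.PV")
for _ in range(50):
    flow.step(driver=100.0)   # settles near the driving value
```

The design choice this illustrates is the cost argument in the paragraph above: most variables never need promotion, so both the build cost and the ongoing maintenance cost stay proportional to the business need rather than to the plant's full physics.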

Unit Operations objects are used to quickly develop accurate models for procedural automation testing (shown using Mimic Advanced Modeling Objects)

Work Processes

There are numerous work processes that can be employed based on the situation, varying from capital projects to continual improvement. In a capital project, the justification and design will traditionally follow a more gated (waterfall) method. However, if you do not have the organizational or personnel knowledge required for the level of automation on the project, a more agile work process will be required. If you do not have a proven toolkit, you will need to prototype control modules and interface elements early.

After justification and design, the controls engineer will work out the solution on the simulation system and test it against the process simulation. When that is complete, the process engineer and board operator will test the solution on the simulation system with the process simulation. This brings together all of the user requirements at one time.
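The test pass described above lends itself to automation. The sketch below is a hypothetical harness (not the Mimic Test Bench API; the function names and the stand-in process model are assumptions): it drives a scripted sequence of setpoint moves against a simulated process and records whether the response lands within tolerance at each step, so the same sequence can be rerun consistently after every change.

```python
# Hypothetical test-harness sketch: run a scripted sequence of setpoint moves
# against a simulated process and verify the response at each step. Tools such
# as Mimic Test Bench apply this pattern against the actual control system.

def run_test_sequence(step_model, steps):
    """Execute (setpoint, duration, tolerance) steps; return pass/fail per step."""
    results = []
    pv = 0.0
    for sp, duration, tol in steps:
        for _ in range(duration):
            pv = step_model(pv, sp)
        results.append((sp, abs(pv - sp) <= tol))
    return results

# Trivial stand-in model: first-order approach toward the setpoint.
def step_model(pv, sp, tau=5.0, dt=1.0):
    return pv + (sp - pv) * dt / tau

report = run_test_sequence(step_model, [(50.0, 60, 1.0), (80.0, 60, 1.0)])
```

Because the sequence is data rather than a manual checklist, the same test set can be replayed after every revision of the automation, which is where the consistency gain over manual FAT/SAT checking comes from.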

Automated Testing Tools (such as Mimic Test Bench) can reduce manpower costs and increase the consistency of testing of procedural automation

Similarly, for maintenance or continual improvement, the controls engineer can use the simulation system to work out the control scheme or automation. This speeds development and enables personnel to work more effectively. One advantage is that it encourages experimentation and learning. If the work is done on the live system, there is a tendency to implement before the solution is fully vetted; as a result, we can end up bloating the control system with unnecessary solutions or experiments.

Additionally, the cost of control system upgrades can be reduced by loading updates on the simulation system before loading them on the live control system. This allows for quick checkout and works out the installation requirements.

The argument against process simulations has been the cost to build and maintain them. There are a number of reasons why the game has changed. First, there is growing recognition that there are more options than high fidelity. Second, control system complexity has caused engineering costs to rise, and having a simulation system reduces risk and encourages action. Third, and most important, changes in control system technology and process simulators allow them to be implemented cost-effectively, without costly proprietary hardware.


Many times in plant operation, we can get into a rut, addressing only the day-to-day problems that keep the plant running. While it is obviously important to keep the plant running, we should also be mindful of the larger process and find value in improving our operating discipline to meet business goals. As an industry, we have many challenges in front of us. Recognizing business goals, and using frameworks such as procedural automation together with tools such as dynamic simulation, are among the best ways to meet business and safety objectives. We need to identify technology shifts that enable teams to work together; in the end, knowledge is key.

About the Author

Dustin Beebe, P.E. is the President and CEO of ProSys. Dustin guides the operations of the company by setting and communicating ProSys’ vision of maximizing the customer’s control system. He ensures that each department is committed to providing innovative services and software to solve our customers’ ever-increasing challenges. Dustin believes that continuously advancing technology is the key to providing value to customers.

In addition to his executive duties, Dustin continues to serve customers directly, applying his experience and expertise in alarm management and advanced control to help them solve complex unit operating problems.

Dustin holds a bachelor’s degree in Chemical Engineering from the University of Arkansas in Fayetteville, Arkansas. He earned his Professional Engineer designation in 2001.

To learn more about how to use dynamic simulation to deliver automation value with necessary complexity, contact ProSys to schedule a demo or a call.