I plucked this story from a growing and vibrant “inside the Emerson firewall” community. This community connects Emerson global sales, project, and application folks together, primarily to ask about references–“has this been integrated with that” sorts of exchanges. This particular question was about examples of distillation control.
Emerson’s Doug White responded with an example of a European refiner with the business objectives to maximize throughput, maximize the value of product recovery, maximize heat recovery, and maximize heater efficiency to improve overall energy efficiency. The fractionating distillation process involved fired heaters and atmospheric, stripping, and vacuum distillation towers.
The Emerson team, led by Project Manager Chibuike Ukeje-Eloagu, worked with the refinery’s engineering and operations staff to plan and execute this project. The project implementation plan was first to conduct a site survey to gather data on current performance and perform preliminary step testing to understand the process dynamics of this unit.
Next, the team would design functional, detailed, and acceptance test specifications for review, iteration, and acceptance by the refinery project staff. After this design phase was completed, the build phase would follow, in which step tests of the manipulated variables (MVs) and disturbance variables (DVs) would be run, the process models developed, and the advanced process control (APC) controllers built.
The final step would be to commission the controllers, train the engineering and operations staff, and conduct the site acceptance test per the test specifications. An important closing task was to benchmark the process’s performance, compare it against the original process data collected, and calculate the return on investment for this optimization project.
Model predictive control (MPC) embedded in the refinery’s DeltaV control system was employed because the process had strong interactions among its variables, which made single-loop and cascade control strategies difficult to implement and maintain over time. The process was also subject to a number of disturbances for which the models needed to account, and it took a long time to reach steady-state conditions. The solution was to create five APC controllers–one for each fired heater, one for the atmospheric tower, reflux drum, and stripping towers, and one for the vacuum tower.
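To make the idea concrete, here is a minimal numpy sketch of the step-response (dynamic matrix) form of MPC that such controllers commonly build on: step-response coefficients, gathered from step tests like those described above, form a dynamic matrix used to compute a plan of MV moves that drives the predicted output toward a setpoint. The process gain, time constant, horizons, and weights below are illustrative values, not numbers from this project.

```python
import numpy as np

# Illustrative first-order process identified from a step test:
# unit step in the MV gives gain K = 2.0, time constant ~5 samples.
N = 30
K, tau = 2.0, 5.0
step_resp = K * (1 - np.exp(-np.arange(1, N + 1) / tau))

P, M = 15, 4          # prediction and control horizons
lam = 0.1             # move-suppression (regularization) weight

# Dynamic matrix A (P x M): effect of move j on predicted output i
A = np.zeros((P, M))
for i in range(P):
    for j in range(M):
        if i >= j:
            A[i, j] = step_resp[i - j]

setpoint = 1.0
y_free = np.zeros(P)  # free response: output if no further moves are made

# Least-squares move plan with move suppression:
#   du = (A'A + lam*I)^-1 A' (r - y_free)
err = setpoint * np.ones(P) - y_free
du = np.linalg.solve(A.T @ A + lam * np.eye(M), A.T @ err)

# Receding horizon: only the first planned move is actually applied,
# then the calculation repeats at the next sample with fresh measurements.
print(du)
```

In practice each of the five controllers handles many MVs, DVs, and constrained outputs at once, but the receding-horizon structure is the same.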
Among the key constraints in the process were the product compositions of the gas, naphtha, kerosene, light diesel, diesel, atmospheric gas oil (AGO), low-vacuum gas oil (LVGO), and high-vacuum gas oil (HVGO) streams. Traditionally, these had been tracked with manual samples drawn and sent to the lab once per day.
Chibuike’s team developed regression-based inferential sensors, or virtual analyzers, to predict the product compositions in real time. One example was a virtual analyzer to predict the diesel pour point. These virtual analyzers perform inferential analysis using a regression based on product flow rates and distillation column temperatures. The predicted values are updated daily against the laboratory results to keep the virtual analyzers tuned and making accurate predictions. The model predictive controllers use these predicted values as constraint variables to keep the products within specification limits.
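As a rough illustration of how such a virtual analyzer can work, the sketch below fits a linear regression from a flow rate and a temperature to a pour-point target, then applies a daily bias update against the latest lab result. The variable names, units, and coefficients are hypothetical; the refinery’s actual regressors and model form are not described in the story.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical data: a product draw flow and a column
# temperature as inputs, lab-measured pour point (deg C) as target.
n = 200
flow = rng.uniform(80, 120, n)     # m3/h (illustrative range)
temp = rng.uniform(250, 290, n)    # deg C (illustrative range)
pour_point = 0.05 * flow + 0.12 * temp - 45 + rng.normal(0, 0.5, n)

# Fit the regression-based inferential model (ordinary least squares)
X = np.column_stack([flow, temp, np.ones(n)])  # last column = intercept
coefs, *_ = np.linalg.lstsq(X, pour_point, rcond=None)

def predict(flow, temp, bias=0.0):
    """Real-time pour-point prediction plus the current lab-based bias."""
    return coefs[0] * flow + coefs[1] * temp + coefs[2] + bias

# Daily bias update: shift predictions by the latest lab-vs-model error
lab_result = -6.0                  # today's lab pour point (illustrative)
bias = lab_result - predict(100.0, 270.0)

# The corrected analyzer now agrees with the lab at this operating point
print(predict(100.0, 270.0, bias))
```

The bias term is what keeps a simple regression honest between lab samples; the MPC then treats the biased prediction as a constraint variable, backing off throughput or cut points before a specification limit is violated.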
After installation, a post-project audit showed that throughput had increased to the point where the downstream units became the bottleneck. The quantifiable result was a payback within three months, achieved by producing more of the higher-value products while reducing product giveaway and improving heater efficiency. Non-quantified benefits included fewer operator actions needed to maintain steady-state operations and improved response to disturbances such as crude oil composition changes.
Over the past several years, the controllers and virtual analyzers have remained in continuous use. The refiner has an ongoing service agreement with Chibuike’s team for immediate help or tweaks to the models when needed. The models are robust and tolerant of a degree of inaccuracy; as long as no major process modifications are made, they have not required refitting to the process dynamics.
Update: I wanted to give a note of clarification that the neural networks initially used were replaced by regression-based inferential analyzers due to insufficient historical data in the historian to properly train the neural networks. I’ve updated the text in the original story above.
Chibuike shared with me that the in-country Emerson office provides the day-to-day ongoing support required to keep this optimization project successful.