One petrochemical company faced capacity limitations at the naphtha cracker furnace that makes ethylene and propylene. The furnace's unstable production rate and low overall output made it a serious bottleneck for a high‐margin segment of output. A furnace model‐building exercise based on an artificial neural network and genetic programming was carried out. All the temperature, pressure, flow, and composition data related to the furnace had been collected over 3 years of production, comprising 900 000 samples, each with 360 tags – almost 80 million data points. The analysis identified the critical process parameters and made it possible to build a data‐driven furnace model, which generated mathematical relations between furnace throughput and the input parameters that influence it. By running the model at different input conditions, the team developed a deeper understanding of furnace operations, and a test run of the furnace confirmed the model's findings. The plant engineers had long suspected that manipulating some of the levers identified in the model could improve productivity, but they had never had the mathematical tools or the data to confirm it. Based on this new advanced‐analytics understanding of its process, the company developed an automated, real‐time operator guidance platform that advises operators how to adjust a range of process parameters to get the best performance. The result was an output increase of 10 to 15%, which represented a net profit‐contribution increase of around USD 15 million a year. The company estimates that applying the same advanced‐analytics approach across all the manufacturing operations at the site could generate a USD 75 million annual profit gain (Holger Hürtgen, 2018).
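As a rough illustration of the kind of data‐driven throughput model described above, the sketch below fits a small one‐hidden‐layer neural network, trained with plain stochastic gradient descent, to synthetic "furnace" data relating throughput to two invented inputs. This is a minimal stand‐in for the company's actual ANN and genetic‐programming toolchain: the variable names, the underlying relationship, and all numbers are assumptions for illustration only.

```python
import math
import random

random.seed(0)

# Synthetic "furnace" data: throughput as an invented nonlinear function of
# two normalized inputs (a temperature and a feed flow), NOT real plant data.
def true_throughput(temp, flow):
    return 0.6 * temp + 0.3 * flow + 0.2 * math.sin(3.0 * temp)

data = [(random.random(), random.random()) for _ in range(200)]
targets = [true_throughput(t, f) for t, f in data]

# One-hidden-layer tanh network with H hidden units.
H = 8
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    hidden = [math.tanh(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j])
              for j in range(H)]
    return hidden, sum(w2[j] * hidden[j] for j in range(H)) + b2

def mse():
    return sum((forward(x)[1] - y) ** 2
               for x, y in zip(data, targets)) / len(data)

lr = 0.05
initial_loss = mse()
for _ in range(300):
    for x, y in zip(data, targets):
        hidden, pred = forward(x)
        err = pred - y
        # Backpropagate the squared error through both layers.
        for j in range(H):
            grad_h = err * w2[j] * (1 - hidden[j] ** 2)
            w2[j] -= lr * err * hidden[j]
            w1[j][0] -= lr * grad_h * x[0]
            w1[j][1] -= lr * grad_h * x[1]
            b1[j] -= lr * grad_h
        b2 -= lr * err

final_loss = mse()
print(f"MSE before: {initial_loss:.4f}, after: {final_loss:.4f}")
```

Once trained, such a model can be evaluated at many candidate input conditions to see which combinations push the predicted throughput highest, which is the essence of the operator‐guidance idea described above.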
Advanced analytics can also be used to build a data analytics platform that shows a company‐wide energy usage pattern and guides operators through various energy usage options to minimize consumption. AI‐based techniques are sometimes used for simpler questions, such as how to exploit favorable seasonal conditions (say, a lower cooling water temperature) to improve plant profitability.
2.4.3 Optimizing the Whole Production Process
Whereas predictive maintenance and yield, energy, and throughput analyses are designed to improve the efficiency and profitability of individual pieces of equipment, value‐maximization modeling covers the whole plant or site and helps to optimize the interaction between those pieces of equipment across processes. This optimization and modeling technique shows in real time how to maximize the rate of profit generation in complex production systems and supply chains, encompassing every step from purchasing to production to sales. Unlike human planners, this advanced‐analytics approach can solve a complex maze of as many as 10 000 variables and one million constraints, helping producers figure out what to buy and when, what to make and how much, and how to make it so as to generate maximum profit in each period (Wang, 1999).
Uncertainty in the pricing and demand of turbulent chemical markets poses a complex business challenge that must be solved every day: what to buy, what to sell, and how much to produce. The uncertain and frequently changing nature of chemical companies' businesses and product lines means they must be capable of solving a complex objective function spanning volatile costs and prices; multiple plants; products that can be made in various ways from diverse combinations of materials, yielding co‐products of varying value; maximizing the output of high‐value products; and managing by‐product flows.
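A minimal sketch of the "what to make and how much" decision can be given with a toy plant of two hypothetical products sharing a feedstock budget and a reactor‐hour budget. Real value‐maximization models have thousands of variables and call a dedicated solver; here the two‐variable profit maximization is small enough to solve by enumerating the feasible frontier, and every number is invented.

```python
# Toy production-planning sketch: choose how much of products A and B to
# make from a shared feedstock to maximize profit. All numbers are invented.
feedstock_available = 100.0   # tonnes of shared raw material
reactor_hours = 90.0          # hours available on the shared unit

products = {
    # name: (feedstock t per t of product, reactor h per t, margin per t)
    "A": (1.2, 1.0, 300.0),
    "B": (0.8, 1.5, 400.0),
}

best = (0.0, 0.0, 0.0)  # (profit, tonnes of A, tonnes of B)
step = 0.5
a = 0.0
while a * products["A"][0] <= feedstock_available:
    if a * products["A"][1] <= reactor_hours:
        # Max B allowed by each constraint, given this amount of A.
        b_feed = (feedstock_available - a * products["A"][0]) / products["B"][0]
        b_hours = (reactor_hours - a * products["A"][1]) / products["B"][1]
        b = max(0.0, min(b_feed, b_hours))
        profit = a * products["A"][2] + b * products["B"][2]
        if profit > best[0]:
            best = (profit, a, b)
    a += step

profit, tons_a, tons_b = best
print(f"Make {tons_a:.1f} t of A and {tons_b:.1f} t of B "
      f"for profit {profit:.0f}")
```

With these invented numbers the optimum sits where both constraints bind (78 t of A and 8 t of B), which a grid over A finds exactly; at industrial scale the same structure is handed to a linear or mixed‐integer programming solver instead.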
The following example from one large, diversified, integrated refinery and petrochemical complex shows the kinds of gains to be captured. The company was selling a broad range of petrochemicals and specialty chemicals from the site to a global marketplace through a mixture of spot and long‐term contracts. Meanwhile, it was buying its raw material, i.e. crude oil, from various countries at varying quality and price.
As a multinational with a presence in many countries, the company left purchase, sales, and production decisions to local offices, and pricing was set arbitrarily by different regions and departments. Organizational responsibilities were scattered across multiple business units and corporate functions. Underlying all this was the typical chemical‐industry challenge of commodity products underpinning specialties production, with the commodity output bringing lower‐value co‐products, multiplying the hurdles to maximizing profitability. In the absence of a global optimization model, the company lost a lot of money to non‐optimal decisions taken locally. A mixed‐integer programming model was built encompassing 900 variables, including nonlinear cost curves, and 4000 constraints covering production capacities, transportation, and contracts; the hundreds of production steps with alternative routes and feedback loops; nonlinear price curves and raw‐materials cost structures; and intermediate inventories (Wang, 1999).
Using the model, the team solved a global optimization problem and was able to increase profits by USD 20 million a year (Wang, 1999). For example, the company started making an intermediate product on an underused line instead of buying it from a third party. At the same time, the team optimized process parameters of a furnace, various distillation columns, an absorber, etc., which gave higher yields and thereby reduced raw‐material consumption. It identified spare capacity in some of its plants that allowed throughput to be increased, and it raised sales revenues by expanding capacity for some product categories. It also maximized production of the products that fetched higher profit margins.
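The make‐versus‐buy decision mentioned above is, at its core, a mixed‐integer choice: a binary "run the underused line" variable that brings a fixed startup cost. The toy sketch below, with entirely invented numbers, enumerates that binary choice directly rather than calling a MIP solver, which is feasible here because there is only one integer variable.

```python
# Make-vs-buy as a one-variable mixed-integer decision: run the spare line
# (paying a fixed startup cost) or keep buying from the third party.
# All numbers are invented for illustration.
demand = 500.0          # tonnes of the intermediate needed per month
buy_price = 120.0       # cost per tonne from the third party
variable_cost = 70.0    # cost per tonne if made in-house
startup_cost = 15000.0  # fixed cost of running the underused line
line_capacity = 400.0   # tonnes per month the line can make

best_cost, best_plan = float("inf"), None
for run_line in (0, 1):               # the binary (integer) decision
    made = min(demand, line_capacity) if run_line else 0.0
    bought = demand - made
    cost = (run_line * startup_cost
            + made * variable_cost
            + bought * buy_price)
    if cost < best_cost:
        best_cost, best_plan = cost, (run_line, made, bought)

run_line, made, bought = best_plan
print(f"Run line: {bool(run_line)}, make {made:.0f} t, "
      f"buy {bought:.0f} t, total cost {best_cost:.0f}")
```

With these numbers, making 400 t in‐house and buying only the remaining 100 t is cheaper than buying everything, despite the startup cost; a full plant model strings hundreds of such discrete choices together with the continuous flows, which is why a MIP solver is needed in practice.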
The analytics approach also revealed some counterintuitive improvements. The model suggested that eliminating the production of a particular polymer grade would increase overall profitability. The company had long been selling this lower‐grade polymer to a local customer, but it generated limited returns while incurring high logistics costs. By shifting the raw material, ethylene, from this polymer to another value‐added product, the company was able to make more profit. That switch might never have been suggested if the decision had been left to the manager of the polymer business, who previously held the decision rights.
These changes enabled the chemical company to boost its earnings before interest and taxes by more than 50%.
2.5 Achieving Business Impact with Data
For the last two decades, chemical companies have been generating, collecting, and storing huge amounts of operations and maintenance data in various software systems. These data are like a gold mine, and now is the best time to achieve an impact with (your) data: more and more data are available, computing power is ever increasing, and mathematical techniques and so‐called data science are becoming more and more advanced. Yet while data is considered the "new oil" or the "new gold" these days, several technology‐ and business‐related challenges prevent chemical companies from realizing the potential impact data can make on their business (Holger Hürtgen, 2018).
2.5.1 Data's Exponentially Growing Importance in Value Creation
The following facts regarding data have changed the business outlook in recent times:
Rapid increase in data volume: The number of sensors delivered globally increased sevenfold, from 4 billion in 2012 to more than 30 billion in 2016 (McKinsey). Data has not only grown exponentially in volume but has also gained tremendous richness and diversity. In the chemical industry, data is generated not only by flow, temperature, and pressure transmitters but also by devices ranging from cameras and analyzers to vibration monitors, enabling richer insights into process behavior.
Falling IoT sensor price: There was a 50% reduction in IoT sensor price between 2015 and 2020.
Cheap computational power: Better processors and graphics processing units, increased investment in massive computing clusters (often accessed as cloud services), and improvements in storage and memory have greatly increased the computational power available in recent years.