Article - Issue 34, March 2008
gPROMS – pushing the barriers of process modelling
Costas Pantelides and Mark Matzopoulos
Schematic representation of a heat-integrated distillation column (HIDiC), capable of generating savings of 50% of distillation energy costs © Process Systems Enterprise Limited
The latest winner of the UK’s premier prize in engineering innovation, the MacRobert Award, is a spin-out company from Imperial College London. Costas Pantelides and Mark Matzopoulos, two of the key people behind the success of gPROMS, tell the story of how they have created a way in which engineers can harness physics, chemistry and engineering knowledge within a framework that solves highly complex mathematical problems.
In its ten-year life our company, Process Systems Enterprise, has achieved success as a result of developing and applying effective software that can model processes extremely accurately. So accurately, in fact, that eight of the world’s ten leading chemical companies have become users of the tool, named gPROMS (general-purpose PROcess Modelling System), to gain higher degrees of predictive accuracy.
Process industries, from traditional oil and gas, chemicals and petrochemicals, to new energy technologies, food processing and pharmaceuticals are experiencing increased demand for their products, especially in Asia. This is happening against a backdrop of decreasing availability of primary raw materials and global concerns relating to energy consumption, safety and environmental consequences.
A need to model
Accurate modelling provides the key information needed to design and operate processes in a way that maximises quality and minimises production costs and adverse environmental effects. It also saves raw material and energy, cuts pollution and makes production processes safer.
Such innovation, however, brings with it risks arising from uncertainties and gaps in available knowledge. To make rational and often rapid decisions, organisations need accurate quantification of the options open to them.
This is why many sectors of the process industries use process simulation software, typically based on the well-established steady-state flowsheeting approach, in which ‘black-box’ models of chemical unit operations are linked to one another by material, heat and information streams.
Calculating with software
Such models embed generic calculations at a fairly low level of detail, and are used mainly to simulate steady-state operation – the nominal operating state of a plant with fixed feedstock, temperature and pressure conditions.
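The flowsheeting idea can be sketched in miniature: each unit operation is a ‘black box’ that maps inlet streams to outlet streams, and the flowsheet simply links them together. The units, split fractions and flows below are all illustrative assumptions, not anything from gPROMS.

```python
# Toy steady-state flowsheet: feed -> reactor -> separator.
# Each unit is a 'black box' mapping inlet streams to outlet streams;
# streams are dictionaries of per-component molar flows (mol/s).
# All unit models and numbers are illustrative assumptions.

def reactor(feed, conversion=0.8):
    """Convert a fixed fraction of component A into component B."""
    reacted = feed["A"] * conversion
    return {"A": feed["A"] - reacted, "B": feed["B"] + reacted}

def separator(inlet, split_A=0.1, split_B=0.95):
    """Split the inlet into overhead and bottoms by per-component fractions."""
    overhead = {"A": inlet["A"] * split_A, "B": inlet["B"] * split_B}
    bottoms = {c: inlet[c] - overhead[c] for c in inlet}
    return overhead, bottoms

# Link the units by material streams, as a flowsheeting tool would
feed = {"A": 100.0, "B": 0.0}
reactor_out = reactor(feed)
product, bottoms = separator(reactor_out)

print(product)   # overhead stream, rich in B
```

Real flowsheeting packages solve such networks (including recycles) iteratively; the point here is only the structure – generic unit models connected by streams, with the physics hidden inside each box.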
Such simulation methods have become an indispensable part of the process engineer’s tool-kit, but they address only a small proportion of the actual, real-world operations of a manufacturing plant. Their main shortcoming is limited predictive ability, a result of the coarseness of the physical understanding they contain. Another drawback is their inability to incorporate new knowledge or to reflect exact process configurations. Their ability to generate competitive advantage is therefore severely restricted, as is their ability to support innovation.
We designed gPROMS to address these major shortcomings.
Since we began work, our team has developed the leading example of what are now known as advanced process modelling environments. The key element of this is the way it combines first-principle mathematical representation (in the form of equations describing the underlying physical and chemical phenomena of a process) with actual observed laboratory or plant data. This vital combination of theory and real-world information provides its breadth of use and powerful predictive ability.
First-principle models capture, in mathematical equation form, knowledge from many disciplines across an organisation: R&D chemists, reaction specialists, chemical and control engineers and materials specialists.
Inevitably these data are complex. This complexity increases when we need to model, for example, the spatial distribution of chemical composition along the length and across the radius of a reactor tube; or the particle size or molecular weight distributions in a crystallisation or polymerisation process. Often we might need to factor in the effects of hydrodynamics, such as the eddying that occurs around inlet nozzles or stirred-vessel impellers. There are also, of course, plants that need to be started or switched from one state of operation to another, all of which introduces the dynamic element of things changing over time.
As someone once said: “It’s not rocket science…it’s much more complex than that”!
Millions upon billions
The more fundamental detail like this that we include in a model, the closer that model comes to a true representation of what is happening inside the process at any time. For example, advanced process modelling provides unique insight into the design and operation of crystallisation processes – an area traditionally fraught with challenges, where until recently only limited simulation and modelling capabilities were available, particularly for predicting scale-up and the effects of large operational changes.
The problem is that models can quickly become vast in scale. A detailed predictive model of a polymerisation process, for instance, can run to millions of equations, all of which need to be solved simultaneously at every step of the process – within minutes rather than days.
This ability to solve gigantic mathematical problems quickly and robustly was a key element of our original research at Imperial College London that led to the creation of gPROMS.
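The idea of solving the model equations simultaneously at every time step can be shown in miniature. The sketch below (a deliberately tiny illustration, in no way resembling gPROMS’s actual solvers) takes implicit Euler steps on a single nonlinear equation, solving the implicit equation at each step with Newton’s method – exactly what a dynamic simulator does, but for one equation rather than millions:

```python
# One-equation illustration of implicit time-stepping: integrate
# dy/dt = -k*y**2 with implicit Euler. Each step requires solving the
# nonlinear equation  y - y_n + dt*k*y**2 = 0  for y, done here with
# Newton's method. A real process simulator solves thousands to millions
# of such coupled equations per step, using sparse linear algebra.

def implicit_euler_step(y_n, dt, k, tol=1e-12):
    """Solve y - y_n + dt*k*y**2 = 0 for y by Newton iteration."""
    y = y_n                          # initial guess: previous value
    for _ in range(50):
        residual = y - y_n + dt * k * y * y
        jacobian = 1.0 + 2.0 * dt * k * y   # d(residual)/dy
        y_next = y - residual / jacobian
        if abs(y_next - y) < tol:
            return y_next
        y = y_next
    return y

y, dt, k = 1.0, 0.1, 2.0
for _ in range(10):                  # integrate from t = 0 to t = 1
    y = implicit_euler_step(y, dt, k)
print(y)                             # close to the exact value 1/(1 + k*t)
```

Implicit methods like this pay the cost of a nonlinear solve per step in exchange for stability on stiff problems – the trade-off that makes fast, robust large-scale solvers so valuable.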
Even fundamental relationships contain values – heat transfer coefficients and reaction kinetic constants, for example – that can be obtained only from experience or experiment. These values, typically termed model parameters, usually have to be determined from real-world data (laboratory experiments, or pilot-plant or operating data) in a process known as model validation.
One of gPROMS’ unique features is that it can estimate multiple parameters involving tens or even hundreds of thousands of nonlinear equations using data from dozens of experiments. Our estimation techniques even produce estimates of error behaviour of the measurement instruments, while giving quantitative measures of the degree of confidence associated with each parameter value.
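A minimal sketch of the idea, nothing like gPROMS’s large-scale estimation machinery: fitting a single rate constant to noisy concentration measurements, and attaching a confidence measure (a standard error) to the estimate. The data and the first-order model are illustrative assumptions.

```python
import math

# Toy model validation: estimate the rate constant k of a first-order
# reaction C(t) = C0*exp(-k*t) from noisy measurements, with a standard
# error as a confidence measure. We linearise: ln(C/C0) = -k*t, then fit
# the slope through the origin by least squares.

t = [1.0, 2.0, 3.0, 4.0, 5.0]           # sample times, h
C = [0.74, 0.55, 0.41, 0.30, 0.22]      # measured concentrations, C0 = 1.0
y = [math.log(c) for c in C]            # linearised measurements

# Least-squares slope through the origin: k_hat = -sum(t*y)/sum(t*t)
Stt = sum(ti * ti for ti in t)
Sty = sum(ti * yi for ti, yi in zip(t, y))
k_hat = -Sty / Stt

# Residual variance -> standard error of the estimate
resid = [yi + k_hat * ti for ti, yi in zip(t, y)]
s2 = sum(r * r for r in resid) / (len(t) - 1)
se_k = math.sqrt(s2 / Stt)

print(f"k = {k_hat:.3f} +/- {se_k:.3f} per hour")
```

The same principle – minimise the mismatch between model and data, then use the residuals to quantify confidence in each parameter – scales up to the nonlinear, multi-experiment, multi-parameter problems the article describes.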
One of many process sectors where these capabilities are paying dividends is the chemical industry, where vapour-liquid and liquid-liquid separation are fundamental. Distillation alone can account for around 40% of this industry’s energy use. In spite of this, the modelling of separation processes remains relatively unsophisticated and current simulation software caters for it poorly. Our tools, libraries and services overcome these limitations.
Experimental data plays a vital role in improving models. Less well understood, but equally crucial, is the fact that models can be used to improve the quality of experiments by enhancing the information content of the results produced by each experiment.
This use of model-based techniques in designing experiments is a major recent development. Unlike the usual statistically-based approach, model-based experiment design takes full advantage of information that is already available, in the form of the mathematical model, to design experiments that can yield the maximum possible information about the system being studied.
Model-based experiment design provides clear and practical guidance on key procedural aspects of the experiment, such as the optimal temperature profile to be followed for the duration of the experiment or the optimal initial charges and reactant addition profiles, plus the times at which measurements (for example taking samples for analysis) should be made.
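One kernel of model-based experiment design can be sketched very simply (this is an assumed illustration of the principle, not the gPROMS algorithm): given a prior guess of a parameter, choose the measurement time at which the model output is most sensitive to that parameter, since that measurement carries the most information about it.

```python
import math

# Toy model-based experiment design for the model C(t) = exp(-k*t):
# with a prior guess of k, pick the sampling time where the squared
# sensitivity (dC/dk)**2 is largest -- the most informative measurement.
# Model, prior and candidate grid are all illustrative assumptions.

k_guess = 0.5                                    # prior estimate of k

def sensitivity(t, k):
    """dC/dk for C(t) = exp(-k*t)."""
    return -t * math.exp(-k * t)

candidates = [0.5 * i for i in range(1, 21)]     # candidate times 0.5 .. 10.0
best_t = max(candidates, key=lambda t: sensitivity(t, k_guess) ** 2)
print(best_t)   # the maximum lies at t = 1/k_guess
```

Sampling very early (the concentration has barely moved) or very late (it has decayed to almost nothing) tells us little about k; the sensitivity criterion makes that intuition quantitative, and its multi-parameter generalisation underlies optimal temperature and dosing profiles as well as sampling schedules.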
Managing the risks
For engineers, the scope and capability of the new generation of modelling tools such as gPROMS represents a great step forward in the ability to innovate and to manage the risks associated with process plant design. These tools can build and solve extremely complex mathematical models, which means they can capture process behaviour in a truly predictive way.
Take the area of heat and mass transfer as an example. Falling film evaporators, condensers and reactors transfer heat and mass over a continuous transfer surface. Operation, which usually involves heat-sensitive foodstuffs or pharmaceutical materials, must stay within a very tightly-defined envelope to avoid spoiling the product or creating dry zones. gPROMS is the only tool capable of dealing easily with the complex interaction of hydrodynamics, heat and mass transfer and reaction kinetics.
For those of us behind the development of gPROMS, the key challenge has been to take what are, inevitably, complex concepts in modelling and mathematics and make them available to industry. Meeting this challenge by innovating and delivering an intuitive, easy-to-use modelling tool has proved crucial to our success.
gPROMS is now applied by major process organisations throughout the world to provide high-quality information for decision support in product and process innovation, design and operation. In addition, some 200 academic institutions worldwide use gPROMS for research and teaching, ensuring that it is constantly applied to the most challenging problems in new areas of research.
Biographies – Costas Pantelides and Mark Matzopoulos
Professor Costas Pantelides is Managing Director of PSE and Professor of Chemical Engineering at Imperial College London. He has been instrumental in numerous key developments in process and enterprise modelling technology over the last three decades.
Mark Matzopoulos is Marketing Director of PSE and responsible for PSE’s international operations. He is a Chemical Engineer with 25 years’ experience in the application of simulation and modelling to process industry applications.
The authors would like to thank John Hutchinson for his help in preparing this article.