Refining operations

With remote monitoring and control via the web becoming more common, it could be the way forward for chemical, pharmaceutical and other process industries to keep tabs on operations. Stuart Nathan reports.

It’s almost impossible to overstate the impact of the internet on the way we manage and process information, and on how this affects our daily lives.

It has taken over from the library as the first port of call for researchers; it has supplemented and sometimes superseded the letter and the telephone call. It is even becoming the main way we shop. But industry, particularly the process sector, has been slightly slower to appreciate its full potential.

Remote monitoring and control is becoming increasingly common in the process sectors. So could it be the way forward for chemical, pharmaceutical and other process industries to keep tabs on their operations – or even control them?

The advantages are clear – with conventional systems, plant operators and engineers can only see the output of their instruments and alter process conditions while they are in the plant control room, or out on the site. A remote function would allow them to connect via a web browser to see the status of the plant. In this way senior process engineers could troubleshoot and diagnose faults from miles away.

It is very easy to integrate a web server into a piece of equipment. In many cases they are already in place. Distributed control systems (DCSs), most often used in major petrochemical plants and refineries, split the control function over several computers, each running a different part of the plant. To ensure all the computers communicate with one another, they increasingly use intranets running TCP/IP, the communications protocol used by the internet. With this in place, it’s not difficult to extend the system to supply data to an outside computer.
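
To make the idea concrete, the sketch below shows, in outline only, how a controller node on a plant intranet might expose its latest readings to an outside computer over HTTP. The tag names, port and values are hypothetical, and a real installation would sit behind the security measures discussed later in this article.

```python
# A minimal, read-only sketch: a controller node on a TCP/IP intranet serving
# its latest process readings to an outside computer over HTTP.
# The tag names and values below are hypothetical placeholders.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

LATEST_READINGS = {"reactor_temp_C": 182.4, "feed_pressure_bar": 7.1}  # hypothetical tags

class ReadingsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the current snapshot of process values as JSON.
        body = json.dumps(LATEST_READINGS).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # In practice this would sit behind the plant firewall and access controls.
    HTTPServer(("0.0.0.0", 8080), ReadingsHandler).serve_forever()
```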

Internet monitoring takes the established technique of DCSs one stage further. Rather than sending data over the IT network of a single process plant, it uses the web to relay information to operators who are miles from the plant; to the manufacturers of the various pieces of equipment, to assist in maintenance or troubleshooting; or to external process optimisation consultants.

But it needn’t stop with large-scale systems. The smaller controllers used for low-capacity continuous and batch-scale processes, programmable logic controllers (PLCs) and even individual process instruments, such as pressure and temperature transmitters, now incorporate more computing power than the systems that guided the Apollo moon landings. Adding a web server is easy; the problem is how to make good use of it.

Using the internet to allow vendors to help customers is an established trend. Often, instrumentation manufacturers use a web link to receive data from their products, which can then be used for troubleshooting or reconfiguring them. This type of service has advantages for both parties: it cuts suppliers’ costs and speeds up customer service. And as the information network used to transmit and receive this data is the internet – stable, established, cheap – using familiar, friendly web browsers as the user interface, it seems that everyone wins.

One of the pioneers in this field was process control specialist Foxboro, part of the Invensys group. Its Remote Plus monitoring system harnesses the main advantage of web-based monitoring – it uses existing technology and equipment to provide customer support.

The company runs Remote Plus from three sites, in North America, Europe and the Far East. As the service involves sending sensitive process details to a third party, much of its emphasis is on security – the TCP/IP protocol is backed up by secured routers and firewalls, and customers must authorise all transmissions. The data is also encrypted during transmission.

The next step is to extend this philosophy to other types of equipment, and to the process as a whole. Emerson, for example, launched its internet-based monitoring system, AMS Suite, last October. The system, already in use on high-profile operations such as BP’s Coryton refinery and the Grangemouth petrochemicals complex, uses internet protocols to bring together plant information to co-ordinate maintenance and improve plant performance.

Maintenance co-ordination is a major advantage of remote monitoring. It allows operators to consolidate the data produced by their plant equipment, compare it with the information the equipment would produce under ideal conditions, and make that data available wherever authorised users have access to the internet.

The AMS system collects and submits data directly to Emerson, where it is validated for consistency and accuracy. It is then compared with a design model so that operators can check the performance of their systems against their original design specifications, and work out how much any deviation from this performance is costing them.
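
The comparison step can be pictured with a small, hedged sketch: measured performance is checked against a design figure and the shortfall is converted into an estimated annual cost. The function name, the efficiencies and the cost factor below are illustrative assumptions, not Emerson’s actual method.

```python
# Hedged sketch of the comparison described above: measured equipment
# performance is checked against a design model and the shortfall turned
# into an estimated cost. All figures and the cost factor are hypothetical.

def efficiency_shortfall_cost(measured_efficiency: float,
                              design_efficiency: float,
                              cost_per_point_per_year: float) -> float:
    """Estimate the annual cost of running below design efficiency."""
    shortfall = max(0.0, design_efficiency - measured_efficiency)
    return shortfall * cost_per_point_per_year

# Example: a compressor running at 78% against a design figure of 84%,
# with each lost percentage point assumed to cost $50,000 a year.
print(efficiency_shortfall_cost(78.0, 84.0, 50_000))  # -> 300000.0
```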

At Coryton, for example, BP uses the AMS ‘e-fficiency’ system to monitor the performance of compressors and turbines on the refinery’s lube leg unit, which produces lubricating oils. A secure internet server relays equipment performance data back to plant operators so they can spot degradation and prevent breakdowns before they happen.

BP anticipates it can save over $3m (£1.6m) a year by identifying poor equipment performance and fixing it before it affects production, and a further $1.5m from optimising maintenance scheduling, which reduces downtime. ‘We’ll be able to keep equipment performing at design levels and improve its efficiency and reliability,’ said BP mechanical reliability engineer Jerry Whittaker.

Another oil company, Amerada Hess, uses the system to oversee its Triton floating production storage and offloading (FPSO) operations in the North Sea. ‘We need to maximise production from the fields, and that calls for us to know how the equipment is performing within the critical processes,’ said David Stewart, Triton support services manager for Wood Group Engineering, which provides support services for Amerada Hess.

‘We see e-fficiency as a means of gathering the pertinent information via the web and working with on-site operators and engineers to manage the equipment performance.’

Smaller companies also offer this sort of service. For example, iSagacity in the US uses pattern recognition software to spot deviations from normal process operations. Like many internet monitoring consultants, it specialises in certain process operations, offering specific services to monitor chillers, boilers and cooling towers.

Once again, there is a logical next step. Receiving system information over the internet is a simple matter and, in computing terms, sending instructions is also easy. But using that data to control a process is far from easy, and is currently attracting the attention of academic researchers.

Internet control could be a valuable tool in many industries. It could allow processes located in remote areas to be kept under surveillance and under control. Experts could also supervise difficult processes, or keep an eye on less experienced operators without being on site. In short, it would free plant operators from the control room, and allow plants to be operated from anywhere there is an internet connection.

In a recent project, computer scientists and chemical engineers from Loughborough University joined forces to tackle time lag and security.

Researcher Shuang-Hua Yang said that time delay is a critical factor for control over the internet. ‘In contrast to a typical DCS, where the system load has been designed from inception, web-based process control systems have a variable working load,’ he said. This means that the time between sending and receiving a signal is not determined by the control system itself. Several other factors have to be taken into account – and they are not predictable.

There are four separate stages in the time lag between an operator deciding to send a signal and being told it has been received, said Yang: the time the operator actually takes to hit the required buttons; the time it takes to transmit the signal from the remote operator to the local web server; the execution time of the server in performing the action; and the delay in relaying the result back to the operator. The first and third of these are effectively fixed by the reaction time of the operator and the processing speed of the control system.

But the two internet legs depend on the number of nodes the signal has to travel through; the processing speed of these nodes; the connection bandwidth; the length of the link; the amount of traffic on the network; and several other factors that can vary from second to second, even on a fixed system. So the total time lag is impossible to model. What’s needed, said Yang, is a control architecture which is insensitive to time delay.
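
A rough sketch of Yang’s four-stage breakdown makes the point: with the operator and execution times fixed and the two internet legs varying, the total lag is different on every round trip. The timings used here are purely illustrative, not measured values.

```python
# Sketch of the four-stage delay described above. The operator and execution
# times are treated as fixed; the two internet legs vary with routing, traffic
# and bandwidth, so the round-trip lag changes from one command to the next.
# All timings are illustrative assumptions.
import random

def round_trip_lag_s(operator_s: float = 1.5, execution_s: float = 0.2) -> float:
    uplink_s = random.uniform(0.05, 2.0)    # remote operator -> local web server
    downlink_s = random.uniform(0.05, 2.0)  # result relayed back to the operator
    return operator_s + uplink_s + execution_s + downlink_s

samples = [round_trip_lag_s() for _ in range(1000)]
print(f"min {min(samples):.2f}s  max {max(samples):.2f}s")  # wide, unpredictable spread
```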

Yang’s project focused on producing a systematic way to design internet-based control systems. Working with both computer simulations and a teaching rig consisting of three linked water tanks, Yang and his team believe they have designed a system that can control and monitor both batch and continuous processes.

The first step Yang took was to ‘decompose’ the system requirements into three levels – one for plant-wide optimisation; one for supervising the process; and one for the regulatory level. Each level was then divided further into tasks which can be achieved over the internet, and those which can’t.
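
The decomposition might be pictured, very loosely, as a mapping from each level to the tasks that can safely travel over the internet and those that must stay with the local control system. The task names below are hypothetical illustrations, not Yang’s actual task list.

```python
# Purely illustrative sketch of the three-level decomposition: each level is
# split into tasks that could be handled over the internet and tasks that
# must remain local. Task names are hypothetical.
CONTROL_HIERARCHY = {
    "plant-wide optimisation": {
        "internet": ["update production targets", "schedule optimisation runs"],
        "local": [],
    },
    "supervision": {
        "internet": ["send new setpoints", "monitor alarms"],
        "local": ["alarm shutdown logic"],
    },
    "regulatory": {
        "internet": ["adjust PID tuning parameters"],
        "local": ["execute PID loops", "safety interlocks"],
    },
}
```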

One way to tackle the time-delay problem is a system Yang calls virtual supervision parameter control (VSPC). This delegates detailed control functions to the local control system, but allows certain data – such as new setpoints for process parameters and tuning parameters for proportional-integral-derivative (PID) control, a technique that adjusts a process according to the size, duration and rate of change of its deviation from the setpoint – to be sent to the system over the web. Once this data has been received it is locked in and used until the next set arrives.
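
A minimal sketch of that arrangement, under the assumptions stated above, might look like the following: the PID loop itself runs locally, while remotely supplied setpoints and tuning values are locked in until the next update arrives. The class and method names are hypothetical.

```python
# Minimal sketch of the VSPC idea: the regulatory loop runs locally, so it is
# unaffected by internet delay, while setpoints and PID tuning arrive over the
# web and are held until the next update. Names are hypothetical.

class LocalPIDLoop:
    def __init__(self, setpoint, kp, ki, kd):
        self.setpoint, self.kp, self.ki, self.kd = setpoint, kp, ki, kd
        self._integral = 0.0
        self._last_error = 0.0

    def apply_remote_update(self, setpoint, kp, ki, kd):
        # Called when a new parameter set arrives over the internet link;
        # the values are then used unchanged until the next update.
        self.setpoint, self.kp, self.ki, self.kd = setpoint, kp, ki, kd

    def step(self, measurement, dt):
        # Detailed control stays local and runs every cycle regardless of
        # what is happening on the network.
        error = self.setpoint - measurement
        self._integral += error * dt
        derivative = (error - self._last_error) / dt
        self._last_error = error
        return self.kp * error + self.ki * self._integral + self.kd * derivative
```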

VSPC gets around the problem of concurrent access by assigning priorities to users. High-priority operators can overwrite data from lower-priority ones, but not vice versa. After a new command is accepted, the system is blocked for a certain time, during which it will not accept inputs from any user with an equal or lower priority than the last one, ensuring the command has been fully executed before any new data can be sent. Yang described VSPC as a ‘better safe than sorry’ system – if anything goes wrong, local operators on site can set the system to ignore any commands from outside.
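
The priority scheme can be sketched as a simple gate in front of the local control system: a higher-priority command can displace a lower-priority one, and after any accepted command the gate refuses equal-or-lower-priority input for a hold-off period. The hold-off time, the convention that larger numbers mean higher priority, and the class name are all assumptions made for illustration.

```python
# Sketch of the priority-based command gate described above. Larger numbers
# are assumed to mean higher priority; the hold-off period is hypothetical.
import time

class CommandGate:
    def __init__(self, holdoff_s: float = 30.0):
        self.holdoff_s = holdoff_s
        self._last_priority = None
        self._accepted_at = 0.0

    def try_accept(self, priority: int, command) -> bool:
        now = time.monotonic()
        blocked = (self._last_priority is not None
                   and now - self._accepted_at < self.holdoff_s
                   and priority <= self._last_priority)
        if blocked:
            return False  # rejected: equal or lower priority during the hold-off
        self._last_priority = priority
        self._accepted_at = now
        # ...pass `command` on to the local control system here...
        return True
```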

Safety is another consideration. The internet is vulnerable to attack by hackers; moreover, the constraints of the internet itself can lead to failures caused, unwittingly, by authorised users. Yang’s team has designed a framework to analyse and react to these security hazards, with various levels indicating what actions a hacker might take, how the process control system might react to them, and what effects those actions might have.

For example, if the firewall detects an intrusion and blocks it, there is no effect on plant operation. If a hacker gains access to the controller, the system can detect this, disconnect the external network, and continue operating normally without accepting internet controls. Even if the hacker manages to alter process conditions, the system’s inbuilt safety features can detect the abnormality and return the plant to a safe operating state.

Yang sees the future for his system in controlling processes where various parts are geographically remote, such as small-scale hydroelectric power stations, wind farms and food storage warehouses, where temperature and humidity control is crucial. ‘We are applying to the Chinese government and the Royal Society for funding to implement our research results in China’s Zhejiang province for its thousand-plus small-scale hydroelectric power stations,’ he said.

Internet control of less-critical processes is now a reality. Pfizer, for example, uses a Plexus system supplied by MAC Solutions for monitoring and managing its building control systems at its R&D and process development site in Sandwich, Kent.

Although the prospect of chemical engineers controlling a distillation column by laptop from home may be some years away, as internet use becomes ever more widespread and the speed of data processing increases, the barriers to its application are tumbling – fast.