By Alan Williams
PC-based technology moving into ‘soft logic’ (PC open control) in the process industries is very much in the news. Today, implementations remain the domain of hardware PLCs, but as PC systems have gained ever more acceptance in the SCADA role, is it time to consider expanding their remit?
Fact is, SCADA vendors seem to see the lucrative PLC market as an important growth area. To this end many have been developing soft logic products which extend the capability of their offerings, ostensibly to replace PLCs. They could be successful; the extension of PC technology into applications and areas once served by bespoke hardware is well documented. But the question is: ‘Should PC soft logic replace PLCs?’
Back-track for a minute; the first serious PC-based SCADA packages appeared around 1986. They used DOS, and many had proprietary real time extension kernels. Early packages were conceived as control systems, some as scaled-down DCSs. Examples like Intec’s Paragon and Intellution’s Fix were developed for control using function blocks.
Into the early 90s PC SCADA systems improved with faster, cheaper hardware and GUI environments such as OS/2. But these advances were paralleled by PLCs becoming the preferred hardware run-time environment for control. So, in many areas PLC technology actually reduced PC SCADA’s role to that of super-MMIs. PC SCADA companies resisted this and continued to develop their systems, notably with real time operating environments, GUI function block editors and batch- and MES-orientated packages. But the fact remains that Windows graphical MMI packages are still their best sellers.
So is history repeating itself? Or have the soft logic suppliers really addressed the technology issues of PCs for hard, real time control? Areas of immediate contention include the real time operating system and program development environment.
With the dominance of Microsoft and Windows NT, the former is now clear. But NT alone does not provide the robustness needed for real time control, hence the extensions to NT. Wizdom (Intellution) uses the VenturCom RTX API, while Steeplechase uses Radisys iRMX. The most obvious drawback is non-standard environments, the opposite of the goal of mixing and matching best-of-breed applications.
As for the development environment, you can see the scope for argument over the companies’ different program editors! Function Block Diagram and Flow Chart packages will be pitched one against another, when the objective should be concurrent use through adopting, for example, the five languages of the IEC 1131-3 standard.
Actually, soft logic’s best contribution would be in providing a new object-based development environment, with the hope then of true open portability. Remember, the big investment of any controls project is engineering time and application code development, not the PLC hardware. But more of this later.
So what is the next step for soft logic? Will it in fact replace bespoke PLCs? I think not; as a PLC platform it has critical limitations. What can it replace? At the low microPLC end, cost and reliability of existing devices make PC solutions dead on arrival. At the other extreme are high end hybrid PLCs, DCSs and PCSs handling thousands of I/O with high levels of redundancy on networking, processors, PSUs, etc. Here, soft logic suppliers have already admitted that their technology may not be suitable.
This leaves the medium size modular rack PLC market, but which elements of the system? Networking cards? I/O? Industrialised rack? Processor? PSU? Consider reliability, fitness for purpose, etc; fact is, only the processor could be replaced! Yet rack-based PC/PLC hardware technology has been around for several years. Development is on-going, but demand is not strong and reliability is not near the high standards PLC users expect.
This point is critical; PC hardware and software producers aim their products at the mass market. So they’re engineered for ‘acceptable’ reliability: Windows NT and Microsoft Excel are not released bug-free. The larger the code, the more likely it is that bugs will surface during operation. NT has 7Mb of core code and resides in 150Mb of disk; it’s not unusual to see it lock up. And if NT does crash, resulting in the Blue Screen of Death, what do you do?
What about MTBF and MTTR? If you have ever had a network fault on your PC, what could be the MTTR? 1 hour; 1 day; or maybe you’re still waiting for your MIS department?! Basic analysis of failure modes, MTBFs and MTTRs must be addressed when plant is at stake. Soft logic suppliers might point to deterministic real time control, but unless the MTTR is also deterministic, system availability is questionable.
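To put rough numbers on the point, steady-state availability can be approximated as MTBF/(MTBF + MTTR). The sketch below is illustrative only; the MTBF and MTTR figures are assumptions chosen to show how a non-deterministic repair time dominates availability, not measured data for any product.

```python
# Illustrative availability arithmetic: availability = MTBF / (MTBF + MTTR).
# All figures are hypothetical assumptions for the sake of the example.

def availability(mtbf_hours, mttr_hours):
    """Fraction of time the system is up, given mean time between
    failures and mean time to repair (both in hours)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Same assumed MTBF (roughly one failure per year, ~8760 h),
# but two very different repair scenarios from the text.
fast_repair = availability(8760, 1)    # fault fixed within the hour
slow_repair = availability(8760, 24)   # a day waiting for the MIS department

print(f"MTTR 1 hour: {fast_repair:.5f}")
print(f"MTTR 1 day:  {slow_repair:.5f}")
```

The deterministic scan time the suppliers quote says nothing about the right-hand term; it is the MTTR that moves the result.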
Meanwhile, most PLC solutions require distributed control. In many cases, although the PLC processor may not require redundancy, the network is expected to offer it and real time determinism. Yet standard PC Ethernets are non-deterministic and do not offer protection against nodes causing network congestion. Quite simply, PC technology has not been targeted at industrial applications.
Then there’s fault tolerance. For many process control applications full redundancy with bumpless transfer is required. Best current PC solutions use Windows NT clustering, as implemented by DEC in its VAX/VMS business systems. With this approach a complete failure of a PC server would mean its clustered partner picks up the additional workload. However, this involves a switch-over time, typically of 1-4 minutes.
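A toy calculation makes clear why a multi-minute switch-over is not bumpless transfer: during the failover the loop is simply not being scanned. The failure rate and scan time below are assumptions for illustration; only the 1-4 minute switch-over range comes from the text.

```python
# Toy calculation of dead time during an NT-cluster failover.
# All rates are assumed figures; only the switch-over range is quoted.

failovers_per_year = 2            # assumed complete-failure events per year
switchover_seconds = 2.5 * 60     # midpoint of the quoted 1-4 minute range
scan_time_seconds = 0.05          # assumed 50 ms control scan

missed_scans = switchover_seconds / scan_time_seconds
print(f"Control scans missed per failover: {missed_scans:.0f}")
print(f"Uncontrolled seconds per year: "
      f"{failovers_per_year * switchover_seconds:.0f}")
```

For a process that demands bumpless transfer, even one such window of thousands of missed scans is a trip, regardless of how rarely it occurs.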
So far so bad! Meanwhile, soft logic is supposed to help in providing a more open environment, preventing the end-user being held to ransom by PLC suppliers. But the main issues here relate to the lack of portability of application code. If a programming standard were used (like the IEC 1131-3 set or C++) this would be solved. Incidentally, code portability problems apply to soft logic suppliers too, with their proprietary development systems!
As for the technology itself, soft logic’s proponents seem to imply that PLCs are somehow old tech. But remember, while PC technology continues apace, sacrificing reliability and robustness for the mass market, good PC developments, both hardware and software, are continually being back-flushed into PLCs.
And here, it’s technology tempered to meet the requirements: operating a boiler is different from producing a spreadsheet! PC-based control is applied in DCSs, but it’s integrated, with years of engineering effort applied to customise the hardware and software for robustness.
So, given the limitations of PC-based control, what benefits could it have? Fact is, the hardware replacement debate actually masks the contribution that soft logic is already making in automation. And it’s a fact that many PLC suppliers believe the adoption of the technology is essential for the traditional PLC market to evolve.
Fundamental issues need to be addressed in regard to reducing engineering development time, maintainability, portability, code reuse, etc. So the real contribution and impact of PC technology is in new object-based development tools (like those based on IEC 1131-3) and connectivity (OPC, etc). Object-based technology for development tools is well established in general software engineering for large, complex real time systems. But, while PLC systems have grown more complex, their tools have remained relatively primitive.
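The object-based idea can be sketched in a few lines: a PLC-style on-delay timer modelled as a class whose instances are evaluated once per scan. This is an illustrative structure only, not any vendor’s actual API or the IEC 1131-3 syntax; the point is that one tested block can be instantiated many times, which is the code reuse the text argues for.

```python
# Minimal sketch of an object-based, reusable function block:
# an on-delay timer (TON-style) evaluated once per control scan.
# Names and structure are hypothetical, for illustration only.

class OnDelayTimer:
    """Output goes true once the input has been true for preset_scans scans."""

    def __init__(self, preset_scans):
        self.preset_scans = preset_scans
        self.elapsed = 0
        self.output = False

    def scan(self, input_on):
        """Call once per scan cycle; returns the current output state."""
        if input_on:
            self.elapsed = min(self.elapsed + 1, self.preset_scans)
        else:
            self.elapsed = 0
        self.output = self.elapsed >= self.preset_scans
        return self.output

# The same block is instantiated per use, rather than re-coded each time.
motor_start_delay = OnDelayTimer(preset_scans=3)
for _ in range(5):
    motor_start_delay.scan(input_on=True)
print(motor_start_delay.output)  # True after 3 or more scans with input on
```

Because the block carries no platform dependencies, the same source could in principle execute on a PC or be translated for a PLC target, which is the portability argument in miniature.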
The bottom line is that as a hardened run-time environment, PLC hardware has been a success story. However, for PLC technology to grow it must adopt more advanced engineering tools for development including soft logic. The prizes include: portability, maintainability, reuse of code, common development environment, and so on.
The ensuing portability will allow developed code to be executed on PCs or PLCs, whichever is appropriate. Also, expect the use of OPC technology before the end of 1997, allowing for standard OPC servers interfacing to PLC hardware.
After the hype has died down, this will at last allow genuine ‘mix and match’ combinations of PLCs and soft logic.