Avoiding the hidden costs

Richard Mathewson of Taviz looks at some of the pitfalls for manufacturers contemplating enterprise application integration (EAI).

EAI has replaced ERP as the acronym of choice among many manufacturers, and the reason is simple.

The advent of e-business means that the fast, efficient flow of information is vital for success – a competitive edge in a homogenised world. Manufacturers must ensure that their systems can link to private trading exchanges and take advantage of new developments in supply chain optimisation. To do that, data must be available whenever and however it is needed.

Of course, integration means selecting the right mix of clients and servers, identifying which applications to buy and which to build, and making numerous other complex decisions.

Amid all these decisions, data integration is easily overlooked – and that can prove a costly mistake. Data integration, if done badly or as an afterthought, can add considerable expense and significantly delay the implementation of new systems. The fact that its initial costs are modest is no reason to ignore its implications. Whether you are converting to an enterprise environment or planning a new thin-client approach, avoiding the hidden costs of data integration requires the right tools and an effective plan.

There are four steps to consider. To begin with, factor data integration into the overall project plan, giving it the same priority as hardware and software decisions – anything less, and you can ignore that impressive-looking project timeline. Next, map each step in the integration process properly and appoint an integration team. With this initial planning done, you have already dodged most of the potential dangers.

But there is still work to be done. The integration team now needs to be properly supported, and given the tools necessary to do the job effectively.

Data integration should be made part of the overall enterprise infrastructure, and given the same quality assurance, continuing care and resources as any other component.

Ongoing quality assurance is vital, as many of the biggest pitfalls in data integration are the result of human error. One of the most common is underestimating the time involved – a failure to realise that complex data projects can take years, rather than months, to complete. The potential for input error is also high.

Using SQL programming, for example, means writing, debugging and rerunning code – creating another layer of data errors that are then difficult to detect. Inadequate source or destination documentation will also create delays, as the project team analyses and reverse engineers the database in an effort to find out what is going on – often while it wrestles to understand a complex new relational data model. If programmers don't fully understand the new application, how can you know your legacy data is going to the right place?
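
To see how easily such errors creep in, consider a minimal sketch of a hand-coded data load – every table and column name here is purely illustrative, not taken from any real system:

    -- Hypothetical migration: copy legacy customer records into the new schema.
    INSERT INTO new_app.customer (customer_id, name, credit_limit, region_code)
    SELECT c.cust_no,
           c.cust_name,        -- may be silently truncated if the new column is shorter
           c.credit_lim,
           r.region_code       -- NULL wherever the join finds no matching region
    FROM   legacy.cust c
    LEFT   JOIN legacy.region r ON c.region_id = r.region_id;

Depending on the database settings, an over-long name may be cut short without warning, and every legacy customer whose region cannot be matched is loaded with a NULL region code rather than rejected – precisely the kind of quiet error that only surfaces months after go-live.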