Life after LabVIEW

Tim Dehne, vice president of engineering at National Instruments, discusses the past, present, and future of technology with Roger Brownlie.

Tim Dehne should know a few things about engineering. He received his BS in Electrical Engineering from Rice University in Houston, Texas, and then joined National Instruments, where he is now VP of engineering. Roger Brownlie, editor of Design Engineering, sat down with him for a Q&A session to find out what he could.

RB: What are the main changes you have seen in the remote measurement and automation industry?

TD: The internet has made access to data and distributed control possible for everyone in the organisation. The roles of design engineers around the world have become linked. For example, we have R&D centres in China, Germany, and India, as well as the majority of our R&D in Austin, Texas. Without the communications infrastructure of the internet it wouldn’t really be feasible to have those groups working on similar projects, but now they can work as part of a global team. From our customers’ standpoint, we have to find ways to make that transparent and high-performance, so that they can decide where they want the user interface, where they want to do the number crunching, and where they want to do the measurement and acquisition, and have it all tied together seamlessly. Looking through their browser, they see everything they want to see.

RB: What have been the main drivers of the industry?

TD: Being a measurement and automation company, we benefit from the whole communications revolution in the general marketplace – mobiles, fibre optics, pagers, set-top boxes, cable modems – because the size of that worldwide market provides an investment stream to improve technology. Vendors such as Analog Devices and Linear Technology have to produce the A-D converters, D-A converters, and power supplies for those devices, and we can leverage all of that into our measurement products.

RB: What are the limiting factors to technical improvement?

TD: One technology that is very exciting for us is the FPGA (field-programmable gate array), because of how dense and how low cost they’ve become. FPGAs provide a high-speed mechanism for reconfigurable computing: you reconfigure your measurement set-up on the fly by reprogramming the hardware to do acquisition or analysis functions.
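
To picture what that reconfiguration looks like from the host side, consider this minimal C++ sketch. Everything in it – the slot number, the bitstream file names, the loadBitstream call – is a hypothetical stand-in for illustration, not an actual NI interface:

    #include <cstdio>

    enum class Personality { Acquisition, OnBoardAnalysis };

    // In a real system this would download a compiled FPGA image over
    // the PCI bus; here it only reports what would happen.
    bool loadBitstream(int slot, const char* file) {
        std::printf("slot %d: programming FPGA with %s\n", slot, file);
        return true;
    }

    // Reconfigure on the fly: same card, different measurement function.
    void configure(int slot, Personality p) {
        if (p == Personality::Acquisition)
            loadBitstream(slot, "acquire_raw.bit");     // stream raw samples
        else
            loadBitstream(slot, "filter_and_rms.bit");  // analyse in the gates
    }

    int main() {
        configure(2, Personality::Acquisition);      // first capture a waveform
        configure(2, Personality::OnBoardAnalysis);  // then crunch it in hardware
        return 0;
    }

The same card, with no hardware change, switches from streaming raw samples to doing the analysis in its gates.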

The FPGA is an enabler, and I don’t want to downplay that: the faster they can go, the more gates they offer, and the lower their cost and power, the more we would like to do with them. So the FPGA is both an enabler and a boundary that we operate within.

The alternative that the traditional measurement companies have always gone to is their own ASICs (application-specific integrated circuits) and their own design and development. That gets you to a certain point on the performance curve, but it comes at the price of time, inflexibility, and cost. Our model is based on leveraging commercial technology like FPGAs, the internet, the PC, the PCI bus, and even Pentium processors, which can outperform DSP processors today.

So we use things that have a much quicker development cycle than traditional box instruments. Dell or Nokia can’t afford to miss a window or they’re out of the game, so they move at an inherently faster pace than the proprietary design and development cycles of measurement boxes.

As a result, there are some measurement frequencies that we can’t pick up. In those cases our strategy is to be able to hook up to those boxes with some proprietary designs, and the thing that pulls it all together is our software.

RB: How much of your intellectual property is in software and how much is in hardware?

TD: We were a pioneer in software patents, and a substantial amount of our intellectual property today is in our software. But we have, over the years, developed a dozen or more ASICs to handle mostly the digital processing and computer interfacing that is required to get performance at a lower cost; we left the analogue and digital front ends to off-the-shelf components. You can’t really distinguish between the hardware and the software – the two have to work hand in hand. That tight integration of hardware and software is what makes a user successful: it lowers cost and it lowers time to market.

That is as opposed to taking C++, starting there, and building it all yourself. You can do that with our products, or you can do it on your own, but things like LabVIEW allow you to do it a lot faster.
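
For a sense of what “starting there and building it yourself” involves, here is a minimal C++ sketch of a buffered read against an invented C-style driver API. The daqOpen, daqConfigure, daqRead, and daqClose calls are illustrative stand-ins, stubbed so the sketch runs on its own – they are not NI’s actual driver functions:

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Invented stand-ins for a vendor's C driver API, stubbed with
    // fake data so the sketch compiles and runs by itself.
    int  daqOpen(const char*)           { return 1; }  // handle, or -1 on error
    int  daqConfigure(int, int, double) { return 0; }  // 0 == success
    int  daqRead(int, double* buf, int n) {
        for (int i = 0; i < n; ++i) buf[i] = std::sin(0.01 * i);  // fake waveform
        return 0;
    }
    void daqClose(int) {}

    int main() {
        // Every step (open, configure, read, check errors, clean up)
        // is the developer's problem when starting from bare C++.
        int h = daqOpen("Dev1");
        if (h < 0) return 1;
        if (daqConfigure(h, /*channel=*/0, /*rateHz=*/10000.0) != 0) {
            daqClose(h);
            return 1;
        }
        std::vector<double> buf(1024);
        if (daqRead(h, buf.data(), static_cast<int>(buf.size())) == 0)
            std::printf("first sample: %f V\n", buf[0]);
        daqClose(h);
        return 0;
    }

Tools like LabVIEW wrap this whole open/configure/read/close ceremony in a graphical environment, which is where the development-speed claim comes from.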

RB: The role of the PC has changed with the use of smaller, more mobile computers. How will National Instruments change with these times?

TD: The PC is the fundamental engine for our acquisition, analysis, and presentation, and it’s part of our strategy to roll with these changes. Connectivity options like USB (Universal Serial Bus) alone have been a great boon to the ease of use of computers; that is an example of a change in standards that we have to adapt to. The move towards smaller interfaces presents challenges, and this is one of the things that FPGAs can do for you – reduce size and increase functionality.

Typically, when you are automating and you have an iterative, processor-intensive application, buying a desktop is still the way to go. When you want to get more interactive with spot measurements and diagnostics, that’s where hand-helds are useful. So there are different modes of use. The PC is not going to go away; it basically becomes the server for all those devices – pagers and PDAs. When things like Bluetooth become a reality, those devices will still need a central hub, because of the PC’s storage, processing, display, and connectivity capabilities.

RB: What’s going on in your R&D centres?

TD: In Germany we have a product called DIAdem, which is a higher-level software package for managing, analysing, and storing the data you acquire. China has competencies in signal processing and looks after some of our communications products. India is just getting started, helping out with our IVI instrument driver standard as well as some signal processing.

RB: Is there life after LabVIEW?

TD: We have expanded LabVIEW in several directions and we believe there are a few more to go. The real-time dimension is one of them: a recent introduction that allows the LabVIEW development environment to be targeted at an Intel processor.

TD: We have also had demonstrations of LabVIEW running on and programming FPGAs – dynamically configuring hardware. We’ve added levels of LabVIEW to handle high-channel-count, large-scale applications. When you get to thousands of points from a team of twenty developers, it becomes more of a configuration challenge than a programming challenge. The core architecture of LabVIEW has by no means run out of fuel.