
The virtual approach to hardware and software


Separating the software from the hardware components can boost the speed and flexibility of instrumentation, discovers Sian Harris

Mobile phone technology changes frequently, which poses major challenges for manufacturers. They need to ensure that their phones work properly with each new standard while meeting the price and time-to-market expectations of their customers.

The traditional solution to ensuring the quality of such devices is to buy dedicated test equipment for each new telecoms standard, but this becomes expensive when standards change so frequently. What’s more, waiting for test manufacturers to develop new test equipment for a new standard can hold up the process of getting new phones to market.

However, some mobile phone manufacturers are becoming interested in an alternative approach, according to Ian Bell of National Instruments. ‘Test equipment hardware generally operates up to radio frequencies of 3GHz – a range that covers all the cellular telecoms standards and most of the other wireless technologies currently included in phones,’ he says, explaining that it is the test equipment’s internal software that determines with which standard the instrument works.

The alternative, which National Instruments has been promoting for the past 20 years, is to separate these instrumentation components: instead of relying on an instrument's proprietary software to control the experimental setup, data acquisition and analysis, users can turn to third-party software running on a computer platform of their choice. This approach is called virtual instrumentation (VI). ‘VI is a software-defined system where user requirements dictate modular hardware,’ explains Bell. ‘VI techniques are able to improve product quality and we use these techniques ourselves in our own product design.’

So why does the VI approach bring benefits? One of the main reasons is that everybody tends to use instruments differently; so, one person might need an oscilloscope, for example, to record a signal that is changing very rapidly, while another might be tracking something that changes very slowly, but requires greater sensitivity. Another user might then need to do a Fast Fourier Transform (FFT) on the signal to observe patterns. The software and hardware in the instrument might offer a limited set of options, so different equipment might be required for each application.
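The FFT mentioned above is the kind of analysis step that VI software performs on an acquired signal. As a minimal illustration of the idea, not of any vendor's toolkit, the sketch below runs an FFT on a synthetic signal in Python with NumPy and recovers its dominant frequency:

```python
import numpy as np

# Synthetic 'acquired' signal: a 50 Hz sine wave plus noise, sampled at 1 kHz
fs = 1000.0                       # sampling rate in Hz
t = np.arange(0, 1.0, 1.0 / fs)  # one second of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.2 * np.random.randn(t.size)

# The FFT exposes the frequency content hidden in the time-domain samples
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)

peak = freqs[np.argmax(spectrum)]
print(f"Dominant frequency: {peak:.0f} Hz")
```

In a real VI system the synthetic signal would be replaced by samples read from an analogue-to-digital converter; the analysis code itself is unchanged, which is the point of separating software from hardware.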

‘Traditional benchtop instruments are the lowest common denominator in one box,’ points out Bell. ‘Many modern instruments have a PC at their heart anyway. We let customers choose their PC and add their own modular hardware components such as analogue-to-digital converters.’

Bringing hardware together

The advantages of this go beyond replicating the way that a single traditional instrument performs its tasks. ‘It doesn’t just mimic the individual functions of traditional instruments, but collects these functions together,’ says Bell. VI can bring both modular instruments and traditional instruments connected through buses together in one system, which means that one system could be monitoring several things in an experiment at the same time. The advantages of this are obvious, according to Bell. ‘The VI approach works faster than the traditional approach, so it can test faster or test more in the same time,’ he says.

NI’s approach to virtual instrumentation is through LabVIEW. This is a graphical programming language, which Bell says gives users speed, flexibility and control. An obvious application area for this is in test and measurement, but it has other uses.


Coleman Technologies uses virtual instrumentation from National Instruments in a drug discovery process

‘Our initial focus with this idea of VI was what we’d class as test, but it can be used anywhere where measurements are done and require some level of automation,’ he points out. Newer application areas for the idea of software-defined systems include control systems and system design. ‘With this concept you could move from PCs to industrial ruggedised hardware for control in chemical plants, for example, or to rapid prototyping of products,’ says Bell.

One example of how VI is being used is in the drug-discovery process. A major pharmaceutical company contacted one of National Instruments’ partners to find a way to automate the identification of protein crystals suitable for study by X-ray diffraction. The solution was to use LabVIEW software with image acquisition and motion control hardware.

Another VI application is vehicle design. A team from Virginia Polytechnic Institute and State University in the USA used code developed in LabVIEW, along with sensor products, to perform GPS navigation, obstacle avoidance and road following in two cars that it made for a recent US competition.

Financial benefits

For Measurement Computing, one of the key advantages of virtual instrumentation is cost. This independently-run subsidiary of National Instruments has a different approach from its parent company. It was founded in 1999 as a spin-out of a data acquisition computing company. ‘We saw a market for PC-based data acquisition,’ explains Bill Kennedy, vice-president for sales and marketing. ‘The cost of data acquisition was rising with the market, but the cost of PCs was falling. We wanted to make sure that we have software for all budgets.’

With Measurement Computing’s VI software, such as DASYLab, the emphasis is on being able to use the tools out of the box without needing to do any programming. ‘We have a lower-cost, easy-to-use solution for end users who may not have the money or knowledge base to use more complicated platforms,’ says sales engineer Rob Segnatelli.

Nonetheless, some customers, particularly OEMs, do like to customise their tools for specific customers, according to the company. One example of this is from a company called Immersion Medical, which makes simulators that allow healthcare providers to practise procedures in an environment that poses no immediate risks to patients and avoids animal use. These are traditionally based on very expensive instrumentation, but that approach does not scale well when the systems become more complicated.

Such an approach also becomes very expensive. The solution was to use one of Measurement Computing’s low-level drivers to write a C program and create a common, multi-function data acquisition system that provides scalability so that channels can be added easily. According to Immersion Medical, this approach saved six man-months on the project.
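The design idea behind such a scalable system can be sketched independently of any particular driver. The Python sketch below (hypothetical, not Measurement Computing's actual API) keeps channel configurations in a list, so adding a channel is a single call rather than a hardware redesign; the low-level driver read is represented by a stand-in function:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Channel:
    number: int
    gain: float

class DAQSystem:
    """Channels live in a list, so the system scales by simply appending."""

    def __init__(self, read_raw: Callable[[int], float]):
        self._read_raw = read_raw        # stand-in for a low-level driver call
        self._channels: List[Channel] = []

    def add_channel(self, number: int, gain: float = 1.0) -> None:
        self._channels.append(Channel(number, gain))

    def read_all(self) -> List[float]:
        # One common read path serves however many channels are configured
        return [ch.gain * self._read_raw(ch.number) for ch in self._channels]

# Simulated driver: returns a fixed value per channel number
daq = DAQSystem(read_raw=lambda n: 0.5 * n)
daq.add_channel(0)
daq.add_channel(1, gain=2.0)
print(daq.read_all())  # [0.0, 1.0]
```

The real system was written in C against the vendor's driver, but the structural point is the same: because the channel list drives the acquisition loop, growing the system means adding configuration, not rewriting code.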

Beauty in simplicity

Data Translation sees simplicity as one of the most important features of its virtual instrumentation platform, Measure Foundry. ‘Virtual instrumentation means that customers can concentrate on visualisation and the tests that they want to do without having to worry about programming,’ says Winfried Klass, who is vice-president of international sales at Data Translation. ‘They can just drag and drop.’

‘Some products have left the programming for the user to see but with Measure Foundry the user sees a very clean, user-friendly interface,’ adds Neil Chapman, the company’s sales manager. ‘With some products, users have to physically connect wires to each other, but in our approach connections are implicit, making it easier to use and to return to in a month’s time.’

Chapman explains that the software comes with a library of applications and that these examples can be modified and saved to suit the user’s requirements. For example, there is an oscilloscope template included, but this could be altered to include four input channels rather than one.

Pushing the limits

All these advantages of flexibility, speed, cost and simplicity sound ideal, but inevitably there are some limits to the applications that can enjoy them.

As Chapman of Data Translation explains, ‘It’s not necessarily a question of competing with hardware makers. The oscilloscope is only one bit of a test and measurement system.’ And his colleague Winfried Klass continues: ‘We have an FFT converter in Measure Foundry, but I can’t say that it can do all that the FFT of a dedicated, expensive instrument could do. It isn’t possible to do what a high-end oscilloscope does just with data acquisition modules and software, because they work at much higher frequencies.’

According to Measurement Computing, benchtop instruments are becoming much more ‘high end’, with very high-speed acquisition rates that today’s computers could not cope with. ‘In the GHz and above range, with RF and microwave applications, there is not much out there for VI,’ says Segnatelli.


A virtual instrumentation solution from Measurement Computing

Programming the hardware

But even the very high-speed applications may soon be achievable by VI, according to National Instruments’ Bell. ‘The future is software-defined programming of the hardware itself,’ he predicts. ‘Reconfigurable systems are typically very much faster than CPUs. They also can’t be disturbed by, and can’t crash because of, Windows.’

And this is an application area that the company is already seeing emerge. For example, OptiMedica in the USA has used this approach to improve the treatment of sight problems in diabetics. Previous systems have involved manually locating points on the retina to treat with lasers, but this is a lengthy and painful process for the patient. As speed is important in reducing the discomfort to the patient, the company turned to the LabVIEW field programmable gate array (FPGA) module to create custom hardware for its new system, which burns patterns of points onto the retina simultaneously so that the treatment is quicker.

‘With a single graphical development platform, we were able to design and prototype the machine quickly and efficiently using customisable off-the-shelf PXI hardware and successfully demonstrate the system to potential investors,’ explains Michael Wiltberger of OptiMedica. ‘Using an FPGA in this application provides the reliability of a hardware solution, which does not require the same level of code reviews as processor-based systems when obtaining FDA approval. The decision to use programmable silicon as opposed to a fixed ASIC chip also reduced our development time by 30 per cent.’

And FPGAs are not the only change that could boost the power of virtual instrumentation. Bell also points out that the new trend towards multi-core processors in PCs has the potential to boost their power still further if the software is able to work with multiple processors. ‘LabVIEW has the ability to enable users to create applications that can do more than one thing at once,’ he says.
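LabVIEW expresses this parallelism graphically, but the general idea of an application doing more than one thing at once can be illustrated in plain Python. The sketch below (an illustration of concurrency in general, not of NI's implementation) issues two simulated instrument reads at the same time, so the total wall-clock time is roughly that of a single read rather than the sum of both:

```python
from concurrent.futures import ThreadPoolExecutor
import time

def acquire(channel: int) -> str:
    """Simulated acquisition: each read blocks briefly, as real I/O would."""
    time.sleep(0.1)
    return f"channel {channel} read"

# Two reads issued concurrently complete in about one read's time
start = time.monotonic()
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(acquire, [0, 1]))
elapsed = time.monotonic() - start

print(results)
print(f"elapsed: {elapsed:.2f} s")
```

For CPU-bound signal processing, rather than the I/O-bound waiting simulated here, a process pool (or software that genuinely spreads work across cores, as Bell describes for LabVIEW) would be needed to benefit from multiple processors.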

With so many opportunities on the horizon for virtual instrumentation, many more markets are likely to adopt this approach as time goes on.