Keeping up with technology
Widespread adoption of the PC over the past 20 years has given rise to a new way for scientists to measure and automate the world around them - virtual instrumentation. Today, virtual instrumentation is coming of age, with engineers and scientists using 'virtual instruments' in literally hundreds of thousands of applications around the globe, resulting in faster application development, higher quality products, and lower costs. A 'virtual instrument' consists of an industry-standard computer or workstation equipped with off-the-shelf application software, cost-effective hardware such as plug-in boards, and driver software - which together perform the functions of traditional instruments.
Although PC and integrated circuit technologies have experienced significant advances in the past two decades, it is software that makes it possible to build on this foundation to create virtual instruments. Engineers and scientists are no longer limited by traditional fixed-function instruments, but can now build measurement and automation systems that suit their needs exactly.
Moving to a more flexible, cost-effective system
Virtual instruments represent a fundamental shift from traditional hardware-centred instrumentation systems to software-centred systems that exploit the computing power, productivity, display, and connectivity capabilities of popular desktop computers and workstations.
Over the past two decades, PC performance has improved by a factor of 10,000 while prices have decreased dramatically. These advances not only make virtual instrumentation a cheap and flexible solution, but also deliver productivity gains unmatched by stand-alone, proprietary systems. Traditional instruments such as stand-alone oscilloscopes and waveform generators are powerful, expensive, and designed to perform one or more specific tasks defined by the vendor. The user generally cannot extend or customise them.
The knobs and buttons on the instrument, the built-in circuitry, and the functions available to the user are specific to the nature of the instrument. In addition, special technology and costly components must be developed to build these instruments, making them very expensive and slow to adapt.
Because they are PC-based, virtual instruments inherently take advantage of the benefits from the latest technology incorporated into off-the-shelf PCs. These advances in technology and performance, which are quickly closing the gap between stand-alone instruments and PCs, include powerful processors such as the Pentium 4, and operating systems and technologies such as Microsoft Windows XP, .NET, and Apple's Mac OS X.
In research and design, engineers and scientists demand rapid development and prototyping capabilities. Using virtual instruments, they can develop a program, take measurements from an instrument to test a prototype, and analyse results in a fraction of the time required to build tests with traditional instruments. When flexibility is a requirement, a scalable open platform is essential - from the desktop, to embedded systems, to distributed networks. Engineers and scientists can easily modify or expand virtual instrumentation systems to adapt to specific needs without having to replace the entire device.
This modularity results from the wide variety of low-cost, plug-in hardware available for virtual instrumentation systems. Data acquisition boards are ideal for a wide range of applications in the laboratory because they provide reliable measurements cheaply. From digital multimeters, to high-speed digitisers, to RF measurement devices, there is a wide range of modular computer-based devices that deliver data acquisition capabilities more cheaply than dedicated devices.
As integrated circuit technology advances, and off-the-shelf components become cheaper and more powerful, so do the boards that use them. With these advances in technology comes an increase in data acquisition rates, measurement accuracy, and precision.
Software - the cornerstone of virtual instrumentation
Thomas Edison is widely recognised for innovations such as the phonograph and the incandescent light bulb, which he created working with small teams. Over the years, however, R&D has become the domain of large projects requiring hundreds, if not thousands, of engineers and scientists working together, often without fully understanding the complete scope of their project.
Only recently, with the emerging productivity advantages of virtual instrumentation, are we able to return to small teams. The power of the personal computer, the versatility of modular hardware, and the productivity of off-the-shelf, engineering-focused software give small groups of designers greater creative freedom and, through collaborative engineering, allow them to build advanced systems.
Virtual instrumentation has led to a simpler way of looking at measurement systems. Instead of using several stand-alone instruments for multiple measurement types and performing rudimentary analysis by hand, engineers and scientists now can quickly and cheaply create a system equipped with embedded analysis software and a single measurement device that has the capabilities of many instruments. Powerful, off-the-shelf software makes this possible. This software automates the entire process, delivering an easy way to acquire, analyse, and present data from a personal computer without sacrificing performance or functionality.
National Instruments' own virtual instrumentation software, LabView, for example, allows scientists to custom-design virtual instruments by means of the graphical user interface, through which they can operate the instrumentation program, control hardware, analyse acquired data, and display results. The similarity between standard flow charts and the data-flow programming nature of LabView shortens the learning curve associated with traditional, text-based languages.
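LabView programs are graphical rather than textual, but the acquire, analyse, display dataflow that a block diagram expresses can be sketched in ordinary code. The following Python sketch is purely illustrative and is not LabView code: the `acquire` function is a stand-in for a real hardware driver call, and the measurement names are invented.

```python
import math
import statistics

def acquire(num_samples, sample_rate_hz):
    """Stand-in for a hardware driver call: synthesises a 50 Hz sine wave."""
    return [math.sin(2 * math.pi * 50 * n / sample_rate_hz)
            for n in range(num_samples)]

def analyse(samples):
    """Reduce the raw waveform to a few summary measurements."""
    return {
        "mean": statistics.fmean(samples),
        "rms": math.sqrt(statistics.fmean(s * s for s in samples)),
        "peak": max(abs(s) for s in samples),
    }

def display(results):
    """Present the results - on a real front panel this would be a gauge or chart."""
    for name, value in results.items():
        print(f"{name:>5}: {value:.4f}")

if __name__ == "__main__":
    display(analyse(acquire(num_samples=1000, sample_rate_hz=1000.0)))
```

In a block diagram these three stages would be wired nodes, with data flowing along the wires; the text version simply makes the same pipeline explicit as function composition.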
Designed for creating test, measurement, and control systems, the package can communicate with hundreds of devices, incorporating ready-to-use libraries for integrating stand-alone instruments, data acquisition devices, motion control and vision products, GPIB/IEEE 488 and serial/RS-232 devices, and PLCs, among others.
It is also an open development environment. The value of standardising on a software package depends greatly on its ability to work well with other software, with measurement and control hardware, and with the open standards that define interoperability between multiple vendors.
By selecting software that meets these criteria, scientists ensure that their company and applications take advantage of the products offered by several suppliers. In addition, conforming to open commercial standards reduces overall system cost. Many third-party software and hardware vendors develop and maintain LabView libraries and instrument drivers.
Virtual instrumentation requires comprehensive analysis and signal processing tools, because the application does not stop when the data is collected. While LabView can quickly process data a point at a time, it also integrates with another piece of NI software, Diadem, which is a configuration-based software environment that can manipulate sets of more than a billion data points.
In Diadem, a wizard shows engineers and scientists how to choose an appropriate layout and quickly generate a report. Scientists can correlate data from different channels, sources, and signals, then use a number of interactive tools, such as FFT analysis, for detailed visual examination of the data. Using cursors, scientists can analyse data point-by-point while opening interactive windows that zoom in and out of graphs for varying amounts of detail. They can automate standard tasks needed to create frequently used reports and charts.
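As an illustration of the kind of frequency-domain analysis described above, the short Python sketch below applies a naive discrete Fourier transform to a synthetic two-tone signal and picks out its dominant frequencies. This is not Diadem or LabView code, just a stdlib-only stand-in for the FFT tools such packages provide; the signal and all names are invented.

```python
import cmath
import math

def dft_magnitudes(samples):
    """Naive O(n^2) discrete Fourier transform; fine for short records."""
    n = len(samples)
    return [
        abs(sum(x * cmath.exp(-2j * math.pi * k * i / n)
                for i, x in enumerate(samples))) / n
        for k in range(n // 2)
    ]

# Synthetic signal: an 8 Hz tone plus a weaker 32 Hz tone, sampled at 256 Hz.
sample_rate = 256.0
n = 256
signal = [math.sin(2 * math.pi * 8 * i / sample_rate)
          + 0.5 * math.sin(2 * math.pi * 32 * i / sample_rate)
          for i in range(n)]

mags = dft_magnitudes(signal)
peaks = sorted(range(len(mags)), key=mags.__getitem__, reverse=True)[:2]
print("dominant frequencies (Hz):", sorted(k * sample_rate / n for k in peaks))
# prints: dominant frequencies (Hz): [8.0, 32.0]
```

A production FFT is far faster than this naive transform, but the analysis step it enables, locating the frequency content of a measured signal, is the same.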
Moving beyond the PC
Real-time and embedded control have long been the domain of specialised programs. Now, advances in industry-standard technologies, including more reliable operating systems, more powerful processors, and computer-based real-time engineering tools, are introducing new levels of control and determinism to virtual instrumentation.
This presents new opportunities for scientists to take on increasingly sophisticated real-time and embedded development. Software such as LabView Real-Time scales from development on the PC to real-time and embedded applications. Scientists can move into new application areas without a steep learning curve because the software itself evolves to incorporate emerging computer technologies.
In addition, the Internet has ushered in a new age of data sharing, and has spurred networking and remote computing capabilities in virtual instrumentation that are simply not possible with stand-alone proprietary instruments. Virtual instrumentation takes advantage of the Internet so scientists can easily publish data to the Web directly from the measurement control device, and read data on a handheld personal digital assistant, or even on a mobile phone. Through virtual instrumentation, scientists can use the power of the Internet to control instruments remotely, or collaborate on projects with colleagues in separate offices or countries.
Advances in sensor technology also promise new developments in virtual instrumentation. A proposed sensor standard, IEEE P1451.4, defines new 'smart' analogue sensors that contain an embedded memory chip with standardised transducer electronic data sheets (TEDS) that store sensor information and parameters for self-identification and self-description. The sensors include serial digital links for accessing this information for plug and play operation, as well as analogue signals for backwards compatibility with traditional measurement systems. Using these smart sensors, scientists and engineers can take advantage of improved system configuration and diagnostics, reduced downtime, and improved sensor data management.
Already, software and hardware are available to evaluate, use, and develop technology based on this sensor standard. For example, the TEDS library for LabView, an online database of virtual instruments (VIs), makes it easy for engineers to integrate analogue smart sensors into their applications. The VIs implement basic TEDS management functions for reading and decoding TEDS sensors, and editing and recompiling TEDS data according to IEEE P1451.4 specifications. The NI Plug & Play Sensor Development Kit uses the TEDS library, combined with data acquisition and signal conditioning hardware, to create measurement systems that communicate with both the analogue and digital portions of smart TEDS sensors, read and manage TEDS data, and create and reprogram sensors.
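The self-identification described above starts with the Basic TEDS block, a 64-bit structure holding the sensor's manufacturer, model, version, and serial number. The Python sketch below unpacks such a word. The field widths follow the Basic TEDS layout published with IEEE P1451.4, but the flat most-significant-field-first packing and the sample values are simplifications of mine; real TEDS memory uses a defined serial bit order.

```python
# Basic TEDS identity fields and their bit widths; widths sum to 64 bits.
BASIC_TEDS_FIELDS = [
    ("manufacturer_id", 14),
    ("model_number", 15),
    ("version_letter", 5),   # 5-bit alphabetic code
    ("version_number", 6),
    ("serial_number", 24),
]

def decode_basic_teds(raw64):
    """Unpack a 64-bit Basic TEDS word into its five identity fields.

    Assumes fields are packed most-significant first, a simplification
    of the serial bit order real TEDS chips use.
    """
    out = {}
    shift = 64
    for name, width in BASIC_TEDS_FIELDS:
        shift -= width
        out[name] = (raw64 >> shift) & ((1 << width) - 1)
    return out

def encode_basic_teds(**fields):
    """Inverse of decode_basic_teds, handy for building test words."""
    raw = 0
    for name, width in BASIC_TEDS_FIELDS:
        raw = (raw << width) | (fields[name] & ((1 << width) - 1))
    return raw
```

A sensor driver would read the 64 raw bits over the sensor's serial link, decode them as above, and then look the manufacturer and model up in a TEDS template database to configure the measurement automatically.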
Sensor vendors are also exploring how to expand plug and play capabilities to legacy sensors. Through a proposed online database of sensor vendors' model data, users could download TEDS binary files, or Virtual TEDS. With Virtual TEDS, they could take advantage of new sensor technology with their traditional computer-based hardware, providing a smooth transition to the next generation of measurement and automation systems. These advances in sensor and real-time technology are just the first wave of innovations that will drive virtual instrumentation in the future. Advances in commercial technology will continue driving virtual instrumentation to new heights.
The performance advances will be easier to implement, saving valuable development and integration time while reducing costs over traditional instrumentation solutions. No one can predict exactly where the future will take virtual instrumentation, but one thing is clear - the PC and its related technologies will be at the centre, and engineers and scientists will be more successful as a result.
Gricha Raether is LabView Senior Product Manager at National Instruments