Unscrambler X

Since Unscrambler last passed this way, then at version 9, its Norwegian publisher Camo Software has launched a 'new generation' image. At its heart is Unscrambler X (currently 10.1, the third sub-version in numerical release terms; some limitations in the original 10.0.0 have been resolved at this point), with additional products to extend its reach and power in particular directions. My review is based on a month's use of the 64-bit option, although a quick run-through on a one-gigabyte Windows XP system showed the 32-bit alternative to be completely happy there. I'll combine a quick overview, for those who've not encountered Unscrambler in its previous incarnations, with a skim of what's new.

There is a marketing emphasis on chemometrics, and I explored the software through a clay elemental composition matrix, but the approach is completely general and applicable to any discipline. Colleagues tried out my review copy on archive studies in manufacturing faults, infection transmission, language mutation and stressed environment populations with equal success.

The result is a different user interface philosophy from the usual market pattern. This starts in the menu bar, where in place of the familiar 'Statistics' (or similar) is a 'Tasks' heading, below which drop down 'Transform', 'Analyze' and 'Predict' options. Plotting, on the other hand, is instantly familiar – from clicking 'Plot' on the top menu for a gallery from which to pick the required type through to responsive mouse-driven rotation of three-dimensional displays.

You can perfectly well use Unscrambler X to carry out tasks other than multivariate, but they are not the product focus. Asking for simple descriptors like the arithmetic mean, for instance, will typically involve deselecting other options (rather than selecting those required), and explicitly choosing or excluding the variables and cases to be included.

Cutting through complex data sets to underlying structures, however, is simplicity itself. Principal components, dendritic cluster and linear discriminant analyses, four regression approaches, support vector machine classification and multivariate curve resolution are all handled through an intuitive and consistent interactive dialogue system echoed through all three investigation phases.
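To give a flavour of what principal component analysis does under the hood, here is an illustrative sketch in Python/NumPy – not Unscrambler's own code, and the composition figures are invented: mean-centre the data matrix, take its singular value decomposition, and scores, loadings and explained variance fall straight out.

```python
import numpy as np

# Toy "elemental composition" matrix: 6 samples x 4 variables.
# (Invented numbers for illustration only.)
X = np.array([
    [12.1, 3.4, 55.2, 1.1],
    [11.8, 3.6, 54.9, 1.0],
    [20.5, 7.1, 40.3, 2.2],
    [21.0, 6.9, 39.8, 2.4],
    [15.9, 5.0, 47.6, 1.7],
    [16.3, 5.2, 47.1, 1.6],
])

# Mean-centre, then decompose: Xc = U * S * Vt.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = U * S                    # sample coordinates on the PCs
loadings = Vt                     # variable contributions to each PC
explained = S**2 / (S**2).sum()   # fraction of variance per PC

print(np.round(explained, 3))
```

The first component's share of the variance tells you at a glance how close the data set is to lying along a single underlying direction.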

This emphasis has been further developed in the new-look version X, with user interaction through the project-based interface refined and simplified at various points. Definition of DoE variables, for instance, has been simplified to a single table from the previous arrangement spread across three different windows.

DoE has been overhauled behind the scenes in a number of other ways. Partial Least Squares, though still available, has given way to Scheffé polynomials as the default approach to mixture design analysis. D-optimal constraint handling has been improved. Plotting of response surfaces has been polished, as have various other points of detail.
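For readers unfamiliar with the technique, a Scheffé polynomial models a mixture response without an intercept: linear terms for the pure components plus cross-terms for binary blending, with the component fractions constrained to sum to one. The sketch below (NumPy, with made-up responses – not Camo's implementation) fits the quadratic form to a three-component simplex-lattice design by least squares.

```python
import numpy as np

# Simplex-lattice design points for a 3-component mixture
# (each row of fractions sums to 1).
X = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [0.5, 0.5, 0.0],
    [0.5, 0.0, 0.5],
    [0.0, 0.5, 0.5],
])
y = np.array([4.0, 6.0, 5.0, 7.5, 4.2, 6.8])  # invented responses

x1, x2, x3 = X.T
# Scheffé quadratic model: no intercept; linear + binary blending terms.
M = np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)

print(np.round(coef, 3))
```

Because the model has no intercept, the three linear coefficients recover the pure-blend responses directly, while positive cross-terms indicate synergistic blending and negative ones antagonism.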

Attention to detail is a theme reflected throughout the new release, aggregating into strong usability and value-added gains. Most areas of the program see tweaks, improvements, new options and techniques. A new transform (median absolute deviation for quantile normalisation) here, per-segment prediction diagnostics there. Improved model portability and flexibility. Extended matrix facilities. Various plotting adjustments. Behaviour has been brought more closely into line with Windows conventions, including clipboard operations, increased use of shortcuts and convenient extras such as one-click inversion of a range definition which, while it sounds simple, quickly becomes a valuable workflow accelerator.

