
Benefits of being under new management

Some software products are created, developed, and supported by their user communities - a realisation of the anarchist ideal. Others are developed and maintained, over their long lives, by the companies which first created them. Others, however, are passed from hand to hand across the years, from one owner to another. This is true of software in general, and it is equally true of scientific computing tools; science, like everything else, comes down to business in the final analysis.

There is no rule that says one course of development is better or worse than another. Staying in-house (whether user-supported or commercial) may mean consistency or it may mean stagnation; equally, passing through many hands can bring the neglect of short-term attitudes or the richness of cross-pollination. And, of course, either course can mean a mixture of many things. I can think of several products that I have used over my time with Scientific Computing World, which have benefited from a change of hands. Several more, including some brilliantly executed gems, entered the maw of acquisition or merger amid much talk of investment and marketing commitment, only to languish and die.

There are, of course, many reasons for acquiring a product. It may be a side effect of merger for other reasons. It may be in order to buy a ready-made head start in a new field: Microsoft, for instance, are well known for this strategy. It may be a way to broaden the product range, bringing a new user-base into the fold alongside existing, complementary offerings: Corel's acquisition of the WordPerfect suite (including the Borland products previously acquired by WordPerfect Corporation to complement their eponymous word processor) is an honourable example. Or it may be in order to kill off a competitor, the relative merits or failings of the acquisition being irrelevant. There are times, too, when it seems that the buyer is not entirely sure why the purchase is being made, and the product enters choppy seas of uncertainty.

Systat 10.2's data sheet (background top left), output organiser (foreground bottom right) and graphic output sheet (centre).

Two statistical analysis products that have made very different odysseys through multiple ownership are Genstat (now under the direction of VSN International - hereafter referred to as VSN) and Systat (owned by Systat Software Inc - SSI from now on). Both have scientific computing markets, but with different profiles. Both have recently moved under new management. Both have completed initial cycles in their new homes that seem to bode well for the future. Although these two packages are their flagships, neither VSN nor SSI is a one-product company. Genstat, a broad and massive asset on several platforms, has an initial VSN stable mate in the more specialised ASReml (a self-contained toolkit for residual maximum likelihood methods). Systat is joined in SSI's opening repertoire by Autosignal (a collection of signal-handling tools), Peakfit (exactly what it sounds like!), Tablecurve 2D and Tablecurve 3D (products which, for two and three dimensions respectively, fit approximating functions to datasets).

A selection of views during Peakfit 4.11 analysis of a concussive disruption event, including smoothing (top left) baseline elimination (bottom right background) and third derivatives (top right).

With one exception, I've had all these products hard at work on various tasks for some months now. Most of them were involved in late stages of the search for a pollution source last summer (The Return of the Swamp Thing, SCW September 2002), and have moved on to other projects since. The one exception, ASReml, is an interesting case: most of the REML tools are incorporated into Genstat, but with a delay (the stand-alone product receives REML upgrades first), and VSN also provides them as a module for Mathsoft's S-Plus. That last fact seems an appropriate continuity link for me to say that I do not see any of these products as competitors (though their companies may not agree).

Genstat and Systat, for example, are different analysis products with different philosophies for different users. In my own work, I use three separate generic analysis products for most things; this is not a reflection on any of them - I just see each of them as the best possible choice for a different set of my own requirements.

One of the first things to look for in any new owner is evidence of positive planning for the future maintenance and development of the product. Both companies invite confidence in this respect. SSI opened with a full-scale revamp of Tablecurve 3D, a worthwhile upgrade to Systat, and incremental developments elsewhere. Systat itself is in the process of a major 'version number' upgrade (from 10.x to 11.0), due for release in the third or fourth quarter of 2003. VSN launched a new major version of Genstat, harmonising version numbers (at 6.1) across platforms regardless of previous nomenclature. There is a declared VSN development calendar for Genstat: major releases in May of each year (you can expect version 7, therefore, about three months after you read this; see below), and a service pack in September.

Another indicator is the attitude (or lack of it) shown by those associated with the product's past development; here, again, both companies gather brownie points. Roger Payne, involved with the development of Genstat from the very beginning, remains a driving force in VSN as its R&D director and deputy MD. Lee Wilkinson, Systat's original author, gives his blessing to SSI's acquisition of it from the company which now employs him. Ron Brown, who authored Peakfit and the two Tablecurves, has issued a very warm public endorsement of SSI's commitment to the products, and continues to feed their development and upgrade programmes.

So much for rosy present and future, but the pasts of the two product ranges have been considerably different, and so are the two companies themselves. Both are, in a sense, 'spun off' companies, and both are characterised by infectious enthusiasm; in other ways they have little in common.

VSN is the offspring of NAG (the respected, not-for-profit Numerical Algorithms Group) and the Institute of Arable Crop Research, Rothamsted Experimental Station (RES for short). Genstat's history is a tidy one, showing all the right moves. Born at RES, it was the love-child of dedicated researchers doing groundbreaking work in the development of computerised applied statistics. In its way, it was instrumental in shaping the world as we now know it. A stable, supportive upbringing in the RES home took it to a respected position with a loyal user base - primarily, though not exclusively, in the life sciences and within the British Commonwealth. It then left home to be nurtured and marketed more widely by NAG, which oversaw the introduction of a GUI for the Windows/PC platform. Now it moves out of the 'not for profit' cradle as the centrepiece of a dedicated marketing company within the same clan structure.

With Payne at the helm of VSN is CEO Roope Aaltonen. Finnish-born, US- and UK-educated, Aaltonen prompts loyalty in everyone to whom I talked. All efforts to get attributable, on-the-record background information on him that is not already in the company's public material came to nothing but, reading through material garnered here and there from his past career, he is obviously an infectious motivator. 'The new regime is exciting, forward thinking and a joy to work for!' was a typical comment from within VSN; the same person added that '...having him and Roger so prominent gives ... the double edge ... Roope is one of those rare individuals ... genuinely a terrific manager of people. People are given the freedom to express ideas ... to work in the manner that is beneficial to all. This, and the relationship between the whole team, make VSN a good and exciting place to be.'

SSI, although nominally based in California, is part of India's burgeoning and energetic high-technology sector - apparently an offshoot of Crane's Software. Most of its products were originally marketed by companies representing their respective authors, and made dramatic additions to the numerical computing landscape. Systat itself was a heavyweight engine with particular strengths in the social sciences, with manuals which were better than many textbooks. Tablecurve brought usable modelling to users for whom realistic computing solutions were previously absent. There are limits to what such companies can do to promote their creations in a world dominated by large corporations; although they pushed those limits, in the end it was necessary to seek marketing partnerships. Wilkinson's Systat went straight to data giant SPSS (where I, like many of its users, hoped that it was about to blossom) and became its property. Brown's set of products arrived there via a previous home at Jandel Scientific, itself bought out by SPSS in 1997. In this case, though, his own company retained ownership until the recent sale to SSI.

SSI does not discuss the previous history of its products, but some users of those products, and others unconnected with the company, suggest that the years with SPSS were characterised by neglect. This may be unfair. Systat made the transition from 16- to 32-bit under SPSS, improved its interface considerably, and gained several new modules crucial to future growth, although this was followed by a plateau, and user support did not seem to keep up. Ron Brown comments, in connection with his Tablecurve 3D, that 'the development partnership ... for the ongoing evolution of these products did not come to fruition ... AISN [Brown's own company] continued [its] development and maintenance...' and several Systat users asserted that the transfer to SSI had come, as one put it, 'just in time to prevent my defection'. Praise for SSI's handling of the products since acquisition is pretty much universal. Not everything went right in the run-up to reviewing SSI's products; as a start-up company, it still has some hiccups and some things to learn. But always, behind those hiccups, as I talk to people there I can feel a determination to get things right and keep them moving - qualities which many a more established company would do well to emulate.

Systat has seen a respectable upgrade since the move to SSI. Much of this is improved capability or execution in existing areas, which may not be headline-grabbing but makes a real difference where it matters. Some features have been added, but they remain within the existing ethos, which is reassuring. A separate command file editor, FEdit, is a welcome addition that opens from the 'View' menu (an unusual but logical position which says quite a lot about how the development team thinks about functional structure) or as a stand-alone program, which is very handy. An unusually easy way to set up an arbitrary number of n-tiles, as well as centiles, is welcome; working rapidly through a number of settings, at increasing resolution, is a good way to get an intuitive feel for dispersion. On a couple of occasions in the past months, this has been the source of early breakthrough insights, which significantly shortened the study cycle. Regression tools have been enhanced; Bonferroni and Sidak adjustments are available for multiple-measures studies; and, for those using BMDP, a dedicated menu appears.
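For readers who haven't met the two corrections Systat now offers, the underlying arithmetic is simple and standard. This Python sketch is not Systat code, just the textbook formulas: to hold a family-wise significance level across m simultaneous comparisons, Bonferroni divides the level equally, while Sidak uses the exact adjustment under independence and is slightly less conservative.

```python
# Standard per-comparison significance levels for m simultaneous
# comparisons at a family-wise level alpha (illustration only;
# this is the textbook arithmetic, not Systat's implementation).

def bonferroni_alpha(alpha: float, m: int) -> float:
    """Bonferroni: split the family-wise level equally across comparisons."""
    return alpha / m

def sidak_alpha(alpha: float, m: int) -> float:
    """Sidak: exact under independence, slightly less conservative."""
    return 1.0 - (1.0 - alpha) ** (1.0 / m)

if __name__ == "__main__":
    for m in (5, 10, 20):
        b = bonferroni_alpha(0.05, m)
        s = sidak_alpha(0.05, m)
        print(f"m={m:2d}  Bonferroni={b:.5f}  Sidak={s:.5f}")
```

For any m greater than one, the Sidak threshold is a shade larger than the Bonferroni one, which is why it is the less conservative choice when the comparisons can be treated as independent.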

While Systat has given its name to the company, and is inevitably seen as SSI's central pillar, the supporting acts are at least as important. All of these come from Ron Brown, a chemical engineer with a background in 3M, Eastman Kodak, and various Silicon Valley companies before he founded analysis software company AISN, which bred the products now in SSI's hands. His passion for his software vision is obvious, as is the benefit to his products. The new all-SSI release of Tablecurve 3D is the show-stopper (see below) but the 2D version is valuable too; it remains to be seen whether this will get the same dramatic treatment as TC3D.

Beneath the interface, a lot of serious thinking has gone into development of both; all assumptions are well founded and, where appropriate, cautious rather than flamboyant. In the six months I've had them, both Tablecurve products have seen strenuous use in a wide range of demanding contexts - from archaeology to ecology, communications to cleaning, pharmacological treatment planning to production line optimisation - and have never put a foot wrong.

Peakfit, like the Tablecurve packages, is a good, well-engineered model for what such a product can be - and has some unique features besides. It has better communications with other data handling or storage programs than many of its ilk, which is an added bonus. Autosignal, unlike the others, is not SSI property but a licence - with Brown and AISN retaining development control. Such tools are specialised, by definition, and there's not space to cover them here (though Peakfit is currently on duty with a project studying catastrophic changes of state, and Autosignal with a satellite data-cleaning study, which may bring them back into these pages later). Autosignal is still an early version (at 1.6) and will presumably grow significantly in the future, particularly in its interface, but already has an extensive capability in the upper reaches of its application type. Both responded well in field trials. The future looks bright for all of these tools, and so for market diversity.

Each company represents, for its main product, a change of outlook. Each product has in the past successfully colonised an evolutionary niche, living as a dietary specialist; both have now learned to be omnivores, and are emerging into the larger ecology of the analytical software market. Genstat is a large and powerful creature, its physique placing it squarely in eye-level company with the other giants already out on the plains; Systat is smaller and leaner, running in co-operation with a pack. Both have parent companies that show all the drive, belief, commitment, and confidence necessary for future success - which has to be good news for their own constituencies in particular, and scientific software users in general.

Tablecurve 3D

The two Tablecurve products are distinct, with differences which should not be glossed over, but they have much in common. Since the 3D version is the most recently and radically developed as an SSI release, it's the one on which I've concentrated here.

The first aspects you notice in a package are those relating to the user interface. In this case, you notice an MS Office XP look to the toolbars, which can be free-floating or locked to chosen positions that are saved from session to session. Placement and state of windows can also be saved and reimported.

Files are managed through the increasingly standard notebook arrangement, on the hierarchic, multi-pane model of Windows Explorer. The explorer occupies the left-hand frame, with fits in the main pane and numerics below. The main pane can be animated; I confess that I haven't actually found an application for this, but I've derived a lot of enjoyment from trying it out! For those who can remember how to fly an IntelliMouse (or similar) while concentrating on other things, surface graphs can be rotated in the tilt, roll and yaw planes (select the plane with a wheel click) using the scroll wheel. Personally, I get confused by this sort of thing, but everyone else I know took to it immediately. A complete XLS file appears as a single data source at left, with its contained data sets depending from it; multiple fits can be added to the same source. Drag and drop is extensively implemented; moving, appending, copying and repeating an operation on new data can all be managed with the mouse.

The relationship with MS Office is functional as well as cosmetic: data can be imported from Excel releases up to 2002 (XP) and saved to Excel 95 or 97/2000/XP formats for use elsewhere, and most output can be sent to MS Word 95 or later (via streamed real-time export to the application itself) or to an RTF file. Direct links to a database would be nice to see in a future release.

In the meantime, of course, it can be achieved with little hassle through an XLS intermediary file. With an XLS file as the data source, there is batch automation for multiple similar data sets to the same surface fit, estimation or smoothing; the automation is native to each procedure, requiring intervention only at the shift from one procedure to another. For example, in a project comparing fertility impact against the same pair of environmental variable sets across 37 study sites, the 37 sets of x,y,z triads were copied to a single sheet within a workbook. One automated Tablecurve 3D pass pre-processed the data, which were saved back to another Excel sheet for reference; a second pass fitted them to the selected model. The output was then sent to a Word document for distribution and discussion by the project team. There is DLL support for external acquisition if you want to write an instrument interface, or to 'hook' the procedures from inside other software, but Tablecurve 3D is, first and foremost, a superb tool for interactive 'cruising' in data exploration and discovery.

Graphics are now in a high 'photorealistic' resolution, with an improvement of better than four times over release 3. Colour rendering has been upped by a factor of five as well, giving impressively smooth and detailed surfaces and 'bandless' spectrum gradients.
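The batch idea described above - the same surface model applied in turn to many x,y,z data sets - can be sketched in ordinary code. The following is emphatically not Tablecurve's automation, just a minimal Python illustration using numpy, with an invented plane model z = a + bx + cy and invented data standing in for the study sites:

```python
# Generic batch surface fitting: fit the plane z = a + b*x + c*y
# to each of several (x, y, z) data sets by linear least squares.
# Illustration only - not Tablecurve code; model and data are invented.
import numpy as np

def fit_plane(x, y, z):
    """Return (a, b, c) minimising sum((a + b*x + c*y - z)**2)."""
    A = np.column_stack([np.ones_like(x), x, y])   # design matrix [1, x, y]
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three invented study sites, each a set of x,y,z triads.
    for site in range(3):
        x = rng.uniform(0, 10, 50)
        y = rng.uniform(0, 10, 50)
        z = 1.5 + 0.8 * x - 0.3 * y + rng.normal(0, 0.01, 50)
        a, b, c = fit_plane(x, y, z)
        print(f"site {site}: a={a:.2f} b={b:.2f} c={c:.2f}")
```

Looping the same fit over every data set, with no intervention between sets, is the essence of the batch pass; Tablecurve 3D does the equivalent natively, within each procedure, against sheets of an XLS workbook.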

Interaction with other software is not restricted to MS Office components. Tablecurve 3D reads data from SigmaPlot 2000 and 2001 formats, and writes back from the surface-fit review. The exported JNB notebook files, when opened in SigmaPlot, contain native 3D surfaces and scattergrams. Surface fit models can also be sent to Matlab as M-files (although I haven't applied this in a live situation, I tried it out extensively on the bench and it seems to work a treat) to C++ (I didn't try that one at all) and Java (while I didn't use this one myself, one of my students applied it to a landscape analysis assignment, and reported it to be 'dead easy').

At the business end, robust fitting is scale-invariant for Lz and P7 minimisations, using normalisation scaling. Curve selections are effortless. Evaluation addresses a single sorted list window on x, y, z or |C|, and exports in different formats tailored to best suit the target spreadsheet. Evaluation sequences can be saved, and update automatically to reflect equation and algorithm settings.

Next up in Genstat - hints at version 7
VSN are tight-lipped about details of the next release, but the general outlines and highlights show a continued intent to extend Genstat's traditional user base.

  • The arrival of new menus and analyses for six-sigma SPC, while obviously applicable to the 'Genstat home ground', nevertheless suggests a move onto new industrial territory. The same is true of new survival analysis tools.
  • Climatology (which has a natural affinity with agriculture, but has not been explicitly colonised by Genstat so far) is specifically mentioned in connection with facilities for the display and analysis of circular data.
  • Extensions to multivariate analysis and REML prediction facilities will be a welcome consolidation and growth of the existing hinterland, but of no less interest to new groups who may be looking to Genstat from elsewhere.

Genstat 6.1 - beyond the swamp

Genstat 6.1 output window, input log, scattergram showing thumbwheel style 'dolly' controls, and data sheets.

Apart from live use on the swamp pollution project, the past six months have seen Genstat shadowing my own software on a wider range of tasks than its predecessor; the reasons for that mostly come, one way or another, under the heading of 'usability'. Usability can and does mean many things - more accessible, more responsive, more power on tap - and all of those things are here.

It will probably grieve all the hard-working people who have made Genstat what it is to hear me say so, but the really striking thing about release 6 is how far the Windows user-interface has come since version 5. The wait may have been long, but the result is stunning. Then it was a bolted-on extra for people who wanted to do a bit of explorative work between serious batch or CLI sessions; now it is a match for anything on the market. Having said that, I must add that developments under the bonnet have been equally impressive - just not so immediately visible.

One prominent productivity move, important in the hurly-burly of life outside traditional Genstat enclaves, is better interaction with general-purpose spreadsheets - particularly Excel. A new wizard walks you through the import of data from Excel, a process which is, in any case, friendlier than it used to be. It's also refreshing to find that direct read-access, though not write-back, is available from Quattro Pro files right up to the current version 10. There is a customisable spreadsheet toolbar, and the active sheet can be accessed from any window. The worksheet in any such package is obviously not the same thing as a generic spreadsheet, and cannot (nor should it) emulate one beyond a certain point. Nevertheless, Genstat now minimises differences which are not functionally necessary, and thus greatly facilitates use. Conversion of data types between the two environments is handled well, with nice attention to detail; from a column label modifier which defines dates, for instance, to provision for 'stacking' data from multiple Excel columns to a single Genstat variable. Some aspects go beyond Excel; column editing (a more central issue in data analysis software than in its generic cousins) is more flexible and efficient, while copy and paste are extended.
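That 'stacking' operation is the familiar wide-to-long reshape: several parallel columns collapse into one value variable, plus a factor recording which column each value came from. Purely as an illustration of the concept (this is pandas, not Genstat, and the column names are invented):

```python
# Wide-to-long "stacking": three replicate columns become a single
# value variable plus a factor identifying the source column.
# Illustrative pandas, not Genstat; all names here are invented.
import pandas as pd

wide = pd.DataFrame({
    "plot":  [1, 2, 3],
    "rep_a": [5.1, 4.8, 5.3],
    "rep_b": [5.0, 4.9, 5.2],
    "rep_c": [5.2, 4.7, 5.4],
})

# Each of the nine replicate values gets its own row, tagged by
# plot number and by the replicate column it came from.
long = wide.melt(id_vars="plot", var_name="rep", value_name="yield_t")

if __name__ == "__main__":
    print(long)
```

The long layout is the one most analysis procedures (Genstat's included) actually want: one observation per row, with grouping carried as a factor rather than as column position.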

Unlike a spreadsheet, or many data analysis products for that matter, the sheet is not given a default size; it is created dynamically to accommodate the data, with a resulting economy of memory. Other tricks have been successfully learned from the generic sheet, though: conversion of data type to text, for instance, is automatic if a text item is typed into a cell of an empty variable.

File types other than spreadsheets can be read directly: Minitab, Statistica, Stata 7, Windows bitmap and sound files. SAS transport and ArcGIS files can be written as well. Beyond this, of course, transfer is possible (subject to the row limit) to or from any package that reads or writes Excel files.

I know I've said nothing about the actual working gubbins. There are extensive changes here at the sharp end, too: additions, revisions, enhancements and developments. A set of new calculators; more sophisticated matrix handling. Better and easier access to tree handling and other multivariate methods, with direct links from their menus to graphic options. Multiple experiments and repeated measurements augmented. Extension of the regression menus. New functions, and new options and arguments on old ones. And yet ... all this, too, is really another aspect of the new usability. Genstat has always been a rich language in which you were never constrained by the crude procedure count; now you can often do things more easily, more elegantly, more quickly, more directly and with less hassle. With this release, Genstat moves firmly out of the academic hills into the open plains of the marketplace - well equipped to cut it with whatever comparisons it may meet there.
