The revealed grace of the mechanism: computing after Babbage

Ray Girvan traces the rise of the machines - both digital and analogue - in an era of mechanical computation

The dissident Soviet historian Roy Medvedev observed of the course of history that we can learn as much about our destination from the road by which we arrived (and even from the turnings that we didn't take) as from our present position. In the 1990 'cybernetic Victorian novel' The Difference Engine, William Gibson and Bruce Sterling memorably explored one such untaken turning. Their vision of an alternative 1855 showed a Britain in which Charles Babbage had successfully promoted his Analytical Engine, leading to a computer revolution a century early.

But was this so clear-cut as a turning missed? The failure of Babbage's ideas in their time can lead to a mistaken impression of a 70-year hiatus in computing between Babbage's death and the post-1940s rise of electronic computers. In reality, only the general-purpose programmable digital computer went on hold. This article aims to be a sampler of the many styles of digital and analogue machine inhabiting a thriving era of mechanical computation.

The last quarter of the 19th century was especially rich in calculating innovations. By this time, four-function manual digital calculators were already well-established in the form of the cumbersome (but commercially successful) Thomas Arithmometers, developed by Charles Xavier Thomas de Colmar from a Leibniz 'stepped-drum' design. In the 1870s, though, Willgodt Odhner and Frank Baldwin independently patented 'pinwheel' calculators. Smaller and more efficient, these came to dominate the scientific calculator market - in part due to Odhner's astute marketing of his patent rights - and continued to do so, in largely unchanged form, for nearly a century.

Such calculators were fine for arithmetic. But, as computer scientist Allan Bromley argues in the 1990 book Computing Before Computers, they were inefficient for operations beyond addition and subtraction. On a pinwheel machine, for example, multiplication and division needed multiple crank turns and carriage shifts. Square root algorithms existed but were even more tedious (in practice, first-order approximations were used). Analogue devices, even if less accurate, provided scientifically useful functions, quickly. Slide rules helped, having reached more or less their modern form by the 1850s, but more complex analogue machines soon formed the cutting edge of Victorian mathematical physics: calculus computers.
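Bromley's point can be made concrete with a short sketch (Python, purely illustrative; the function and the digit-by-digit procedure are my own reconstruction, not a description of any particular machine). On a pinwheel calculator, multiplication reduces to repeated addition - one crank turn per unit of each multiplier digit - with a carriage shift between digits:

```python
def pinwheel_multiply(multiplicand, multiplier):
    """Multiply as a pinwheel operator would: for each digit of the
    multiplier (working from the units upward), turn the crank that
    many times, then shift the carriage one decimal place."""
    accumulator = 0
    shift = 0   # carriage position, i.e. current power of ten
    cranks = 0  # total crank turns the operator performs
    for digit in reversed(str(multiplier)):
        for _ in range(int(digit)):
            accumulator += multiplicand * 10 ** shift  # one crank
            cranks += 1
        shift += 1  # one carriage shift per multiplier digit
    return accumulator, cranks

result, cranks = pinwheel_multiply(374, 268)
print(result, cranks)  # → 100232 16
```

Sixteen crank turns plus two carriage shifts for a three-digit multiplier shows why operators valued anything that shortened the process.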

  • Small cogs in small machines. Left to right: pinwheel machine count mechanism; Brunsviga and Curta; Sandia MEMS gears and microchain (courtesy Sandia National Laboratories, www.mems.sandia.gov); simulated fullerene nanogears (courtesy NAS - NASA Advanced Supercomputing Division, Ames Research Center).

In 1814, J.A. Hermann invented the planimeter, a device for measuring the area enclosed by a plane curve. Various improvements, notably Amsler's polar planimeter (1856), led to offshoots such as the Integraph, independently devised by Abdank-Abakanoviez and Boys, which could integrate any Cartesian curve plot - that is, mechanically trace the solution of dy/dx = F(x). But probably the most significant use of the planimeter principle was in the wheel-and-disk integrator, where a tracking wheel outputs the integral of a function input as the rotation of a disk.
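What a wheel-and-disk integrator computes can be sketched numerically (an illustrative analogue of the mechanism, not a model of any specific instrument): the tracking wheel rides on the disk at a radius set by the input function, so its accumulated rotation is the running integral.

```python
import math

def wheel_and_disk(F, x0, x1, steps=100_000):
    """Accumulate y = ∫F(x) dx as the integrator does: for each small
    advance dx of the disk, the tracking wheel turns by F(x) * dx."""
    dx = (x1 - x0) / steps
    y = 0.0
    for i in range(steps):
        x_mid = x0 + (i + 0.5) * dx  # wheel sits at radius F(x_mid)
        y += F(x_mid) * dx
    return y

print(round(wheel_and_disk(math.cos, 0.0, math.pi / 2), 6))  # → 1.0
```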

Exploring possibilities
William Thomson (a.k.a. Lord Kelvin) was a prime mover in the exploration of the possibilities. Using a ball-and-disk integrator invented by his brother, James Thomson, Kelvin's 1886 Harmonic Analyser linked an array of such integrators to extract Fourier coefficients from waveform data. (Kelvin saw that such integrator arrays could solve differential equations, but was unable to overcome the problem of torque loss between stages.) The Analyser provided data for another analogue machine, the Harmonic Synthesiser, whose practical application was to compute tide tables. Once the Fourier coefficients had been computed from measured data for a port, the process could be reversed. The Synthesiser contained a row of rotating wheels representing the amplitude and phase of the harmonics of solar and lunar components, a wire wrapped over the whole array producing the total.
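The Synthesiser's arithmetic is simple to state: the predicted tide is a sum of cosine terms, one per wheel. A hedged sketch in Python follows - the constituent speeds for M2 and S2 are standard astronomical values, but the amplitudes, phases and function name are invented for illustration:

```python
import math

def tide_height(t_hours, constituents, mean_level=0.0):
    """Each wheel contributes amplitude * cos(speed * t + phase);
    the wire over the array sums the contributions."""
    h = mean_level
    for amplitude, speed_deg_hr, phase_deg in constituents:
        h += amplitude * math.cos(math.radians(speed_deg_hr * t_hours + phase_deg))
    return h

# Illustrative amplitudes and phases for the two largest constituents:
# M2 (principal lunar, 28.984 deg/hour) and S2 (principal solar, 30 deg/hour).
port = [(1.2, 28.984, 0.0), (0.4, 30.0, 45.0)]
for t in range(0, 25, 6):
    print(t, round(tide_height(t, port, mean_level=2.0), 2))
```

A real machine summed dozens of such constituents, with the wheel ratios set once per port from the analysed observations.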

Kelvin also invented a mechanism for solving simultaneous equations using the moments of tilting metal plates hanging on pulleys. Other significant names of the late 1800s include Leonardo Torres Quevedo, a Spanish physicist and prolific inventor who built several rotary analogue machines for solving real and complex roots of polynomials; and Michelson and Stratton. Their own Harmonic Analyser performed Fourier analysis, but using an array of 80 springs rather than Kelvin integrators. This work led to the mathematical understanding of the 'Gibbs phenomenon' of overshoot in Fourier representation near discontinuities.

A new century
In the early 1900s, Difference Engines looked even more obsolete. A variety of late 19th century inventions supplied the calculation needs of businesses: pinwheel and a few other calculator architectures (now increasingly electromechanical); the comptometer; manual 'slide adders' (a.k.a. troncets); and Hollerith punched-card machines (the roots of IBM in the USA, and ICL in the UK). The punched-card systems, initially for sorting only, quickly evolved into computing systems using multi-pass algorithms equally applicable to science. For instance, Columbia University's 1930s showpiece project on lunar orbital calculation used customised IBM bookkeeping kit, and the Cambridge Mathematical Laboratory later applied ICL punched-card equipment to crystallography.

Alongside this, technical computing continued in the Victorian pattern of inspired (and sometimes idiosyncratic) special-purpose machines. One of the most remarkable - not a scientific application but certainly a mathematical one - was the highly successful Automatic Totalisator, developed in Australia by Sir George Julius. This used a huge digital shaft-adder system to manage racetrack betting ticket sales and odds calculation. The first of many went into service in 1913, and Doron Swade of the London Science Museum has described it in New Scientist as 'the earliest on-line, real-time, data processing and computation system that [our] curators have identified so far'. History might have looked very different if Julius had applied his concepts outside this specialist field. Another interesting digital computer was built in 1926 by Derrick Lehmer of the University of California. His 'number sieve' used 19 bicycle chains of different periodicities that rotated to seek number-theoretical conditions for factorisation and solution of Diophantine equations.
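Lehmer's chains implemented what would now be called a congruence sieve. A toy reconstruction in Python (the moduli, target number and function names are my own choices): for Fermat-style factorisation of N we want x with x² - N a perfect square, so each 'chain' of period m passes only those x whose residue makes x² - N a quadratic residue mod m.

```python
import math

def quadratic_residues(m):
    """The possible values of a square mod m."""
    return {(i * i) % m for i in range(m)}

def fermat_conditions(N, moduli=(3, 5, 7, 8)):
    """For each modulus m, the residues x mod m that leave
    x*x - N a quadratic residue mod m (else no square is possible)."""
    return {m: {x for x in range(m)
                if (x * x - N) % m in quadratic_residues(m)}
            for m in moduli}

def chain_sieve(conditions, start, limit):
    """Emulate the bicycle chains: each chain of period m has 'open
    links' at its allowed residues; x passes when every chain is open."""
    return [x for x in range(start, limit)
            if all(x % m in allowed for m, allowed in conditions.items())]

N = 1081  # a toy target, = 23 * 47
for x in chain_sieve(fermat_conditions(N), math.isqrt(N) + 1, 100):
    y2 = x * x - N
    y = math.isqrt(y2)
    if y * y == y2:
        print(N, "=", x - y, "*", x + y)  # → 1081 = 23 * 47
        break
```

The sieve only rejects impossible candidates quickly; the survivors still need checking, which is exactly how Lehmer's machine was used.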

More major developments centred on analogue computing, particularly building on Kelvin's work. Doodson and Légé refined the Harmonic Synthesiser to a 42-wheel version that became the mainstay of British tidal prediction, a Liverpool one remaining in service until 1962. In the 1930s at the Massachusetts Institute of Technology (MIT), John Wilbur built a Kelvin tilting-plate solver for nine simultaneous linear algebraic equations, and Vannevar Bush came up with the technical fix, the torque amplifier, necessary to link Kelvin integrators into a single system that could drive other devices. The result was his Differential Analyser, a general-purpose analogue machine for solving ordinary differential equations - although that generality involved physically reconfiguring the mechanical connections for each problem. Bush's later Rockefeller machine featured relay-based connections between components and punched-tape input, to remove the need for manual reconfiguration. It weighed around 100 tonnes, and had an accuracy of around 0.01 per cent on ODE problems in areas such as atomic structure, electrical transients, and ballistics.
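A Differential Analyser solved an ODE by feeding integrator outputs back as inputs. A minimal numerical sketch of that wiring (my own illustration, using simple Euler steps rather than any real machine's behaviour), set up for y'' = -y:

```python
import math

def analyser_sho(steps=100_000, t_end=2 * math.pi):
    """Two integrators in feedback, 'wired' for y'' = -y:
    integrator 1 accumulates y from y'; integrator 2 accumulates
    y' from -y. Initial setting y = 1, y' = 0 gives a cosine."""
    dt = t_end / steps
    y, y_prime = 1.0, 0.0
    for _ in range(steps):
        y, y_prime = y + y_prime * dt, y_prime - y * dt
    return y, y_prime

y, y_prime = analyser_sho()
print(round(y, 3), round(y_prime, 3))  # → 1.0 0.0
```

After one full period the outputs return to their starting values; on the mechanical machine, 'reprogramming' meant physically reconnecting the shafts between such integrators.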

Building on prototypes
Having seen Bush's earlier Analyser, Professor Douglas Hartree built a Meccano prototype at Manchester University. After he demonstrated that it could solve atomic theory problems, Metropolitan-Vickers built the full-scale Manchester Differential Analyser. At least three other establishments had them prior to WW2 (Cambridge University, Queen's University Belfast, and RAE Farnborough) and ultimately around 14 British machines were built, though none as advanced as the Rockefeller (many were still built from Meccano).

Integrators also formed the heart of another historically significant line: the military analogue computer. After simple beginnings such as Arthur Pollen's 'Argo Clock' for WW1 battleship target tracking, this technology reached its ultimate development in the work of American scientists Ford and Newell, whose blocky and highly robust computers solved in real time the ballistics equations for naval and other artillery control during WW2 and later. Generally, WW2 forced a convergence of civilian and military expertise in mechanical analogue computing; but at this point the foundations of modern digital computing were also being laid.

Machines such as the Enigma-cracking 'bombes' (invented in 1938 by Polish mathematician Marian Rejewski and later enhanced by Alan Turing), Konrad Zuse's Z series and Howard Aiken's Harvard Mark I were a short-lived last fling for the electromechanical approach before electronic computing became the norm. Despite this, mechanical analogue computers didn't disappear overnight. Differential Analysers were still prestigious technology, one featuring in George Pal's 1951 science fiction film When Worlds Collide as the 'DA' that performed orbital calculations. Even the Meccano Analysers did useful post-war work. For instance, one went to New Zealand in 1950 for research into radio, river silting, and rabbit migration. The London Science Museum has another analogue machine, the MAC (Mechanical Analogue Computer) developed in 1958 by Air Trainers Link Ltd, which also made analogue flight simulators. The MAC could deal with 4th order differential equations, and was used for teaching at Imperial College.

Against this background of 'big' computing, smaller devices such as slide rules and desktop calculators remained important in the scientific workplace until the 1970s. Pinwheel calculators - the Swedish Facit, the German Brunsviga, and many other brands - continued in service. Later Brunsvigas, apart from the normal versions, came in double- and triple-register forms for scientific calculations with coordinates and complex numbers. Hand-cranked calculators persisted to the end, with sophistications such as a 'back-transfer mechanism' that copied the output back to the input register. The final, and most elegant, flowering of this technology was Curt Herzstark's Curta, which returned to the stepped-drum concept in a modified and miniature form. This pocket-sized 'coffee-grinder' was especially popular with car rallyists, but also had a following among scientists and engineers.

A mechanical future?
In the 21st century, it's unlikely we'll ever see full-sized Analytical Engines in action outside a museum. However, many special-purpose mechanical and electromechanical computers are still made, typically for devices where a physical interface and robustness are needed. These include some types of petrol pump meter, keypad security lock and car fuel injection system. On a smaller scale, MEMS (microelectromechanical systems) components are widespread as microscopic sensors and actuators; for instance, accelerometers used in car anti-skid systems and to trigger airbags. But the technology is encroaching on computer peripherals such as STMicroelectronics' L6671 accelerometer chip for stabilising hard drive heads, and near-future application to actual computation looks certain.

Some of the 'micromachines' produced by Sandia, a US Department of Energy national security lab, already have computational functions. The Recodable Locking Device, announced in 1998 as 'the world's smallest combination lock', is a MEMS hardware firewall for safety-critical applications such as weapon safing devices. Its microscopic gears, about 300 microns across and made from polycrystalline silicon by integrated circuit etching techniques, could equally be used in more complex machines. Other architectures such as relay-based logic are equally possible; Professor Kris Pister and doctoral student Ezekiel Kruglick at the University of California, Berkeley, have constructed MEMS complementary inverters and NOR gates, and propose applications in hot or radioactive environments where CMOS chips break down.

MEMS grades into the even smaller world of nanocomputing. Even at atomic scale, some structures are rigid enough to act as components in mechanical nanocomputers: a potential means to exceed the approaching limit of Moore's Law for conventional circuits. At least with storage media, beta trials are already under way: Nanochip's MARE (molecular array read/write engine) and IBM's Millipede projects both involve memory chips with an array of MEMS heads using scanning tunnelling effects to read and write atomic 'bumps' on a thin-film medium. Complete nanocomputers are as yet conceptual. Some proponents such as K. Eric Drexler and Ralph Merkle envisage atomic-scale geared mechanisms, using fullerene (i.e. carbon) tubes with teeth added using a benzyne reaction. In simulation, such gears are stable and mesh successfully. But it's hard to judge if problems of manipulation ('sticky fingers' and 'fat fingers') raised by critics such as Richard Smalley would make them impossible to construct.

Miniature Babbage Engines aren't the only proposed design, however. Professor Ehud Shapiro of the Computer Science Department at the Weizmann Institute of Science recently revealed the simulated prototype of a DNA-powered nanocomputer designed to process organic molecules it encounters as it crawls along a polymer tape; a literal implementation of the conceptual Turing Machine. Abacus-like devices using rotaxanes (molecules with movable rings sliding on thin 'handles') and sliding-rod mechanisms using rigid linear carbyne molecules have also been suggested.

This brings us full circle. Mechanical sliding rods performed the logic of Konrad Zuse's Z1 binary computer and of the recently reconstructed ternary computer demonstrated in 1842 by Thomas Fowler, a Torrington inventor and contemporary of Babbage. Whatever the future of computing, it seems that the insights of a mechanical past will continue to provide fresh ideas for new technologies.


Further reading
The Mechanical Analog Computers of Hannibal Ford and William Newell by A. Ben Clymer

Computing Before Computers edited by William Aspray

A portable analogue computer

The slide rule, until the late 20th century the archetypal calculator for scientists and engineers, has its roots in Scottish mathematician John Napier's invention of logarithms in 1614. In 1620, Edmund Gunter devised a calculation method using dividers on a single log scale, but invention of the slide rule proper is generally credited to the Surrey rector and mathematician William Oughtred. Although he ended up in bitter dispute with his ex-pupil Richard Delamain over their near-simultaneous invention of the circular rule in 1630, Oughtred's 1632 work Circles of Proportion and the Horizontal Instrument first described calculation using two adjacent Gunter scales.
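The principle behind those adjacent scales is that multiplication reduces to adding lengths proportional to logarithms. A small sketch (the function name and the three-figure read-off are my own illustration of how a Mannheim-type rule is used, not anyone's published method):

```python
import math

def slide_rule_multiply(a, b, sig_figs=3):
    """Multiply by adding log-scale lengths, then read the antilog
    back to roughly the precision an operator could see."""
    length = math.log10(a) + math.log10(b)  # slide one scale along the other
    exact = 10 ** length
    # round to the rule's readable number of significant figures
    scale = 10 ** (math.floor(math.log10(exact)) - sig_figs + 1)
    return round(exact / scale) * scale

print(slide_rule_multiply(2.34, 5.67))  # → 13.3 (true product 13.2678)
```

The deliberate rounding mimics the real limitation discussed below: a standard 10-inch rule could rarely be read to better than three significant figures.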

The next two centuries saw innovations such as square, cube and log-log scales, and the slide rule more or less reached its modern form by 1850, when the French academic and artillery officer Amédée Mannheim designed the now-familiar rule with A, B, C, D scales and cursor. This further evolved into the double-sided 'duplex', the cube-scale 'polyphase', and custom rules with extra scales for, say, electrical engineering. Specialist rules, not necessarily based on log scales, catered for every eventuality: telegraph tariffs, sewer management and, in the Cold War era, estimating nuclear blast yield.

One important area of development was to lengthen the scale beyond the Mannheim rule's three-digit precision while retaining portability. Though less popular than straight rules, circular rules provided some space saving (an eight-foot scale fits on a three-foot disc) and this idea led to elegant sealed 'pocket watch' designs, which reached their acme in the French Calculigraphe and the Manchester-made Fowler Magnum, a 4.5 ft multi-scale model. More radical variants that appeared in the 1880s were Edwin Thacher's Calculating Instrument and Professor Fuller's Calculator. Both were two-foot cylindrical rules achieving a precision of about five digits, the Thacher by multiple straight scales totalling 30 feet, the Fuller by a single helical scale 41 feet long. A less cumbersome 20th century rule, the Otis King, gave one of the best compromises between size and precision: its 66-inch* helical scale, giving four-digit precision, fits on a six-inch* telescopic cylinder.

Although the market for general slide rules collapsed rapidly when electronic calculators arrived in the mid-1970s, the specialist market is thriving. One established British maker, Blundell Harling, now produces promotional medical reckoners, slide charts and data wheels; and MH Mear & Company of Huddersfield makes a range of industrial and technical rules (for instance, to calculate turbulent or streamlined pipe flow). Slide computers are also still widely used in maritime and aerial navigation, where they have the advantage of functioning without electrical power. Even general-purpose rules, however, aren't entirely extinct. Concise in Japan still makes several conventional circular rules.

Naturally, the slide rule's heyday is long over. But with a strong specialist niche, an enthusiastic collectors' circuit, and a new generation to whom slide rules are a retro 'geek accessory', this convenient little analogue computer very much lives on.

*Editor's note: There was a typographical error in the original print version of this article which erroneously gave the dimensions as 66-feet and six-feet respectively. These have been corrected here.

