In a way, it's unsurprising that the world's reliance on mathematics has rocketed over the last decade. Throughout history, there has been a growing use and sophistication of calculations underlying advances in technology and science. What's unprecedented now is the tremendous rate of increase.

The rise of technical computing is not only a consequence of the greater use of calculations but also a major cause: it brings higher mathematics and computing to all technical people, regardless of their specialist field or knowledge of manual calculating techniques. It's for this reason that the injection of technical computing into traditionally less quantitative technical fields - such as finance and the biological and social sciences - has brought about the greatest changes. These changes rely as much on the in-built intelligence of computing systems to determine how calculations should be performed as on the systems' raw computational power.

**Operations vs algorithms**

Mathematica pioneered the idea that commands should specify the operation to be performed rather than the algorithm used to compute it. For instance, users need a single command to 'solve a differential equation', not one command to 'apply the Runge-Kutta method' and another to 'apply the Adams method'. Specifying the operation leaves the system free to pick the best algorithm based on the details of the problem, sometimes even optimising by switching algorithms mid-calculation.
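The principle can be sketched in a few lines of Python (an illustrative toy, not Mathematica's actual machinery): a hypothetical `solve` function names the operation, finding a root on an interval, and internally chooses between Newton's method and bisection depending on what information the problem supplies.

```python
from typing import Callable, Optional

def bisect(f: Callable[[float], float], lo: float, hi: float,
           tol: float = 1e-12) -> float:
    """Robust but slow: halve a sign-changing bracket until it is tiny."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

def newton(f: Callable[[float], float], df: Callable[[float], float],
           x0: float, tol: float = 1e-12, max_iter: int = 50) -> float:
    """Fast when a derivative is available and the starting guess is good."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton's method did not converge")

def solve(f: Callable[[float], float], lo: float, hi: float,
          df: Optional[Callable[[float], float]] = None) -> float:
    """The user names the operation ('solve'); the system picks the algorithm.

    If a derivative is supplied, try the fast method first and fall back
    to the robust one -- a miniature version of automatic algorithm
    selection and switching.
    """
    if df is not None:
        try:
            root = newton(f, df, (lo + hi) / 2)
            if lo <= root <= hi:
                return root
        except (RuntimeError, ZeroDivisionError):
            pass  # fall back to the robust bracketing method
    return bisect(f, lo, hi)
```

Calling `solve(lambda x: x * x - 2, 0.0, 2.0)` finds the square root of two by bisection; supplying `df=lambda x: 2 * x` lets the same call converge by Newton's method instead, with no change to how the problem is stated.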

At Wolfram Research, we've worked hard to improve this 'intelligence': automating algorithm selection and switching, precision control and error checking. Mathematica is dramatically more 'intelligent' than it was a decade ago. This matters all the more because the range of techniques available, and the extent of specialisation, have grown greatly over this period. Needing to be an expert in the computational details of everything you'd like to calculate has become increasingly limiting. And if users have to specify algorithms manually, the consistency and reliability of results suffer in practice.

**Computers do make mistakes**

As the world becomes ever more dependent on calculations, the accuracy of those calculations, and the verification of their correctness, become ever more critical. Aside from choosing an inappropriate algorithm, one of the most common sources of error is the way computers normally handle numbers, using so-called double-precision arithmetic. Most people are dangerously unaware of the limitations this kind of arithmetic imposes, with occasionally dire consequences. An alternative is arbitrary-precision arithmetic: Mathematica's approach not only allows calculations to any degree of precision but also, uniquely, tracks how uncertainties propagate through to the answer.
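The danger is easy to demonstrate. Here is a small illustration in Python (not Mathematica) using the standard library's `decimal` module for arbitrary-precision arithmetic; this is a simpler scheme than Mathematica's, which also tracks precision automatically, but it shows the contrast:

```python
from decimal import Decimal, getcontext

# Double precision: 0.1 has no exact binary representation, so error
# creeps in after even a handful of operations.
print(0.1 + 0.2 == 0.3)          # False
print(sum([0.1] * 10) == 1.0)    # False

# Arbitrary-precision decimal arithmetic avoids this class of error.
getcontext().prec = 50           # work with 50 significant digits
tenth = Decimal(1) / Decimal(10)
print(sum([tenth] * 10) == 1)    # True
```

Both `False` results surprise most people the first time they see them, which is precisely the point: the errors are silent, and in long chains of calculation they can compound unnoticed.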

Symbolic calculations also have an increasing role. Before computers, most serious analysis was symbolic. Early computers and languages made massive advances in numerical calculations, and most applied work was subsequently performed numerically. Symbolic calculations can now be so adeptly handled by computers that they are again highly valuable in improving understanding, analysis, accuracy, verification and even the overall time for a computation. But these benefits only fully accrue if the symbolic and numeric computations are highly integrated so that they can share the same programs and problem specifications. It's no good bolting on a symbolic engine as an afterthought, or adding advanced numerics into a structure designed just for algebra.
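One small illustration of that payoff (again in Python rather than Mathematica): evaluating 1 - cos(x) directly for tiny x cancels catastrophically in double precision, while the symbolically equivalent rewriting 2·sin²(x/2) keeps its accuracy. A symbolic step improves a numeric result.

```python
import math

x = 1e-8

# Naive evaluation: cos(1e-8) rounds to exactly 1.0 in double
# precision, so the subtraction destroys every significant digit.
naive = 1 - math.cos(x)            # catastrophic cancellation: 0.0

# The symbolically equivalent form has no subtraction of nearly
# equal quantities, so it retains full accuracy (about 5e-17).
stable = 2 * math.sin(x / 2) ** 2

print(naive, stable)
```

The rewriting here is trivial enough to do by hand, but it stands in for what an integrated symbolic-numeric system can do routinely and automatically across a whole computation.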

**Educate and communicate**

Of course, not everything has advanced so well in the last decade - in particular, technical education. Fewer students are choosing to take technical subjects, yet more and more jobs need increasingly advanced technical skills.

For most jobs, what's needed is experience at transforming problems into a mathematical form, deciding what kind of answer is needed, getting a technical computing system to do the calculation, and interpreting the result practically. Perversely, much of maths education - certainly at school - is still primarily about manually doing the calculating step: the one part of the process that a computer can do better, usually much better. That this will eventually change, with technical software fully integrated into maths education, is inevitable. When and how are harder questions, but hopefully discernible progress will have been made in another decade. (For a fuller discussion of technical computing and education see http://www.wolfram.com/solutions/highered/usformat.pdf)

More certain is progress in communicating technical ideas. Traditional papers focus just on results, with insufficient information to retrace the author's steps. They are being replaced by structured electronic documents containing programs, problem specifications and interactive graphics as well as text. With these, the 'reader' can rerun the calculation, change parameters, see how this affects a visualisation of the result, and so on. Using this new methodology, results can be applied directly to new situations, improving the efficiency, accuracy and immediacy of their use.

Mathematica notebooks have provided this functionality for over a decade, and are now being married to XML-based standards for publishing on the web. Technical communication has lagged behind other web uses because there was no standard for encapsulating mathematical notation and meaning; in particular no standard contained sufficient information to enable technical computing systems to calculate from a web document. MathML - closely related to Mathematica's typesetting system - has largely solved this problem. At Wolfram Research, we've been trying to improve the technical publication process with the release of Wolfram Publicon. With Publicon, authors can produce articles ready for entry into automated journal submission processes.

**Getting the future in gear**

I expect the next decade in technical computing to be about automation for ease of use, further distinguishing the roles of specifying the problem and performing the computation. This automation will manifest itself in a range of ways, such as smart plotting and intelligent understanding of input. Alongside this will come the development of interfaces that optimise access for different levels of use and deploy the power to different devices.

Think of the development of car technology as a parallel. No-one would expect to have to understand the mechanics of a modern car in order to drive it. Automation has brought tremendous benefits in the ease of driving, reliability, efficiency, safety and performance of cars. And so it should be with technical computing systems. Solving your problem should not require understanding all the settings needed to make it happen. Achieving that does require meticulous design for usability, sophisticated engineering under the surface, and users' confidence in the robustness of the system. This continues to be Wolfram Research's focus.

*Conrad Wolfram is director of strategic development at Wolfram Research Inc.*