Easing the way to fuels paradise
Using a novel combination of maths and text software, Felix Grant has made it easier to explore the uncertainties of alternative energy policy
In my principal job of scientific consultant, I have come to realise that the viability of such 'freelance' work relies heavily upon the principle which novelist Graham Swift sums up as 'make a little do for a lot' - in other words, 'synergy'. It doesn't always pan out, of course. Sometimes I have on hand several brand new commissions with no overlap whatsoever, which is hard work. Most of the time, though, there will be opportunities to build on previous work. Then there are the rare occasions when everything dovetails perfectly - which is what happened with a recent set of assignments. All of them wanted analyses of energy issues for internal publication and discussion. There were several single-focus studies of fuel cell, solar or wind power competitiveness, and there was a collaborative experiment by a charity and a civic authority to explore ways of facilitating an open debate on energy options for the near-future.
This is an emotive area, and passions run high. We live in times when public debate is open, and public perceptions can have a much greater impact on the future of a technological development than they once did. Impartially informing that debate is not always easy, requiring not only conscientious balance but clear presentation - along with all the evidence.
The arguments are both complex and equivocal. The technologies and economics are in rapid flux; the available evidence is a mix of past experience (whose applicability declines sharply with time) and modelled futures (based upon necessarily untestable assumptions), with very little in between. Statistical analysis is needed for the former, mathematical modelling for the latter. Every argued case comes down to how those future assumptions are justified - often in terms of the past data.
Depending on one's viewpoint, nuclear power can be the salvation of the biosphere from heat-death, or a fast route to the dystopic futures of Judge Dredd or Riddley Walker. Fuel cells may be the answer to our petroleum fears or the successor to them. Wind power used to be the golden path for environmentalists everywhere, but it has acquired a new face as the despoiler of beautiful landscape and murderer of endangered bird species. Similarly dramatic dichotomies afflict solar, geothermal and wave-power sources.
The underlying common currency of the debate is energy economy, which boils down to the 'total cost of ownership' in terms of energy input per unit of energy output. This is a fiendishly difficult measure to pin down; any attempt to be informative in this area must necessarily inform about processes, stochastic ranges, and uncertainty. There are mechanistic elements, which can be predicted to some extent, but it is also significantly dependent upon socio-political trends, unknown states of imminent technology, and so on.
A given wind turbine may produce an output of p megawatts - p is currently about 3 and, while it will almost certainly be more tomorrow, the trends are reasonably clear in the short term. Against that must be set the quantity q of energy consumed in manufacture and maintenance - that's more difficult. Where will the energy for q come from (presumably sources other than wind power, at least in the immediate future)? What will be the impact on that cost of growing wind power competition? What is the energy cost of transporting a rotor arm bigger than the wingspan of a jumbo jet from point of manufacture to point of use? Rotors are getting bigger all the time (regularly rupturing design limits of not so long ago) and there is a demand for taller towers, permitting bird flightpaths to pass underneath; what will be the combined energy cost implications of increasing shaft height? Over what length of time, as new plant begins to require replacement, can q be amortised? How will the energy costs m of maintenance, increasing with age, compare with costs r of replacement? None of this can be easily extrapolated from past experience, and we haven't begun to examine other costs or benefits such as impacts on tourism, wind reduction over agricultural land, and so on.
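The bookkeeping behind those questions can be sketched in a few lines. The figures below are entirely hypothetical (the 3 MW rating is the only number taken from the text; capacity factor, embodied energy, maintenance growth and service life are illustrative assumptions), and the point is the structure of the calculation, not its result:

```python
# Illustrative sketch of the energy-economy bookkeeping described above:
# output p, embodied energy q, annual maintenance m rising with age,
# amortised over an assumed service life. All figures are hypothetical.

def net_energy_ratio(p_mw, capacity_factor, q_mwh, m0_mwh, m_growth, years):
    """Lifetime energy output divided by lifetime energy input."""
    hours_per_year = 8760
    output = p_mw * capacity_factor * hours_per_year * years
    # Maintenance energy cost grows geometrically with turbine age.
    maintenance = sum(m0_mwh * (1 + m_growth) ** y for y in range(years))
    return output / (q_mwh + maintenance)

# A hypothetical 3 MW turbine: 30% capacity factor, 6000 MWh embodied
# energy, 50 MWh/yr maintenance growing 5% a year, 20-year life.
ratio = net_energy_ratio(3, 0.30, 6000, 50, 0.05, 20)
```

Every input to that function is exactly the kind of contested assumption the debate turns on; the value of making the model explicit is that a reader can vary any of them and watch the ratio respond.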
Some patterns are discernible, though not indisputable and usually socio-political. The level of approvals for wind power development in the UK in recent years has followed a curve which appears to be perfectly cubic - but whether it will continue that way, or change with a shift in public priorities, is material for endless dispute. Total world electricity consumption, currently on the order of 10¹⁰ megawatt hours per annum, is also on an upward curve. Perfectly good technologies exist for voluntarily reining back consumption, but continuation toward some future Malthusian ceiling seems more likely.
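Fitting and extrapolating such a trend takes only a few lines, and makes the disputable step explicit. The approvals series below is invented (an exactly cubic sequence, purely for illustration); what the sketch shows is that the extrapolation itself is nothing more than an assumption that the fitted curve continues:

```python
import numpy as np

# Hypothetical illustration of fitting a cubic trend to annual approval
# counts, of the kind described for UK wind-power approvals. The data
# here are invented and exactly cubic, purely for demonstration.
years = np.arange(1996, 2005)           # nine hypothetical years
t = years - years[0]                    # years since start of series
approvals = 2 + 0.5 * t**3              # invented approval counts
coeffs = np.polyfit(t, approvals, 3)    # least-squares cubic fit

# Extrapolating one year ahead assumes the trend simply continues -
# precisely the assumption the debate is about.
next_year = np.polyval(coeffs, t[-1] + 1)
```

The fit is mechanical; whether `next_year` means anything depends entirely on whether public priorities hold steady, which no polynomial can tell you.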
All of this is necessary if debate is to move beyond the bipolar. Communicating technical or scientific information to an incompletely defined audience is always a balancing act. Mathematics is often essential for accurate, concise, or complete communication, but can also be a barrier to any communication at all. In a case such as this, where the specific brief is to make information available to everyone within the client's organisation, the tensions are multiplied considerably. One end of the audience spectrum wants, and is entitled to expect, content in plain natural language. The other end wants to access, and interact with, detailed and sophisticated mathematical content at a high level. Hardest of all is to cater flexibly for the range of expectation and background between those limits. As I discovered recently when researching the use of mathematics software by non-mathematicians, however, there is a lot more that can be done than is often recognised.
About 15 years ago, there was a computerised rôle play called Balance of the Planet (Crawford, 1990), well ahead of its time, which pointed the way for this sort of user-centred information giving. The user was placed in the rôle of world dictator, running a series of five-year development plans within a game scenario. Simplified mathematical models for a range of macro-environmental mechanisms were implemented, the inputs being presented to the dictator as scalar slider controls. The sliders could be varied at will, and decisions reversed or reconsidered in the light of brief advisory panels provided within the game, until eventually committed by the dictator. The game then moved forward five years and the outputs from the interlinked models were presented as a report on the five-year plan. Successive five-year plans led to either success or failure in maintaining the balance of the planet within viable limits. With a background in war-gaming, I was working for a large agency which gave me access to huge (by the standards of that time) mainframe analytical futurology resources, but Balance did something new in psychological and pedagogic terms which changed my view of the world. (A SourceForge project, launched last year, is remaking Balance under the GNU General Public Licence.)
It would be ideal if one solution could meet every need, but that is a distant dream. A more modest approach, through progressive layering of delivery, seems to work well. The structure of help files in packages such as Maple and Mathematica provided the first steps: broad responses refined through successive panes, with collapsible outlines hiding or revealing levels of detail, under the reader's control at each stage. Even this is not enough: the base level of delivery has to be a straight, linear narrative if it is to be truly accessible and un-intimidating. The solution I've arrived at is provision of four fundamental access streams: two textual and two mathematical. Between these streams lies a connective tissue of reference material (static or animated graphics being the main component). Implementing all of this represents an instructive audit of technical and analytic software capabilities. The truest incarnation of this structure will be intranet-based, but it must be capable of at least partial translation into PDF (Adobe's Portable Document Format) and paper subsets.
The textual streams are both discursive (the reader can simply move through a linear sequence of text frames analogous to the pages of a book) and structured (the same frames can be accessed, Wolfram style, from a hierarchic set of interlinked menus). The menu system can be invoked at any time from within any frame, or dismissed to pursue the linear route. From within either route, one (or both) of the mathematical streams can be accessed on demand.
The mathematical streams are based on two different products, both offering authorship and readership tools. I started with just one, based on Wolfram notebooks, but knew that they could not be the whole answer. Ultimately, the reader with 'rusty' knowledge or a willingness to learn was being asked to jump too great a gap between the text and mathematics streams. After some experimentation, and pilot trials with the client, the answer seems to be a product intended for educational courseware use. Like Mathematica, LiveMath Maker encourages the development of collapsible and expandable outlines in mixed text and mathematical notation. It adds mathphobe-friendly features such as bubble help, and presents a less austere appearance. There is sufficient overlap in psychoperceptual assumptions to allow transition between the two. A user gaining sufficient confidence from the LiveMath stream to tackle an expansion in the Wolfram one will find that most of the skills acquired can be carried across and adapted easily.
A montage of some steps through the debate site taken by a pilot user with minimal mathematical background, using LiveMath Maker to examine some of the assumptions behind the site's models. The window at top left, offering brief text headers, expands to show (bottom left) the definitions in use for solar power models. On the right, a preconstructed deformation model is altered (in an equation obscured here) to observe the resulting effects. At bottom edge centre is the LiveMath Maker tool palette.
Within the streams, the extent of use will depend on software provision. As an expository system, either stream meets the client's requirements through free reader software. Both are capable of much more, however; content is live in both formats and, while the material in store is protected against change, users with access to copies of the originating software can play with the onscreen variables and equations to test the effect and likelihood of (for example) variation from stated assumptions.
The connective tissue between the streams is being compiled in a catholic range of other software. Any competent statistical package can do the analysis involved, but each brings its own particular strengths and viewpoints to bear. Future modelling is usually (but not always - chaotically sensitive outcomes can be better illustrated through statistical analysis of repeated runs) the preserve of the mathematical products. I'm receiving valuable help from people on the cutting edge of several alternative energy technologies, and much of their data came with testable analyses preconstructed in GenStat. An efficient communication device is the animated graphic, usually a histogram or distribution curve, demonstrating the ways in which likely outcomes respond to variation in assumptions. These can be generated from separate step frames in a generic graphics application such as AnimationShop, and more easily still when the process starts within an analytic package such as Statistica. Statsoft also offers an online HTML statistics textbook.
Once the structure is in place, the content can grow with time, not only 'top down' but from the moderated contributions of exploratory users, on the Wikipedia model. The value which can be extracted from it will still depend upon the willingness of its consumers to invest (mainly in terms of commitment) in the software to interrogate it to maximum effect. All of it can be accessed without any such investment, using relevant web browser plug-ins, but interaction is enhanced with full software applications. LiveMath Reader offers an impressive level of interaction, but the LiveMath Maker generating application takes readers much further into the potential of the material for 'what if' and 'suppose that' investigations. LiveMath Maker has the added advantage of remarkably low purchase cost, and as shallow a learning curve as any mathematical application can be expected to achieve. At its most compact level of collapse, material can be read almost as plain narrative outline; as the outline is opened up (by clicking on the outline markers), increasing layers of explanation are revealed, including graphs and, if required, the settings to generate (or change) them. Ten minutes or so is enough to get going, for anyone familiar with use of a computer for other purposes.
Moving up a level in both cost and capability, Wolfram's notebook format offers the advantage of being accessible through a number of applications - from Publicon which does no active mathematical work but provides publication quality handling of extracted material, up to Mathematica 5.1 which chews on industrial strength symbolic manipulation. Preparation of the Wolfram stream material so far has been in Mathematica, with subsequent refinement in Publicon. In between these, and most likely to appeal to the bulk of the intended audience, comes the CalculationCenter line which has been renamed Mathematica CalcCenter (MCC) and given a significant power boost in the third release.
A more mathematically confident pilot user of the debate site explores the behaviour of a matrix model in my late beta copy of Mathematica CalcCenter, using the web style palette structure in the left hand frame.
MCC sees further development of the friendly interface, including tighter integration with the Windows environment (MCC runs in MacOS X as well), and at the same time now becomes a true 'Mathematica lite'. It has the numeric vector and array handling to permit extensive interactive modelling over a realistic range of variables, exact fraction support, and increased execution speed to support them. Once again, 10 minutes' play is enough for a new user to become productive. In the present situation, newly acquired input/output format handling enables use of data which would not have been available to CalculationCenter 2, and better XHTML export (including cascading style sheet generation) permits generation of supplementary material in the same way as, though at an appropriately higher level than, LiveMath. A few details are still being finalised in the late beta I'm using at the moment (the market release will be out by the time this comes to print), but it's already clear that MCC significantly extends its hold on the market layer below full Mathematica.
The client has run extensive tests of the basic system on a pilot user group, with automatic logging of interactions by user, and it seems to work well. While some started by simply reading through the basic linear text, all were drawn into the menued stream, and into at least the LiveMath one as well. A majority also ventured into the Wolfram-powered stream, and it wasn't only the scientists or technicians who stayed there - almost two-thirds of users with no mathematical training continued to refer to this stream for expansion of material in the menued text or the connective tissue. Equally interesting is that every user, including those with a mathematics or physics-based first degree, spent a significant amount of interaction time inside the LiveMath stream. The log shows that the latter group used the LiveMath stream for a minimum of 8 per cent of their time, mainly as a ready source of quick entry clarification and expanded explanation on entering areas unfamiliar to them.