
Material values

Early in the film Up in the Air[1], characters played by George Clooney and Vera Farmiga start a casual relationship whose emptiness is emphasised in a pointedly tragicomic scene. Picking up one of Clooney’s airline loyalty cards, Farmiga asks ‘what is that, carbon fibre? ... I love the weight!’ and follows up shortly thereafter with ‘pretty sexy’. It’s a whole new way of looking at materials science.

Carbon fibre, now so ubiquitous that it can be wasted on Clooney’s loyalty card, was of course a fairly early stage in the down-scale migration of materials science. What was considered daringly tiny when I learned my chemistry is now referred to as ‘bulk scale’, the nanoscale has become everyday, and quantum effects are the new hunting ground.

At the same time, however, the quest for the small has not replaced the need for the large or medium-sized view of things. Carbon fibre first emerged nearly 50 years ago as an industrially viable patent from the Royal Aircraft Establishment at Farnborough, to see immediate application in Rolls-Royce aero engines. As Steinhauser and Hiermaier[2] put it: ‘Some of the most fascinating problems in all fields of science involve multiple temporal or spatial scales. Many processes occurring at a certain scale govern the behaviour of the system across several (usually larger) scales.’ Fitzgerald et al[3] comment more pragmatically, in a survey of cross-scale computational methods, that ‘methods are now available that are capable of modelling hundreds of thousands of atoms, and the results can have a significant impact on real-world engineering ... the methods of molecular modelling are being used to solve engineering problems, despite the fact that they typically operate on comparatively short length and time scales.’ We now design and study materials at the submolecular level, but we still apply them at all stages up to the macroscale. The properties of those materials must bridge that divide – as must the simulation and analysis development loops which produce them.

Analyses, whether on test results or as part of in silico experiment, occur on a range of platforms from the highly specialised bespoke to the generic – the latter increasingly predominating. A quick dip into the literature on medical application of nanomaterials turns up, for example, a multi-level tendon deformation study, the effect of gamma irradiation sterilisation on polycaprolactone scaffolding and self-assembling aggregates as an antigen marker, all in SigmaPlot, plus a study of joint replacement wear prediction in its stablemate TableCurve3D. Moving from medicine to archaeology, for a change, their elder sibling Systat examines the formative structure of thousand-year-old ceramic pots. Golden Software’s Surfer, usually associated with geology, crops up in more than one nanofilm distortion study for its ability to rapidly visualise large and complex data sets. Wolfram Mathematica is often the enabling environment behind the virtual modelling and experimentation phases that precede physical development, from military aerospace design in the USAF’s Wright-Patterson research laboratory through thin films to cosmetics.

Computation volume in materials science is high, even by modern data analytic standards. Developing strategies for computation across multiscale ranges is highly management-intensive, and as much attention must go to keeping track of analyses as to the analyses themselves.

Another illustration of the analysis volume problem is an important strand in materials building, down at the molecular level: detailed control of polymerisation to grow specific and repeatable structures for applications from lubricants through clay-based nanocomposites to biomedical elastomers. Reversible Addition-Fragmentation chain Transfer (RAFT) processes, developed by Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO), are at the forefront of developments here.

CSIRO’s solution, both in the case of RAFT and also more widely, is the E-Workbook research data management system from IDBS – more specifically, its extended ChemBook variant. E-Workbook is one of the expanding number of applications (Wolfram’s Workbench, used by many materials modelling groups for Mathematica, is another relevant example) that look to the open source Eclipse Foundation for a flexible environment that encourages extension. Fitzgerald and his coauthors, quoted earlier, consider the computational scalability issue against the background of Accelrys Pipeline Pilot. Either way, the inevitable trend is the same as in many other areas: materials science is inseparable from cheminformatics. Quite apart from the efficiency in analysis itself, automation also eliminates human error and the resource absorption of shifting data or results from one platform to another.

The essence of such automation is to capture best practice in a ‘workflow’ or ‘pipeline’ (the difference lies in how the tasks are handled, with pipelines making more efficient use of resources), which eliminates waste and delay, unifies formats and standards, handles iteration flexibly to reach desired end states without intervention, and delivers results in whatever forms are needed – typically databases, reports and software code structures.
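As a minimal sketch of the idea rather than any vendor’s actual product, the Python fragment below chains a few hypothetical analysis stages into a pipeline: each stage consumes the previous stage’s output, the run is logged, and the whole sequence can be repeated without manual intervention. All stage names and data fields are invented for the example.

```python
# Minimal, hypothetical sketch of a 'pipeline' of analysis steps.
# None of these stages corresponds to a real vendor component; they
# simply show how best practice can be captured as a repeatable chain.

def clean(records):
    """Unify formats: drop incomplete measurements."""
    return [r for r in records if r.get("value") is not None]

def convert_units(records):
    """Standardise units (here, assumed millivolts to volts)."""
    return [{**r, "value": r["value"] / 1000.0} for r in records]

def summarise(records):
    """Deliver results in the form downstream tools expect."""
    values = [r["value"] for r in records]
    return {"n": len(values), "mean": sum(values) / len(values)}

PIPELINE = [clean, convert_units, summarise]

def run(data):
    """Push data through every stage in order, logging each step."""
    for stage in PIPELINE:
        data = stage(data)
        print(f"completed {stage.__name__}")
    return data

if __name__ == "__main__":
    raw = [{"value": 1200.0}, {"value": None}, {"value": 900.0}]
    print(run(raw))   # -> {'n': 2, 'mean': 1.05}
```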

Carbon fibre mesh. (Image: Asma Hassan)

George Fitzgerald, Accelrys’ senior marketing manager for materials, points to the time and efficiency advantages of moving experiments and analyses inside a single virtual management container. A mass of options can be screened much more rapidly in software than in physical experiment, narrowing the field to the top 10 or 20 per cent on which further attention can be focussed.
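Purely as a hedged illustration of that screening step (the ranking function, names and scores below are invented, not Accelrys’ actual method), the sketch ranks a candidate library by a predicted score and keeps only the top fraction for physical follow-up:

```python
# Hypothetical in silico screening: rank candidates by a predicted
# score and keep only the best-scoring fraction for physical follow-up.

def screen(candidates, keep_fraction=0.1):
    """candidates: list of (name, predicted_score) tuples; higher is better."""
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    n_keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:n_keep]

# Invented ten-member library of formulations with predicted scores.
library = [(f"formulation-{i}", score) for i, score in
           enumerate([0.42, 0.91, 0.13, 0.77, 0.58, 0.66, 0.05, 0.83, 0.29, 0.71])]

print(screen(library, keep_fraction=0.2))  # keeps the two best-scoring formulations
```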

Materials science, as shown by some of the illustrations already quoted, doesn’t have to mean constructional materials like carbon fibre, and Fitzgerald offers very different examples.

A collaboration with Mitsubishi to stabilise a lithium electrolyte involved six thousand or so combinatorial library calculations. Informatics automated the process, with Pareto assessments filtering out about a hundred candidates for further examination. L’Oréal, whose materials range from pigments through oils to antioxidants, is merging half a dozen internal databases into Pipeline Pilot, which brings an integral understanding of molecules and substructure searching. Unilever, by integrating data, informatics and analysis, reaped a 90 per cent efficiency gain.
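The Pareto assessment mentioned above is, in essence, multi-objective filtering: a candidate survives only if no other candidate matches or beats it on every criterion at once. A minimal Python sketch, with invented objectives and scores rather than the Mitsubishi study’s actual data, might look like this:

```python
# Hypothetical Pareto filter: a candidate survives only if no other
# candidate is at least as good on every objective and strictly better
# on at least one (here both objectives are 'higher is better').

def dominates(a, b):
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(candidates):
    """candidates: dict of name -> (stability, conductivity) scores."""
    return {name: scores for name, scores in candidates.items()
            if not any(dominates(other, scores)
                       for o_name, other in candidates.items() if o_name != name)}

# Invented scores for four electrolyte candidates.
library = {
    "electrolyte-A": (0.90, 0.40),
    "electrolyte-B": (0.70, 0.70),
    "electrolyte-C": (0.60, 0.60),   # dominated by B, so filtered out
    "electrolyte-D": (0.30, 0.95),
}

print(sorted(pareto_front(library)))   # ['electrolyte-A', 'electrolyte-B', 'electrolyte-D']
```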

In a study searching for platinum catalyst replacements in fuel cells, complexity is multiplied by alloy crystal structures. Finding high-level trends allows whole groups and classes of calculations to be eliminated, and correlating energy level with lattice size gives a first approximation, which in turn minimises experimentation. Similarly, research by BASF, General Motors and the US Department of Energy into the reburning of automotive exhaust hydrocarbons involves thousands of catalytic materials, but information management reduces the field to a manageable size.
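That first approximation amounts to a one-variable correlation: fit a trend of computed energy against lattice parameter, then use the trend to decide which alloys merit a full calculation. The sketch below, with entirely invented numbers and an assumed ‘interesting’ energy window, shows the shape of such a screen in Python:

```python
# Hypothetical first-pass screen: correlate a computed energy with
# lattice parameter, then use the fitted trend to flag which alloys
# are worth a full (expensive) calculation. All numbers are invented.
import numpy as np

lattice = np.array([3.52, 3.61, 3.72, 3.80, 3.89, 3.97])        # angstroms
energy  = np.array([-1.10, -0.95, -0.78, -0.70, -0.55, -0.41])  # eV, illustrative

slope, intercept = np.polyfit(lattice, energy, 1)
r = np.corrcoef(lattice, energy)[0, 1]
print(f"trend: E ~ {slope:.2f}*a + {intercept:.2f}, r = {r:.3f}")

# Flag untested alloys whose predicted energy falls in an assumed target window.
candidates = {"alloy-X": 3.66, "alloy-Y": 3.93}
for name, a in candidates.items():
    predicted = slope * a + intercept
    if -0.9 < predicted < -0.6:
        print(f"{name}: predicted {predicted:.2f} eV -> run full calculation")
```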

An atomistic representation of a polystyrene polymer showing regions that could potentially be abstracted into beads for a mesoscale level calculation. (Image from Fitzgerald et al[3])

Medicine is a major contributor to, and consumer of, materials expertise, across a range from structural rebuilding to drug delivery vectors and bactericides. An example of the latter[4] is the modification of carbon-containing titanium oxide nanoparticles for visible-light activation of bactericidal action at higher rates than hitherto achieved. DNA-mediated self-assembly of components is a growth area (no pun intended), as is regenerative reconstruction, in which tailored structures and surfaces encourage or support the replication of cells in favourable configurations. All of these, as with other fields, require high-volume computation loops to derive optimum results from the range of possibilities.

Computing itself, along with associated electronics, is moving towards the nanoscale as it becomes increasingly feasible to embed or laminate it into other artefacts. Visions of foldable devices, or ones which conform invisibly to complex surfaces, drive research into films with thicknesses below a hundred nanometres, making them around three or four orders of magnitude thinner than the finest breathable membranes currently available for waterproof fabrics. At that degree of attenuation, a lot of layers could be combined in sophisticated ways without sacrificing flexibility; but it’s a trick that needs a lot of computational attention to pull off in practice.

At least one company is running intensive analyses on a high-performance computing array to investigate the feasibility of generating such nanofilms in various degrees of semipermeability for temporary bonding to human skin, and even to more vulnerable surfaces such as the cornea. Possible applications under in silico investigation include carrier substrates for sophisticated biosensor arrays, protection against environmental hazards from rain and industrial pollutants to chemical and biological warfare agents, temporary protection of burns or lacerations, reduced short-term fluid loss in arid conditions, and coded identification displays.

In the upper set, initial monolayer on a gold surface; phenanthroline is stable at 21°C, but not at 37°C; subsequent molecular modelling studies confirm that the monolayer desorbs at 37°C. In the lower set, new monolayer designed by modelling; modelling predicts that adding C60 to the phenanthroline stabilises the monolayer; subsequent experimental work confirms this. (Images from Fitzgerald et al[3])

On more rigid and less sensitive surfaces, nanofilms have an even wider spread of potential utility. One application already out of the development stage is a way of applying complex, layered stencils, resists and insulations for inkjet-printed circuitry or other subtle surface effects.

Going beyond that, there are studies into how other components may usefully and durably be built in between the layers. The primary goal here is to build intelligent effectors that can act and combine in different ways. I was shown a prototype circle of multiple laminate that could ‘swim’ in a primitive sort of way, rather like a manta. To an even more limited extent, it could crawl ashore and move across land to another pool. It contained a combination of biosensors, which enabled it to approach some solute concentrations and avoid others. In defined circumstances, the manta could enfold an object and hold it; it couldn’t then swim, but these are early days. It’s easy to imagine a future in which more sophisticated descendants of this first mobile disk might operate in concert to gather, process and analyse data before collectively acting on the results. They could be very flexible and cheap research teams or pollution cleanup squads, for example.

Equally, food and drink are almost as intensive an area of materials analysis as medicine. In industrially developed societies, criteria for choosing one product over another have moved a long way from simple nutrition-related issues. Manufacturers seeking a market edge for their products often concentrate on subtle variation of æsthetic factors: colour, scent, taste, texture, cohesive integrity or lack of it, behaviour in combination with other foods, and so on. A cornflake that is slightly too yellow or too red, or is not colourful enough, or goes too soft in milk, or just doesn’t smell quite as the consumer thinks a cornflake should smell, pays the price in market share. Control of these attributes, which used to be a hit-or-miss macroscale affair, is now addressed through nanoscale materials manipulation. Since changing one factor may well impact on others, this again raises the spectre of combinatorial data complexity, and solutions such as Pipeline Pilot and Materials Studio are once again not far behind. Accelrys has a white paper[5] on application of its systems to food and beverage companies, including a number of case studies based around chocolate. One of these case studies, in which computerised image analysis is used to monitor what the manufacturer calls ‘mouth feel’, is reproduced here (see box: ‘In search of perfect chocolate’); others include eliminating bloom, odourants and flavourants, and taste versus cost.

So despite the clear differences between chocolate and carbon fibre, the æsthetics of confectionery bring us full circle to George Clooney and his ‘sexy’ loyalty card!

References and Sources

For a full list of sources and references cited in this article, please visit www.scientific-computing.com/features/referencesapr10.php



In search of perfect chocolate

Problem: How do you create a product that has just the right mouth-feel? How could you modify an existing process to change product texture? How would you assure quality control of the new product?

Solution: Data automation with Pipeline Pilot Imaging Collection

It’s well known that the texture, chew, and mouth-feel of food can be altered by the presence of bubbles. While bubbles do not contribute to food value, they do impact customer perception and marketability of a food product. Work by Nestlé and the University of Reading showed the relationship between bubble sizes and sensory ratings.

Chocolates with larger voids were perceived as less hard and less creamy. Smaller bubbles, on the other hand, yield products that are perceived as creamier and harder.

Depending on the target customer, a company can tailor mouth-feel with processes that adjust – among other factors – the bubble sizes. Here software for image analysis plays a key role: nobody wants to count bubbles by hand. Software will be faster, more repeatable, and less error-prone than human analysis of images. It makes it possible to test samples regularly and automatically alert human operators to a problem in real time.
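Purely as a hedged, open-source illustration of the kind of counting such imaging software automates (not the Pipeline Pilot Imaging Collection itself), the sketch below thresholds a tiny synthetic greyscale image and counts connected bright regions as bubbles; the data and threshold value are invented:

```python
# Illustrative bubble count on a tiny synthetic image: threshold the
# greyscale values, then count connected bright regions as 'bubbles'.
# The data and threshold are invented; real systems work on micrographs.
import numpy as np
from scipy import ndimage

image = np.array([
    [0.1, 0.1, 0.8, 0.9, 0.1],
    [0.1, 0.1, 0.9, 0.8, 0.1],
    [0.1, 0.1, 0.1, 0.1, 0.1],
    [0.7, 0.8, 0.1, 0.1, 0.9],
    [0.8, 0.7, 0.1, 0.1, 0.8],
])

mask = image > 0.5                       # assumed brightness threshold
labels, n_bubbles = ndimage.label(mask)  # connected-component labelling
sizes = ndimage.sum(mask, labels, range(1, n_bubbles + 1))

print(f"bubbles found: {n_bubbles}")             # 3 on this toy image
print(f"bubble areas (pixels): {sizes.tolist()}")
```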

R&D data integration delivers the added benefit of correlating all known production factors with any failures – whether in bubble distribution or elsewhere. Trend analysis can identify factors such as particular batches of ingredients or operating conditions that give rise to unacceptable levels of low-quality product. Predictive analytics can quantify the relationship between bubble size and mouth feel, enabling companies to estimate the reaction to new formulations – before they are subjected to consumer testing.
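A hedged sketch of that predictive step, with entirely invented panel scores rather than the Nestlé and Reading data: fit mean bubble size against a creaminess rating, then use the fit to estimate how a new formulation might score before a tasting panel ever sees it.

```python
# Illustrative predictive step: relate mean bubble size to a panel
# 'creaminess' score, then predict the score for a new formulation.
# All figures are invented for the sketch.
import numpy as np

mean_bubble_um = np.array([20, 35, 50, 70, 90])        # microns
creaminess     = np.array([8.1, 7.4, 6.6, 5.9, 5.0])   # panel score out of 10

slope, intercept = np.polyfit(mean_bubble_um, creaminess, 1)

new_formulation_um = 42                                 # assumed measurement
predicted = slope * new_formulation_um + intercept
print(f"predicted creaminess for 42 um bubbles: {predicted:.1f}/10")
```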

The bottom line

Real-time, reliable image analysis of product quality identifies factors that lead to inferior product, well before it is too late to fix them. It also ensures more effective and consistent use of consumer preference models, likely resulting in lower costs and faster time to market. The Accelrys Platform also facilitates real-time integration with current quality control systems, minimising costly errors and unhappy customers.

This same analysis can, of course, be applied to any QA process that requires analysis of unstructured data. This includes, for example, the visual screening of liquids for clarity, or the analysis of analytical instrument data to detect contaminants.
