Cloud computing powers the profitability of wind
Wind power is a staple renewable energy source that is often described as relying on mature technology. In many ways, this assumption is correct as wind farms pop up across the globe in increasingly diverse and remote locations. But a lot of innovation is going on behind the scenes.
Many difficulties remain in the way we build and deploy wind turbines, raising doubts over the profitability of such systems. Turbines have grown physically larger, which makes systems more complicated, increases costs, and adds significant financial risk to projects that can cost hundreds of millions of pounds to complete.
But, while wind power does rely on mature technology, it has not reached an evolutionary stalemate. The tools powering the wind power industry are seeing incremental changes and embracing different methodologies to beat the challenges of reaching profitability and working in some of the most demanding and remote locations on the planet.
Some technologies, such as the remote sensing technology Lidar, could remove the need for traditional meteorological masts completely. Other groups are investigating new methods and pre-existing third-party tools to improve existing met mast technology and optimise cost savings. Or could a hybrid of old and new technologies be the way forward for the wind power industry?
Focusing on Lidar
Lidar illuminates a target with an eye-safe laser and analyses the reflected light using optical components and processors to measure the speed of the wind. These devices have many practical advantages over traditional met masts, but their use has been restricted in the past, partly due to their cost (relative to typical onshore met masts) and partly due to their slow acceptance within parts of the financing community, according to Graham Gow, head of product development at renewable energy consultancy Natural Power.
Gow added: ‘In the past five years or so, we have seen both of these issues addressed. Add in the fact that turbines are getting bigger and moving offshore, both of which mean that associated met masts get ever more expensive – thus making remote sensing devices even more attractive.’
Lidar provides remote wind measurements from ground level to plan for, enhance, or even control wind turbines so they can ‘see’ the wind before it arrives. The technology removes the need for tall mast structures and their associated cup anemometers for measuring wind speed. Masts can be expensive, often require planning permission, and can be tricky to install and maintain in remote locations. Lidar represents quite a step change from the incumbent met mast technology: it is cheaper and quicker to install, easily movable after or even during projects, and avoids the planning and safety considerations associated with traditional masts.
Lidar uses an eye-safe, non-visible, continuous wave laser beam at a range of user-defined heights to intersect the particles naturally found in the atmosphere, such as pollen and dust. The returning, changed signal is evaluated for its Doppler shift to calculate the wind speed. This can be deployed autonomously for years at a time, gathering this vital wind data to help plan for a successful wind farm, or optimise the performance of an operational wind turbine.
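The Doppler relation behind this measurement can be sketched in a few lines. This is an illustrative calculation only, not ZephIR's actual processing code, and the 1.55-micrometre wavelength is an assumed value typical of eye-safe infrared lasers:

```python
def wind_speed_from_doppler(doppler_shift_hz, wavelength_m=1.55e-6):
    """Line-of-sight wind speed from a measured Doppler shift.

    Light backscattered from a particle moving at speed v along the beam
    is shifted in frequency by f_d = 2 * v / wavelength, so the speed is
    recovered as v = f_d * wavelength / 2.
    """
    return doppler_shift_hz * wavelength_m / 2.0

# A shift of about 12.9 MHz at 1.55 um corresponds to roughly 10 m/s
# along the beam direction.
print(round(wind_speed_from_doppler(12.9e6), 2))
```

In a real instrument, scans at several beam angles and heights are combined to reconstruct the full horizontal wind vector from such line-of-sight components.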
Future-proofing the software behind these systems is a complex task as the sensors are expected to work in remote locations for long periods of time without anyone checking on them. Jon Cage, head of software at remote wind measurement systems company ZephIR Lidar, told Scientific Computing World: ‘As a result, testing, simulation, and increasingly, test driven development takes focus. It is not enough to put some code down, check it works in the lab and ship it, as might have happened in the earlier days of computing on smaller scale projects or systems.’
Increases in computing power have enabled modular object-oriented code to take this strain and become more of a standard approach to enable, for example, unit testing to allow developers to run tests aimed at new features or bug fixes on their workstations. Cage added: ‘Whilst cloud computing and larger systems are available at increasingly competitive prices, freely available continuous integration software can be run on modest servers or workstations so that each and every change made to our algorithms is run against all of the tests and simulations that were written before, ensuring no issues have been introduced to the wider system whilst enabling new features and enhancements.’
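The kind of automated regression test Cage describes can be sketched with Python's standard unittest module. The ten-minute averaging function below is a hypothetical stand-in for a real lidar processing algorithm; the point is that every change to it is checked against the tests written before:

```python
import unittest

def ten_minute_mean(samples):
    """Average a block of wind-speed samples, skipping invalid (None) readings."""
    valid = [s for s in samples if s is not None]
    if not valid:
        raise ValueError("no valid samples in averaging period")
    return sum(valid) / len(valid)

class TestTenMinuteMean(unittest.TestCase):
    def test_plain_average(self):
        self.assertAlmostEqual(ten_minute_mean([4.0, 6.0]), 5.0)

    def test_invalid_samples_are_skipped(self):
        # A dropped reading must not skew the average.
        self.assertAlmostEqual(ten_minute_mean([4.0, None, 6.0]), 5.0)

    def test_all_invalid_raises(self):
        with self.assertRaises(ValueError):
            ten_minute_mean([None, None])
```

A continuous integration server would run suites like this (via `python -m unittest`, for example) on every commit, flagging any change that breaks previously verified behaviour.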
Cloud computing is another tool that is helping the wind power sector up its game, as David Standingford, lead technologist at the Centre for Modelling and Simulation (CFMS), said: ‘We are seeing movement from most engineering companies to cloud computing as the economics are better and the security of the cloud has improved. Additionally, massive amounts of resources are available cheaply and bandwidth has increased to match the needs of the wind power simulation space.’
‘In five to 10 years, remote cloud resources will be increasingly utilised within the wind power industry as the technology matures and the cost savings and security of cloud systems become more apparent. Cloud resources allow engineers remote access, as well as greater agility and the ability to partner with other engineers on a variety of projects,’ Standingford added.
This hunger for more computational clout is taking hold across the wind power technology space and one of the biggest challenges is optimising the modelling and simulation software to match these increasingly computer intensive methods. This has led to two possible methodologies, according to Gow: one where progressively more complex engineering principles guide the simulation; and one where you give the system all of your data in one go and see what pops out at the other end.
For the first ‘computer intensive approach’, Gow added: ‘A good example would be our gradual move from static flow modelling to dynamic flow modelling to numerical weather models to combined computational fluid dynamics and numerical weather prediction models.’
The other ‘big data approach’ can be completely blind, like an aeroplane black box system, where you chuck all the operational data at a network and see what it discovers. Gow added: ‘The biggest issue generally is concerned with our ability to record more and more data, but being less and less able to make genuine use of it. The solution lies in our ability to bring engineering expertise together with our mathematical modelling abilities to bring genuine value added to our processes.’
Hardy hardware and software
Whether a computer intensive or big data approach is used, computational power needs to increase to match the engineers’ increasingly demanding simulation and modelling requirements. This can be achieved with modern-day workstations, as Cage explained: ‘As the power of desktop machines reaches the levels of what would have previously been considered ‘high-performance’ in yesteryear, and the tools we use evolve in orders of magnitude of efficiency, the option to run more testing and analysis on our ‘workstations’ becomes an increasingly viable option.’
It’s not just the hardware that has made this possible, the software has also upped its game, according to Cage: ‘Ever-improving efficiencies in higher level languages such as Python and accompanying scientific libraries are bringing the power of what was once the domain of complicated (and often expensive) mathematical modelling tools such as Matlab closer to the hands of engineers and scientists.’
‘The increased platform support for those languages means that algorithms can be developed faster in more human-readable code (which is easier to debug and maintain) and deployed direct to embedded systems. Systems within ZephIR that would previously have required careful optimisation on dedicated hardware such as FPGAs (Field-Programmable Gate Arrays) are increasingly being written at higher levels and being run on more generic CPUs,’ Cage added.
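As a purely illustrative example of this shift, consider a simple exponential smoothing filter of the kind that might once have been hand-optimised for dedicated DSP or FPGA hardware, written here in a few lines of readable, easily testable Python (the smoothing parameter is a hypothetical value):

```python
def smooth(samples, alpha=0.2):
    """Exponentially weighted moving average of a wind-speed time series.

    Each output blends the new sample with the running state:
    state = alpha * sample + (1 - alpha) * state.
    """
    filtered = []
    state = samples[0]
    for s in samples:
        state = alpha * s + (1.0 - alpha) * state
        filtered.append(state)
    return filtered

# A brief gust at 14 m/s in a 10 m/s stream is damped by the filter.
print(smooth([10.0, 10.0, 14.0, 10.0]))
```

Written this way, the algorithm can be unit-tested on a workstation and then deployed to an embedded system running a generic CPU, exactly the workflow Cage describes.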
The move to third-party web-based solutions has also had further repercussions for engineers, as Gow explained: ‘We use less and less desktop-based tools, and more and more web-based (or generically client-server) tools, so the computers that engineers use are gradually being loaded less severely. This trend also enables us to provide laptops to engineers as their primary computer, thus giving our staff increased flexibility with regard to where and when they do their work.’
Teaching an old dog new tricks
The technology of traditional wind turbines is also evolving in terms of the design and efficiency of offshore wind farms. There are a variety of interesting projects beginning to explore this space and one such project is the Simulated Wake Effects Platform for Turbines (SWEPT2) project, which aims to improve the way that turbine wakes are modelled and the efficiency of offshore wind farms.
The Centre for Modelling and Simulation (CFMS), a Bristol-based simulation and modelling specialist, will work to establish the viability of GPU-based fluid dynamics modelling, which is a faster, cheaper and more scalable alternative to traditional software solutions.
This means that the offshore wind farm industry will be able to design larger turbines as well as larger arrays of turbines, and better predict wind farm failure. Such improvements will lead to reduced financing costs and allow lower carbon emissions, better designed wind farm layouts and cheaper environmentally friendly electricity.
Such projects are part of the wider picture when it comes to improving the existing wind mast technology and its profitability, as Standingford explained: ‘GPUs are highly parallel extremes of a trend towards many-core CPUs, where massive processing power is available at lower clock speeds (and hence lower energy) – allowing cost savings for large-scale simulation.’
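Why flow solvers map so well to GPUs can be seen in a Jacobi-style stencil update, sketched below in plain Python (this is an illustration of the parallel pattern, not CFMS's actual solver): each grid cell depends only on its neighbours' previous values, so every cell can be updated simultaneously by its own GPU thread.

```python
def jacobi_step(grid):
    """One relaxation sweep over a 2D grid.

    Each interior cell takes the average of its four neighbours' old
    values. Because no cell reads another cell's *new* value, all
    updates are independent and can run in parallel, one thread per
    cell on a GPU.
    """
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]  # boundary values are left unchanged
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j]
                                + grid[i][j - 1] + grid[i][j + 1])
    return new

field = [[1.0, 1.0, 1.0],
         [1.0, 0.0, 1.0],
         [1.0, 1.0, 1.0]]
print(jacobi_step(field))
```

On a wake-modelling grid with millions of cells, this per-cell independence is what lets many-core hardware deliver the cost and energy savings Standingford describes.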
Closing the gap
The wind power sector faces many challenges around the cost and scalability of its software. More powerful, more flexible, more scalable and higher-fidelity simulations are all clearly needed.
There is no one correct way to proceed and all the technologies currently being used or investigated will play a part, as Standingford told Scientific Computing World: ‘A combination of Lidar, or other such remote sensing technologies, with pre-existing and new modelling and simulation tools present a good opportunity for the wind power sector. A fusion of the two could improve profitability.’
‘Lidar could, for example, be used to tune modelling parameters for the design of future turbine arrays, or leverage continual mast monitoring to provide updates to maintenance schedules or control strategies,’ he added.
This move to results in real time or, at the very least, minimal development cycles is highly achievable, as Cage explained: ‘As computing power increases, higher level languages advance and device connectivity improves, I think the gap between modelling and deployed solutions will continue to decrease. At the same time, the level and complexity of testing will increase leading to even more capable and reliable systems. You might find yourself coding new improvements or changes to algorithms and models and see the results appearing in another display in real time, reducing the development cycle to the barest minimum.’
It is unclear what the final physical system will look like within the wind power industry. No one can predict whether this hardware will involve a met mast system, Lidar system, hybrid between the two or some, as yet, completely novel technical incarnation.
It is clear that the software running behind the scenes will play a big part in the future of the wind power industry. Real-time views of the health and functionality of a wind power system will be demanded by customers. The software will be both the enabler and the looking glass through which the industry and their customers will see and judge these investments.