If there’s one thing that can make or break our time off, it’s the weather. We certainly seem to have a lot of it, and it can seem like the forecast never brings us what we want. Weather and the climate are complicated topics, so here’s just a taste of the ways we’re helping to make forecasts more accurate, and to learn more about climate change and its impacts.
Clouds form from tiny droplets of water that condense around dust and other particles (aerosols) in the atmosphere. The relationship between cloud formation and our climate is complex and not well understood: according to the IPCC, aerosol particles and their influence on clouds remain the largest uncertainty in our understanding of human-induced climate change. Scientists working on the CLOUD experiment at the Proton Synchrotron at CERN have been investigating how atmospheric amines, vapours closely related to ammonia, can combine with sulfuric acid to form aerosol particles at rates similar to those observed in the atmosphere. Their work will improve our understanding of cloud formation, and help to inform future IPCC reports. The main source of amines in the atmosphere is human activity – primarily animal husbandry – but as amine scrubbing is likely to become the dominant technology for capturing carbon dioxide from power plants, amine emissions are expected to increase.
Researchers in the Indian Ocean noticed that clouds drifting downwind of cities reflected light differently. Air pollution may affect how reflective clouds are by enclosing water droplets in an oily layer, preventing them from growing into rain droplets or shrinking and evaporating.
However, everything released into the atmosphere is eventually oxidised. The atmosphere acts like a cool fire, and the pollutant coatings (called surfactant films) are short-lived: they react with ozone in the atmosphere, freeing the cloud droplet to expand or contract according to local conditions.
A 6-micron droplet levitating in an optical trap formed by two infra-red laser beams (shown in red) focussed through microscope lenses. The droplet is so small that it is visible only by the light scattered from its surface, seen as a white dot in the centre of the sample cell.
A team from Royal Holloway, University of London, working with CLF scientists, has been investigating the rates of the chemical reactions that remove these surfactant films – chemistry that plays an important role in the atmosphere. To do so they used complementary studies at the CLF and the ISIS Neutron and Muon Source.
Technology developed by a collaboration between the CLF, Diamond Light Source and STFC Technology allows lasers to be used like tweezers, holding and manipulating micron-sized particles with a focused laser beam. The team used this laser trap to hold a water droplet at the focus of the beam, allowing the surface chemistry of the droplet to be analysed using spectroscopy techniques.
They exposed droplets doped with a small amount of pollutant material to ozone, modelling atmospheric reaction processes and measuring changes in their surface chemistry during the reaction. Neutron investigations allowed the team to measure the rate of loss of the surfactant film by monitoring the reflectivity of the droplet as the reaction proceeded.
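To see how such a measurement can yield a rate, here is a minimal illustrative sketch (not the team's actual analysis code, and the numbers are synthetic): if the film is lost by a pseudo-first-order reaction with ozone, its surface coverage decays exponentially, and the rate constant can be recovered from a straight-line fit of the logarithm of the coverage against time.

```python
import math

# Hypothetical pseudo-first-order loss of a surfactant film:
# coverage(t) = coverage(0) * exp(-k * t), where k depends on the
# ozone concentration. Monitoring reflectivity over time gives a
# signal proportional to the remaining film, so k can be extracted.

def decay_rate(times, coverages):
    """Estimate k (per second) from a least-squares fit of ln(coverage) vs t."""
    logs = [math.log(c) for c in coverages]
    n = len(times)
    t_mean = sum(times) / n
    l_mean = sum(logs) / n
    slope = (sum((t - t_mean) * (l - l_mean) for t, l in zip(times, logs))
             / sum((t - t_mean) ** 2 for t in times))
    return -slope  # k is minus the slope of ln(coverage) against t

# Synthetic example data generated with k = 0.02 per second:
times = [0, 50, 100, 150, 200]
coverages = [math.exp(-0.02 * t) for t in times]
k = decay_rate(times, coverages)
print(f"fitted k = {k:.3f} per second")  # prints "fitted k = 0.020 per second"
```

Real data would be noisy and the reflectivity-to-coverage conversion more involved, but the underlying kinetic idea is the same.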
Determination of these properties provides a greater understanding of cloud droplet behaviour in the atmosphere, and its impact on the climate.
At the UK Solar System Data Centre at the Rutherford Appleton Laboratory, Matthew Wild is digitising materials held in the archives, allowing the UK research community to take advantage of existing data (a project funded by NERC). Between 1903 and 1942 the Cambridge Solar Observatory took a daily image of the solar disc, each of which was stored on a glass plate. From 1957 until 1991, data from the UK ionospheric monitoring programme was stored on 35 mm film. Digitisation of both these resources secures and improves access to a valuable environmental data source. These historical records of solar activity help us to understand the likelihood and severity of solar events that can disrupt modern technologies here on Earth – a phenomenon now known as space weather.
Electrical power infrastructure is particularly vulnerable to space weather effects, and of critical importance to modern economies and societies. A space weather event in March 1989 caused the failure of Quebec’s power grid, which went from normal operation to complete shutdown in 90 seconds. Five million people were left without electricity in the nine hours it took to restore operations, and businesses across Quebec were disrupted. The costs incurred were estimated to be over C$2 billion, including C$13 million of direct damage to the grid. Power systems were also affected elsewhere in the world, with permanent damage to a $12 million transformer in New York and major damage to large transformers in the UK. Since then the power industry has been working to limit the effects of space weather events, but a better understanding of how and why they occur will allow more effective protections to be put in place, and more accurate space weather forecasts.
A £4.6 million investment by BIS is being used to make the UK one of only a few countries that have the capability to forecast space weather. In 2014, the Met Office launched a new space weather forecasting service, using data from both ground-based and satellite instrumentation. Near real-time observations of the solar surface and atmosphere detect active regions that could become the source of large events. Earth’s atmosphere is also monitored, to detect changes related to solar wind variations and the short-term impacts of solar eruptions.
It’s not just the energy industry that will benefit from space weather forecasts. All Global Positioning System (GPS) signals are vulnerable to space weather, with potential impacts on aviation and other transport industries. Communications, pipelines and the mining industry may also be affected, as may any business reliant on modern technology, including the finance sector.
Map of Sea Areas and Coastal Weather Stations referred to in the Shipping Forecast.
(Credit: Emoscopes via Wikimedia Commons)
Weather affects every aspect of modern life, from transport and agriculture to energy use and leisure. The winter flooding of 2013/14 is estimated to have caused £426 million of flood damage; the summer flooding of 2007 affected 50,000 homes and led to insurance payouts of £3 billion, with 25% of claims made by businesses.
The UK Met Office uses an IBM supercomputer capable of performing more than 100 trillion calculations per second to create 3000 tailored forecasts every day. These forecasts are delivered to a huge range of customers, including the Government and armed forces, the NHS and businesses, providing information on how the weather might affect things as diverse as hospital admissions, traffic conditions, and military operations. Forecasts save money and lives.
The software used to provide these forecasts relies on a Unified Model (UM) of the climate, parts of which are nearly 25 years old. The Met Office’s UM is also used by other national weather services, including those of Australia, New Zealand, Norway, India, South Africa and South Korea. The Met Office is now collaborating with STFC’s Hartree Centre and NERC to produce a next-generation weather and climate model, a project code-named Gung-Ho (the original Chinese meaning of which is “working harmoniously together”).
The new model will be able to make use of the ever-increasing power of supercomputers, which are expected to reach the exascale (millions of processors able to perform a million trillion calculations per second) by the end of the decade. It will give far more accurate forecasts at much higher resolution – down to individual towns or roads – and maintain the UK’s leadership in environmental prediction.
The winter of 2013/14 was the wettest in 250 years for the UK. Most of us would prefer not to relive it, but climate researchers from the University of Oxford did just that, several thousand times over. They were involved in a citizen science project called weather@home, which relies on the donated computing power of thousands of PCs to run two models: the first based on the current climate reality, the second on what the climate would be like if human-induced climate change didn’t exist. The aim of the project was to determine what influence climate change had on our rotten winter weather.
Experiments that attempt to link climate change to particular extreme weather events are called attribution studies. Because they’re looking at rare events, the models have to be run many thousands of times to deliver a statistically robust result. This work could be done with a supercomputer, but weather@home relies on distributed computing, using thousands of computers volunteered by members of the public. Participants in the project can get involved in cutting-edge climate research – all they need is a computer.
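To illustrate how the two ensembles are compared (a minimal sketch, not the weather@home code, with invented toy numbers), attribution results are often summarised as the fraction of attributable risk: FAR = 1 − p_nat / p_act, where p_act is the probability of exceeding an extreme threshold in the actual-climate ensemble and p_nat is the same probability in the counterfactual "natural" ensemble.

```python
# Fraction of Attributable Risk from two model ensembles.
# Each ensemble member is one simulated season (e.g. total rainfall in mm).

def fraction_attributable_risk(actual_runs, natural_runs, threshold):
    """FAR for the event 'value exceeds threshold' in the two ensembles."""
    p_act = sum(r > threshold for r in actual_runs) / len(actual_runs)
    p_nat = sum(r > threshold for r in natural_runs) / len(natural_runs)
    if p_act == 0:
        raise ValueError("event never occurs in the actual-climate ensemble")
    return 1 - p_nat / p_act

# Toy ensembles of 10 runs each (real studies use many thousands):
actual = [500, 620, 710, 680, 590, 730, 640, 560, 700, 750]
natural = [480, 520, 610, 550, 500, 640, 530, 470, 600, 580]
far = fraction_attributable_risk(actual, natural, threshold=600)
print(f"FAR = {far:.2f}")  # prints "FAR = 0.71"
```

A FAR of 0.71 would mean roughly 71% of the risk of such an event is attributable to the difference between the two climates; this is why ensembles must be large, since both probabilities are estimated by counting rare exceedances.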
Weather@home is the latest experiment to be run by the climateprediction.net team, which is part of the RCUK e-Science programme. Launched in November 2010 with support from the Guardian newspaper, it uses a regional climate model (previous climateprediction.net models were all global) and can look specifically at UK weather events. Climateprediction.net itself was launched in 2003, and by 2005 had already published its first results in Nature: the initial experiment used more than 90,000 PCs and revealed that a doubling of pre-industrial atmospheric carbon dioxide levels could lead to more than double the temperature rise originally predicted. This year the team is investigating the drought in the western United States.
Most extreme weather events take place on a scale that global climate models can’t show. The weather@home system is a family of regional climate models that allow scientists across the world to gauge how climate change is affecting weather locally. Having this information to hand will help us to anticipate what extreme weather events may occur, to plan for the future and to reduce the lives lost and costs incurred by unexpectedly bad weather.