Data assimilation under dramatic growth of observational data and rapid advances in computer performance

By: Guannan Hu

The importance of data assimilation

Data assimilation (DA) is a technique used to produce initial conditions for numerical weather prediction (NWP). In NWP, computer models describing the evolution of the atmosphere are used to predict future weather from current or previous weather conditions. These models are usually very sensitive to initial conditions: slight changes in the initial conditions can lead to completely different weather forecasts. The Data Assimilation for the Resilient City (DARE) project is investigating the use of novel observations, such as temperature data from vehicles, smartphone data, and river camera images, for weather and flood forecasting. Accurate forecasting of hazardous weather events can help us prepare in advance to protect lives and property and reduce economic losses.

DA is also used to create climate reanalyses, which are gridded datasets providing long-term historical estimates of climate variables covering the globe or a region. These datasets are used to monitor climate change.

The basic idea of data assimilation

Data assimilation blends observations with model forecasts to produce the best estimates of atmospheric and climate variables. For example, the air temperature on campus can be measured by a thermometer or predicted from past temperatures (and other relevant variables such as humidity and wind) using a computer model. We then have two estimates of the air temperature, and we assume that the true temperature lies somewhere in between. It can therefore be given by a weighted average of the two estimates, in which the one with the smaller error has the greater weight, as it is considered more reliable. This is a very simple example; real data assimilation applications are much more complex and involve a huge amount of data.
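As a rough sketch, that weighted average can be written in a couple of lines of Python (the temperatures and error variances here are invented purely for illustration):

```python
# A minimal sketch of the weighted average behind data assimilation
# (illustrative numbers only, not an operational system).
def analysis(background, obs, var_b, var_o):
    """Inverse-variance weighted average: the estimate with the smaller
    error variance receives the larger weight."""
    w = var_b / (var_b + var_o)          # weight given to the observation
    return background + w * (obs - background)

# Model forecast: 16.0 C with error variance 4.0; thermometer: 15.0 C with
# error variance 1.0. The analysis lands nearer the more reliable thermometer.
print(analysis(16.0, 15.0, 4.0, 1.0))    # 15.2
```

If the two error variances were equal, the analysis would sit exactly halfway between the forecast and the observation.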

The assimilation of novel observations

As computers become more powerful and the volume of observational data grows rapidly, data assimilation is becoming increasingly important in improving the skill of weather forecasts. The assimilation of novel observations (e.g., geostationary satellite and radar data) has led to great improvements in forecast skill. Unlike thermometers and other conventional instruments, weather satellites and Doppler radars measure atmospheric variables indirectly, and these observations need to be transformed for use in data assimilation procedures. This transformation introduces so-called representation errors in addition to measurement errors. The observation error (which includes both representation and measurement errors) has been found to be spatially correlated for some observation types, such as geostationary satellite data and Doppler radar radial winds. In practical applications, these error correlations are usually taken into account indirectly in data assimilation systems or removed by thinning the observations. These approaches can be suboptimal, as they prevent us from making full use of the observations.

Accurately estimating observation error correlations for satellite and radar data can be very challenging. Satellite observations can have a mixture of inter-channel and spatial error correlations, while the error correlation lengthscales of Doppler radar radial winds may not be isotropic: they vary with the height of the observations and their distance from the radar. In addition, explicitly including correlated observation error statistics can greatly increase the computational cost of DA. The increase is mainly caused by the inversion of dense matrices and by the parallel communication costs in the computation of matrix-vector products. Another issue is that correlated error statistics may change the convergence behaviour of the minimization procedure in variational data assimilation, which solves a least-squares problem.
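To make the cost concrete, here is a toy construction of my own (not an operational DA code) of a best linear unbiased analysis with a spatially correlated observation error covariance R: the analysis requires factorising or solving a linear system whose matrix is dense precisely because R is.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 500                                  # number of observations
xb, sb2 = 0.0, 1.0                       # scalar background state and its error variance
H = np.ones((p, 1))                      # each observation measures the state directly

# Correlated observation errors: variance 0.5, correlation decaying with separation
dist = np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
R = 0.5 * np.exp(-dist / 20.0)           # dense p x p error covariance
y = 1.0 + np.linalg.cholesky(R) @ rng.standard_normal(p)   # synthetic obs of truth = 1

# Analysis: xa = xb + B H^T (H B H^T + R)^{-1} (y - H xb)
S = sb2 * (H @ H.T) + R                  # innovation covariance: dense because R is
w = np.linalg.solve(S, y - xb)           # the O(p^3) dense solve; with uncorrelated
                                         # errors R is diagonal and cheaper tricks apply
xa = xb + sb2 * float(H.ravel() @ w)
print(f"analysis: {xa:.2f}")             # a weighted estimate of the truth (here 1)
```

With 500 observations this dense solve is instant; at the millions of observations used operationally, it is one of the dominant costs.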

The increasingly wide application of data assimilation

Starting with its use in NWP, DA is now attracting more and more interest from the wider scientific community. People with different backgrounds, from research institutes, universities, and weather services around the world, are not only committed to developing new methods but are also keen to apply this technique to new areas. For instance, DA can be combined with machine learning. It can be applied to space weather forecasting and even used to monitor and predict a pandemic!

 

Posted in Climate, data assimilation

How do we actually run very high resolution climate simulations?

By: Annette Osprey

High resolution modelling

Running very detailed and fine scale (“high resolution”) simulations of the Earth’s atmosphere is vital for understanding changes to the Earth’s climate, particularly extreme events and high-impact weather [1]. However, each simulation is 1) time-consuming to set up – scientists spend a lot of time designing the experiments and perfecting the underlying science, and 2) expensive to run – it may take many months to complete a multi-decade simulation on thousands of CPUs. But the data from each simulation may be used many times for many different purposes.

Under the hood

There is a lot of technical work done “under the hood” to make sure the simulations run as seamlessly and efficiently as possible, and that the results are safely moved to a data archive where they can be made available to others. This is the work that we do in NCAS-CMS (the National Centre for Atmospheric Science’s Computational Modelling Services group), alongside our colleagues at CEDA (the Centre for Environmental Data Analysis) and the UK Met Office. My role is to work with the HRCM (High Resolution Climate Modelling) team, helping scientists to set up and manage these very large-scale simulations.

CMS is responsible for making sure the simulation code, the Met Office Unified Model (UM), runs on the national supercomputer, Archer2, for academic researchers around the UK. As well as building, testing and debugging different versions of the code, we need to install the supporting software that is required to actually run the UM (we call this the “software infrastructure”). This includes code libraries, experiment and workflow management tools [2], and software for processing input and output data. This is all specialist code that we need to configure for our particular systems and the needs of our users, and sometimes we need to supplement this with our own code.

Robust workflows

We call the end-to-end process of running a simulation the “workflow”. This involves 1) setting up the experiment (selecting the code version, scientific settings, and input data), 2) running the simulation on the supercomputer, 3) processing the output data, 4) then archiving the data to the national data centre Jasmin, where we can look at the results and share with other scientists. When running very high resolution and/or long-running simulations we need this process to be as seamless as possible. We don’t want to have to keep manually restarting the experiment or troubleshooting technical issues.

Furthermore, the volume of data generated by these high resolution simulations is incredibly large. It is too large to store all the data on the supercomputer, and it can sometimes take as long as the simulation itself to move the data to the archive. The solution, therefore, is to process and archive the data as the simulation is running. We build this into the workflow so that it can be done automatically, and we have as many of the tasks running at the same time as possible (this is known as “concurrency”).

The HRCM workflow

Figure 1: An example workflow for a UM simulation with data archiving to Jasmin, showing several tasks running concurrently.

The image shows the workflow we have set up for our latest high resolution simulations. We split the simulation into chunks, running 1 month at a time. Once one month has completed, we set the next month running and begin processing the data we just produced. The workflow design means that the processing can be done at the same time as the next simulation month is running. First we perform any transformations on the data, then we begin copying it to Jasmin. We generate unique hashes (checksums) that we use to verify the data copy is identical to the original, so that we can safely delete it, clearing space for forthcoming data. Then we upload the data to the Jasmin long term tape archive, and we may put some files in a workspace where scientists can review the progress of the simulation.
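The checksum-before-delete step can be sketched in Python (the file names are hypothetical, and the local copy below is only a stand-in for the real transfer to Jasmin):

```python
import hashlib
from pathlib import Path

def sha256sum(path, chunk=1 << 20):
    """Stream a file through SHA-256 so large model output never has to
    sit in memory all at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def archive_and_clear(src: Path, dst: Path):
    """Copy a file, verify the copy against its checksum, and only then
    delete the original to clear space for forthcoming data."""
    dst.write_bytes(src.read_bytes())    # stand-in for the real transfer step
    if sha256sum(src) != sha256sum(dst):
        raise IOError(f"checksum mismatch; keeping {src}")
    src.unlink()                         # safe: the archived copy is verified
```

The key design point is the ordering: the original is never deleted until the independent checksum of the copy matches.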

Helping climate scientists get on with science

The advances that we make for the high resolution simulations are made available to our other users, whatever the size of the run. Ideally, the workflow design means that the only user involvement is to start the run going. In reality, of course, sometimes the machine goes down, connections are lost, or the model crashes (or the experiment wasn’t set up correctly!). We have therefore built a level of resilience into our workflow that means we can deal with failures effectively, so scientists can focus on setting up the experiment and analysing the results, without worrying too much about how the simulation runs.

References

[1] Roberts, M. J., et al. (2018). “The Benefits of Global High Resolution for Climate Simulation: Process Understanding and the Enabling of Stakeholder Decisions at the Regional Scale” in Bulletin of the American Meteorological Society, 99(11), 2341-2359, doi: https://doi.org/10.1175/BAMS-D-15-00320.1

[2] H. Oliver et al. (2019). “Workflow Automation for Cycling Systems,” in Computing in Science & Engineering, 21(4), 7-21, doi: https://doi.org/10.1109/MCSE.2019.2906593.

Posted in Climate

The Other Climate Impact Of Aviation

By: Ella Gilbert

In-flight entertainment

Picture yourself in the window seat of an aeroplane, cruising along at 30,000 feet, occasionally admiring the clouds below and watching that cheesy blockbuster you were too shy to go and see in the cinema. If you’re like me, you might also be trying not to think about the impact of this flight on the climate – after all, we are increasingly reminded that travelling by air is one of the most carbon-intensive things we can do. 

But when you hear the phrase ‘climate impacts of aviation’, chances are you think about the emissions of greenhouse gases like carbon dioxide (CO2) from aircraft. Unfortunately, that’s only a third of the story. What you probably don’t think about are the non-CO2 impacts, which have a climate warming effect twice as large. Bad news if you’re already worried about that flight.

Flying from London to Inverness, for example, is equivalent to eating 13 beef steaks if we consider the CO2 emissions alone, while if we consider the non-CO2 effects it’s more like 24. And if you’re flying from London to San Francisco, those numbers rise to a whopper-ing 117 and 224 steaks[1]. Now, how’s that for an in-flight meal?
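The arithmetic behind those numbers is simple enough to check yourself, using the flight distances and per-kilometre emissions factors quoted in footnote [1]:

```python
# Reproducing the steak arithmetic from footnote [1] (all figures taken from
# the Defra/BEIS 2019 conversion factors quoted there).
STEAK = 7.5  # kg CO2e per beef serving

def steaks(km, g_per_km_co2, g_per_km_nonco2, include_nonco2):
    """Flight emissions in 'beef steak' units, optionally adding non-CO2 effects."""
    g_per_km = g_per_km_co2 + (g_per_km_nonco2 if include_nonco2 else 0)
    return round(km * g_per_km / 1000 / STEAK)   # grams -> kg, then steaks

print(steaks(723, 133, 121, False))   # London-Inverness, CO2 only: 13
print(steaks(723, 133, 121, True))    # ... including non-CO2 effects: 24
print(steaks(8629, 102, 93, False))   # London-San Francisco, CO2 only: 117
print(steaks(8629, 102, 93, True))    # ... including non-CO2 effects: 224
```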

It’s not just CO2

Many of the non-CO2 impacts of aviation act in opposing directions. Some cool the atmosphere overall, while others warm it. To make matters more complicated, some effects even have different impacts on the climate over different timescales. Because these non-CO2 impacts are so complex and difficult to observe, there is still a great deal of uncertainty around their magnitude.

Advancing the Science for Aviation and Climate (ACACIA) is a multi-institutional European project trying to dispel some of the ambiguity about the various effects of aviation on climate, many of which you can see on the schematic below. At the University of Reading, we’re working on one of the most uncertain impacts: the effect of aviation aerosol-cloud interactions.

Figure 1: Schematic overview of how aviation impacts the climate. From Lee et al. (2021).

Aircraft emit lots of gases and particles at the high altitudes where they fly. Their exhaust plumes spew gases like CO2, nitrogen oxides (NOx) and water vapour, as well as soot and sulphur particles, into the atmosphere.

Those soot and sulphur particles are also known as aerosols, and they act like tiny seeds on which ice crystals and liquid droplets can grow. In the right conditions, soot aerosols can trigger the formation of ice crystals, which make up cirrus clouds – the wispy, indistinct clouds you see high up in the sky.

A cloudy blanket

Cirrus clouds tend to warm the Earth overall. That’s because they are very thin and so let solar energy travel through them easily, but at the same time absorb lots of outgoing infrared radiation, preventing it from escaping to space and so warming the surface like a blanket (aka the Greenhouse effect). But aerosols change the properties of those cirrus clouds in ways we’re still learning about.

Think about your flight blazing its way through the sky, its engines releasing aerosols into the atmosphere. As long as the conditions are right for cloud formation, the more aerosols there are in the exhaust plume to act as seeds, the more ice crystals that will form in its wake.

Cloud properties like the number, size and mass of ice crystals influence a cloud’s ‘optical thickness’, which describes how easily radiation can travel through it and so the degree to which those clouds warm or cool the atmosphere.

It’s cirrus-ly complicated

Different characteristics of the cloud compete with each other to push the balance in favour of warming or cooling. For instance, clouds containing many small ice crystals will stick around for longer because it takes more time for crystals to get big enough to fall out of the cloud. Very small crystals (a few thousandths of a centimetre across) tend to reflect more solar energy back to space, which has a cooling effect, but most cirrus clouds contain ice crystals that are larger than this, and so have an overall warming effect.

Aircraft can change how much cirrus clouds warm the climate by injecting more aerosols into the atmosphere, influencing how many ice crystals form as well as their size, shape and lifetime.

Aviation-aerosol-cloud interactions are hugely complex and difficult to measure. And because cloud processes push and pull in different directions, we’re still finding out how aircraft aerosol emissions influence the overall characteristics of cirrus clouds. In fact, the question marks are so large that we don’t actually have a precise number to tell us whether their impact is to warm or cool the atmosphere.

Evidence suggests that it’s probably a warming effect, although a recent review study was unable to provide a best estimate of the effect of aerosol-cloud interactions, leaving a conspicuous gap, and an even newer study suggests that the warming impact of aviation aerosol-cloud interactions may be negligible.

One thing at least is clear: it’s still very much a hot topic of research.

Filling in the blanks

Enter, stage left: ACACIA. Our main task at Reading as part of the ACACIA project is to use very fine-scale computer models (called large eddy simulation, or LES) to explore the processes acting on pre-existing cirrus clouds and to find out how they interact with emissions of aviation aerosols like soot.

Understanding these processes will help us quantify the exact effect of aviation aerosols on cirrus clouds: for instance, how do they impact the number of ice crystals that form? How fast do these crystals grow? How quickly do they disappear? How do the prevailing weather conditions impact these effects?

Reducing the non-CO2 impacts of aviation

Hopefully, the work of the ACACIA project will allow us to fill in some of the blanks when it comes to aviation’s effect on climate – the crucial first-step that will allow us to mitigate its effects. Understanding the science is key, and will allow us to develop solutions that reduce the non-CO2 impacts of aviation.

Using aviation fuels that have less soot, avoiding areas where contrails and cirrus clouds preferentially form or avoiding some airspaces entirely might all be helpful solutions – but more work is needed before these strategies can be implemented, especially because there is no clear winner and many proposed options come with trade-offs like increased CO2 emissions.

So – for now at least – your flight won’t be getting diverted away from those spectacular cirrus clouds. I’ll let you get back to watching Fast and Furious 82 now.

References

Defra/BEIS Greenhouse Gas Conversion Factors 2019

Kärcher, B. (2018). Formation and radiative forcing of contrail cirrus. Nature Communications 9, 1824 https://doi.org/10.1038/s41467-018-04068-0

Kärcher, B., Mahrt, F. and Marcolli, C. (2021). Process-oriented analysis of aircraft soot-cirrus interaction constrains the climate impact of aviation. Nature Communications Earth & Environment 2, 113. https://doi.org/10.1038/s43247-021-00175-x 

Lee, D. S. and Coauthors (2021). The contribution of global aviation to anthropogenic climate forcing for 2000 to 2018. Atmospheric Environment, 244, 117834. https://doi.org/10.1016/j.atmosenv.2020.117834

Lee, D. S. (2021) Contrails from aeroplanes warm the planet – here’s how new low-soot fuels can help. The Conversation 18 June 2021. Accessed 26/07/2021. Available at: https://theconversation.com/contrails-from-aeroplanes-warm-the-planet-heres-how-new-low-soot-fuels-can-help-162779  

Liou, K.-N. (2005). Cirrus clouds and climate. AccessScience. Retrieved July 26, 2021, from https://doi.org/10.1036/1097-8542.YB050210

Lynch, D.K. (1996) Cirrus clouds: Their role in climate and global change. Acta Astronautica 38 (11), 859-863. https://doi.org/10.1016/S0094-5765(96)00098-7

Niklaß, M., Lührs, B., Grewe, V., Dahlmann, K., Luchkova, T., Linke, F. and Gollnick, V. (2019) Potential to reduce the climate impact of aviation by climate restricted airspaces. Transport Policy 83 102-110. https://doi.org/10.1016/j.tranpol.2016.12.010

Poore, J. and Nemecek, T. (2018) Reducing food’s environmental impacts through producers and consumers. Science 360 (6392) 987-992. https://doi.org/10.1126/science.aaq0216

Shine, K. and Lee, D. S. (2021) Commentary: Navigational avoidance of contrails to mitigate aviation’s climate impact may seem a good idea – but not yet. Green Air News 22 July 2021. Accessed 23/07/2021. Available at: https://www.greenairnews.com/?p=1421

Skowron, A., Lee, D.S., De León, R.R., Ling, L. L. and Owen, B. (2021) Greater fuel efficiency is potentially preferable to reducing NOx emissions for aviation’s climate impacts. Nature Communications 12, 564. https://doi.org/10.1038/s41467-020-20771-3

Timperley, J. (2017) Explainer: The challenge of tackling aviation’s non-CO2 emissions. Carbon Brief 15 March 2017. Accessed 23/07/2021. Available at: https://www.carbonbrief.org/explainer-challenge-tackling-aviations-non-co2-emissions

Timperley, J. (2020) Should we give up flying for the sake of the climate? BBC Future, Smart guide to climate change. Accessed 23/07/2021. Available at: https://www.bbc.com/future/article/20200218-climate-change-how-to-cut-your-carbon-emissions-when-flying

[1] Assuming an ‘average’ emissions intensity for beef per serving of 7.5 kgCO2e after Poore & Nemecek (2018), average flight distances of 723 km and 8629 km for flights to Inverness and San Francisco, respectively, domestic aviation emissions intensity of 133 g and 121 g per passenger kilometre for CO2 and non-CO2 impacts, respectively, and long-haul aviation emissions intensity of 102 g and 93 g per passenger km for CO2 and non-CO2 effects, respectively, after BEIS/Defra emissions conversion factors 2019. See also: https://www.bbc.co.uk/news/science-environment-46459714

https://www.bbc.co.uk/news/science-environment-49349566

 

 

Posted in aviation, Climate, Microphysics

Soil Moisture Monitoring with Satellite Radar

By: Keith Morrison (Department of Meteorology) and Will Maslanka (Department of Geography & Environmental Science)

Everyone knows the impacts of intense and/or prolonged rainfall: flooding, like that experienced in the Thames Basin during the summer of 2007 and the winter of 2013/14. Whilst hard-engineering defences (such as raising the height of riverbanks or constructing flood defences) can be good at dealing with flooding events by keeping water within the river, they can have negative impacts on natural processes, such as increased deposition and erosion of sediment, and changes to wildlife habitat. Some hard-engineering practices, such as straightening river meanders, cause river flows to speed up, potentially leading to greater flood risk downstream. Rather than exacerbating flood risk downstream, soft-engineering practices such as Natural Flood Management (NFM) can be used to slow the flow of water before it enters the watercourse and to store the water upstream.

The NERC-funded LANDWISE project (LAND management in lowland catchments for risk reduction) seeks to assess the impact and effectiveness of realistic and scalable land-based NFM measures in reducing the risk from surface run-off and groundwater within the Thames Basin. These land-based measures include planting more trees in riparian zones (the area along the riverbank), floodplain restoration, and changes to soil and land management. The LANDWISE research is done in a multi-disciplinary fashion, joining together the collective expertise of hydrologists, geologists, farmers, local flood forums, conservation non-governmental organisations (NGOs) and policy makers, to maximise the impact of the research and to ensure that the result is greater than the sum of the individual efforts.

One area of focus is soil and land management changes: the impact that differing farming practices (such as crop choice and tillage practices) can have on altering the infiltration or storage of rainfall in the soil as soil moisture. Soil moisture retrieval from satellite-based radar observations is well established, with various in-service satellite products. However, the resolution of these products is coarse (>1 km), as they are based on spatially averaged measurements. Instead, this study utilises the higher resolution available from the Sentinel-1 synthetic aperture radar satellite constellation to work within farmers’ fields, at scales between 1 km and 100 m.

The radar reflectivity of a soil arises from the dielectric contrast at the air/soil boundary, which is set by the soil type and its moisture state. However, moisture retrieval is complicated by the additional sensitivity of the radar to the surface roughness of the soil. To get around this issue, rather than dealing with absolute soil moisture, the LANDWISE project has been looking at relative surface soil moisture (rSSM) using the TU Wien Change Detection Algorithm [1]. This assumes that both the soil type and surface roughness are static parameters, so that short-timescale fluctuations in the backscatter are indicative only of soil moisture changes. By looking at relative soil moisture, it is possible to create a moisture time series. In this scheme, observations are scaled between the wettest and driest periods, assuming that the wettest and driest periods correspond to the largest and smallest backscatter values, respectively.
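In code, the scaling step might look like this minimal per-pixel sketch (the operational TU Wien algorithm is considerably more sophisticated, handling incidence-angle normalisation and noise, among other things):

```python
import numpy as np

def rssm(backscatter_db):
    """TU Wien-style change detection on one pixel's backscatter time series:
    scale each observation between the driest (minimum) and wettest (maximum)
    backscatter seen at that pixel. Soil type and surface roughness are assumed
    static, so only moisture moves the signal. (Simplified sketch.)"""
    s = np.asarray(backscatter_db, dtype=float)
    dry, wet = s.min(), s.max()
    return (s - dry) / (wet - dry)   # 0 = driest observed, 1 = wettest observed
```

For example, a pixel whose backscatter time series is [-20, -15, -10] dB maps to rSSM values [0, 0.5, 1].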

The LANDWISE project has used data from Sentinel-1 to produce an rSSM time series over the Thames Basin between October 2015 and December 2020. Some resolution is sacrificed to reduce randomly occurring fluctuations, by spatially averaging the imagery onto a 100 m grid. Figure 1a shows a snapshot of the rSSM differences across the Thames Basin on 11 September 2018. A clear band of higher rSSM values can be seen, with lower values to the north and south of it. This band of higher rSSM values can be attributed to a localised shower (Figure 1b) that passed over prior to the satellite acquisition (approx. 18:00 UTC).

Figure 1a: rSSM values across the Thames Basin for the 11th September 2018. Areas denoted in grey are neglected as they are associated with urban areas.

Figure 1b: 12-hourly rainfall accumulation, before the orbit overpass. Rainfall amounts below 0.25mm in 12 hours have not been plotted for clarity.

Rather than a snapshot, Figure 2 shows the temporal changes in rSSM for the catchment of the River Kennet, a sub-catchment of the Thames Basin, both spatially (top) and as a 7-orbit moving average (bottom). The expected annual cycle can be seen in the time series: an increase in rSSM during the winter, a decrease over the spring and summer as the weather becomes drier, and an increase again during the autumn and winter. However, the rSSM appears to increase over the summer, when soil moisture would be expected to be at its lowest. This can be seen during the summer of 2018, when the rSSM values increase slightly over the course of the summer, a period when very little rainfall fell over the Thames region [2]. This apparent increase is not due to an increase in soil moisture but to an increase in radar backscatter, as the contribution from vegetation (predominantly agricultural crops) increases over the growing season, before dropping away after the harvest. Current work is focussed on deriving a correction for seasonal variations in vegetation cover, based on multiple satellite viewing geometries.

Figure 2: (Top) rSSM images for the Kennet Catchment area. Areas denoted in grey are either outside the Kennet Catchment, or have been neglected as urban areas. (bottom) Spatially average rSSM values for the individual orbit (black line) and for a 7-orbit moving average (red line).

References

[1] Bauer-Marschallinger, B., Freeman, V., Cao, S., Paulik, C., Schaufler, S., Stachl, T., Modanesi, S., Massari, C., Brocca, L., and W. Wagner, 2019: Toward Global Soil Moisture Monitoring With Sentinel-1: Harnessing Assets and Overcoming Obstacles, IEEE Trans. Geosci. Remote Sens., 57, 520-539, https://doi.org/10.1109/TGRS.2018.2858004

[2] Turner, S., Barker, L., Hannaford, J., Muchan, K., Parry, S., and C. Sefton, 2021: The 2018/2019 drought in the UK: a hydrological appraisal. Weather, 99, 1-6, https://doi.org/10.1002/wea.4003

 

 

Posted in Climate, earth observation, radar, Remote sensing, soil moisture

Rescuing early satellite data to improve long-term estimates of past weather

By: Jade Westfoot 

This post is contributed by Jade Westfoot, a year-12 school student who did work experience in the department recently. During her week with us, Jade worked with Drs. Jon Mittaz and Tom Hall on rescuing historic satellite data to make it more usable for historical weather analysis. Jade is passionate about science communication and is interested in both looking up at the sky and back down at Earth, and she is aiming to study a mixture of space science and Earth science!

Today, satellites are a fundamental part of our everyday lives, with a multitude of roles such as navigation, communication, space science and Earth observation. Earth observation is increasingly important in the race to understand our planet to combat and adapt to the climate crisis.

Nearly 1000 Earth observation satellites are available to help with this. Most fly in sun-synchronous or polar orbits, meaning that they pass over locations at a fixed local time each day on an orbit that takes them pretty much over the poles. 1000 satellites seems like a lot, but many of them simply image the land beneath them, which is useful; to understand the state of the atmosphere (in terms of temperature, humidity, etc. at different heights), however, more specialised sensors are needed. For example, the European Centre for Medium-Range Weather Forecasts (ECMWF) currently collects data from roughly 100 useful sensors to inform weather forecasts. However, we haven’t always had this wealth of information: as recently as the 1990s, ECMWF was using fewer than 15 satellites!

As well as forecasting, ECMWF (located in Reading) is an important centre for estimating the weather conditions of the past, which is immensely useful for environment and climate science, as well as engineering and planning. This ‘retrospective weather forecast’ is known in the field as “re-analysis”. The dwindling amount of satellite information further back in time is a challenge for re-analyses going back many decades. Mostly, satellite data have been introduced from the late 1970s onwards, but there are more measurements from earlier satellite missions that can be rescued and may be useful.

A good example is the Nimbus programme, NASA’s second programme of experimental Earth observation satellites, with 7 satellites successfully launched between 1964 and 1978. The instruments changed over the lifetime of the programme, but during the 1960s one of the instruments used for atmospheric sounding (flown on Nimbus 3) was the Medium Resolution Infrared Radiometer (MRIR). MRIR took measurements in 5 bands: 1 in the visible spectrum to detect reflected sunlight, and 4 in the infrared measuring radiation emitted from Earth. Each infrared channel effectively measured a different layer of the atmosphere by measuring the signal at a different frequency. For example, the 6.7 μm channel was sensitive to radiation emitted by atmospheric water vapour, so MRIR data can be used to estimate the amount of water at a certain height in the atmosphere.

At the time, the Nimbus data was used to refine the accuracy of weather forecasting, and now it is hoped that accessing the data will help ECMWF improve re-analyses to understand long-term weather changes.

How do we know if the early data are valid?

Unfortunately, the age of the data brings with it some problems. Some of the data are missing, and the data that have survived are generally more noisy than modern instruments.

This doesn’t mean it’s useless though! We can infer things from each of the channels individually (such as the presence of clouds). To show their potential, we can combine the MRIR data into a false colour image, which can then be compared to photographs of the Earth. Where do we get photos of the Earth from space in the 1960s? Well, it just happens that some of these satellites were in operation at the same time as NASA’s Apollo missions, during which astronauts took many photos of the whole Earth.
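A false colour composite of this kind can be sketched in a few lines: stretch three channels to the range [0, 1] and stack them as red, green and blue (the channel choice and the simple min-max stretch here are my assumptions, not the exact recipe behind the figure):

```python
import numpy as np

def false_colour(ch_a, ch_b, ch_c):
    """Combine three instrument channels into an RGB false colour image,
    stretching each channel independently to [0, 1]. (Illustrative sketch.)"""
    def stretch(x):
        lo, hi = np.nanmin(x), np.nanmax(x)
        return (x - lo) / (hi - lo)      # min-max normalise one channel
    return np.dstack([stretch(ch_a), stretch(ch_b), stretch(ch_c)])
```

Passing three co-located channel arrays of shape (rows, cols) returns an image array of shape (rows, cols, 3) ready for plotting.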

Figure 1: Comparison between the Apollo photo and each of the MRIR channels

For example, looking at the figure you can see that Indonesia and Papua New Guinea are covered by clouds which share similar patterns between the photo and observation. This can be seen on both channel 1 (which measures water absorption) and channel 2, which tries to measure surface temperature but here is blocked by clouds.

The photo and MRIR data don’t perfectly match, which is expected: A photograph is an instantaneous capture of the whole Earth, taken from 400,000 km away, whereas the false colour image is generated from data taken by the satellites as they scan strips of the scene beneath them during their approximately 110 minute orbits. This means that the whole Earth is not captured at one time in the satellite view, so clouds can move and develop in the time it takes to build up the MRIR pictures. However, because of the distances, the MRIR measurements have a higher resolution (45 km) than the Apollo photo.

Comparing satellite data to the Apollo photo boosts our confidence in the data collected, as the similarity between the two independent observations generally confirms the MRIR data have been correctly ‘rescued’.

How will rescued data be used?

Simulations and re-analyses of the climate during the 1960s, including ECMWF’s, don’t take advantage of much old satellite data like that provided by Nimbus. Instead, they rely on in-situ data (measurements taken by ground stations or weather balloons). In situ data are highly informative, but are not available everywhere, particularly in the southern hemisphere. Satellites capture information about the whole planet.

Including the Nimbus data will mean future re-analyses can extend the timescale over which satellite data are used, to more than 50 years, making the re-analyses even more relevant for looking at weather changes over many decades. The more data from different sources we can put into a re-analysis, the more accurate it should become. Having accurate information about past weather will continue to be incredibly important in order to respond to the changing climate.

Posted in Climate, Clouds

The Future of Arctic Sea Ice

By: Rebecca Frew

It is well documented in scientific studies and the news (recent example here) that the summer extent of Arctic sea ice has been declining rapidly in response to global warming. As the summer sea ice shrinks and retreats northward, the summer marginal ice zone (MIZ) has been widening and making up a larger proportion of the summer sea ice cover (Rolph et al. 2020).

The MIZ is typically defined as the area in which sea ice is influenced by waves. A more convenient definition, often used in studies, is the area where the sea ice concentration is between 15% and 80%. The area above 80% is defined as the ice pack, where the sea ice floes are more densely packed together, blocking direct atmosphere-ocean interaction. The MIZ is typically small in winter and grows to its maximum extent in summer as the ice pack fragments and melts, creating smaller and less densely packed floes.
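The concentration-based definition lends itself to a simple sketch. The 15% and 80% thresholds are the ones given above; the concentration field itself is invented purely for illustration:

```python
import numpy as np

# Toy sea ice concentration field (fraction of each grid cell covered by ice);
# the values are made up for illustration, not observations.
sic = np.array([0.05, 0.20, 0.50, 0.85, 0.95])

open_water = sic < 0.15               # essentially ice-free ocean
miz = (sic >= 0.15) & (sic <= 0.80)   # marginal ice zone: 15-80% concentration
pack = sic > 0.80                     # consolidated ice pack: above 80%

print(int(open_water.sum()), int(miz.sum()), int(pack.sum()))  # 1 2 2
```

In a real model or satellite product the same masking would be applied to a two-dimensional concentration field, and the MIZ extent would be the total area of the cells in the mask.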

Figure 1: Sea Ice floes. Image Credit:  Kevin Woods, NOAA Pacific Marine Environmental Laboratory. 

This trend of an increasingly MIZ-dominated ice cover is projected to continue (Strong & Rigor 2013, Aksenov et al. 2017), eventually transitioning to sea-ice-free Arctic summers. The relative rates and importance of sea ice processes in the MIZ differ from those in the ice pack. This has consequences for the exchange of heat and salt between the atmosphere and ocean, and ultimately for the date at which the Arctic becomes ice free in summer.

Three processes that differ between the MIZ and the ice pack are the lateral melt rate (melting on the sides of the floes), basal (bottom) melting of the floes, and the breakup of floes by waves. The average floe size in the MIZ is smaller than in the ice pack, which increases the perimeter-to-area ratio and promotes faster lateral melting. The ice also tends to be thinner, which increases the rate of basal melting in summer. Smaller, less densely packed floes in the MIZ are more susceptible to wave breakup, which creates still smaller floes that tend to melt more quickly.
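The perimeter-to-area argument can be made concrete with idealised circular floes, for which the ratio is 2/r. The radii below are illustrative choices, not observed floe sizes:

```python
import math

def perimeter_to_area(radius_m):
    """Perimeter-to-area ratio of an idealised circular floe:
    2*pi*r / (pi*r^2) = 2/r, so smaller floes expose more edge per unit area."""
    return (2 * math.pi * radius_m) / (math.pi * radius_m ** 2)

miz_floe = perimeter_to_area(10.0)     # a 10 m floe: 0.2 per metre
pack_floe = perimeter_to_area(1000.0)  # a 1 km floe: 0.002 per metre

# The small floe exposes 100x more melting edge per unit of ice area.
print(round(miz_floe / pack_floe))  # 100
```

Real floes are not circular, but any convex shape gives the same scaling: halving the floe size doubles the edge available for lateral melting.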

Figure 2: Arctic sea ice and MIZ extent in the 1980s and the 2010s, from a sea ice model simulation.

In my research, I am investigating the relative importance of growth and melt processes in the MIZ and whether they might change in the future. As part of this, I am considering how they are currently represented in climate models, whether this representation is accurate, and how sensitive the processes are to parameters that are difficult to constrain from observations. For example, a relatively recent development in sea ice models is the inclusion of a floe size distribution (Roach et al. 2018). Previously, sea ice floes were all modelled as a single size, or floe size was ignored altogether; now a distribution of floe sizes is calculated within each grid cell, better representing the observed variation from centimetres to hundreds of kilometres. This is important when modelling the MIZ because floe sizes there are smaller, and the floe size influences the lateral melt rate.

How the lateral melt rate differs in the MIZ from the ice pack, and how it might change in the future, are a couple of the questions I am trying to answer. Answering such questions about processes in the MIZ helps to improve projections of Arctic sea ice and to better represent its response to different future warming scenarios.

References

Aksenov, Y., Popova, E. E., Yool, A., Nurser, A. J. G., Williams, T. D., Bertino, L., and Bergh, J., 2017: On the future navigability of Arctic sea routes: High-resolution projections of the Arctic Ocean and sea ice, Mar. Pol., 75, 300–317, https://doi.org/10.1016/j.marpol.2015.12.027.

Roach, L. A., Horvat, C., Dean, S. M., and Bitz, C. M., 2018: An Emergent Sea Ice Floe Size Distribution in a Global Coupled Ocean-Sea Ice Model, J. Geophys. Res.-Ocean, 123, 4322–4337, https://doi.org/10.1029/2017JC013692

Rolph, R. J., Feltham, D. L., and Schröder, D., 2020: Changes of the Arctic marginal ice zone during the satellite era, The Cryosphere, 14, 1971–1984, https://doi.org/10.5194/tc-14-1971-2020.

Strong, C., and Rigor, I. G., 2013: Arctic marginal ice zone trending wider in summer and narrower in winter, Geophys. Res. Lett., 40, 4864–4868, https://doi.org/10.1002/grl.50928

Posted in Arctic, Climate, Climate change, Cryosphere, Polar

Three Flavours of Pykrete

By: David Livings

A few years ago, Giles Foden published a novel called Turbulence. Most of the book is about a young meteorologist in the second world war, but there’s a framing story set in the 1980s, in which the same man is sailing from Antarctica to Saudi Arabia in a ship made from a mixture of ice and frozen wood pulp called Pykerete. Pykerete was named after Geoffrey Pyke, who proposed building giant aircraft carriers from such a material. Some of the characters in the book are real people, some are fictionalised versions of real people, and some are completely made up. Pyke and Pykerete were obviously made up …

Or so I thought. I subsequently learnt that Geoffrey Nathaniel Joseph Pyke (1893–1948) really did exist, or else is a very elaborate hoax of which the Oxford Dictionary of National Biography is either a victim or a perpetrator. Not only did Pyke propose building aircraft carriers from ice, but he got taken seriously (at least for a while). Pykrete (sometimes spelt Pykerete or Pykecrete) was named after him, but was not actually his invention. The initial idea of adding wood pulp to ice to increase its strength came from two researchers at the Brooklyn Polytechnic, and its properties were investigated at Pyke’s request by the chemist Max Perutz, who would go on to win the Nobel Prize in Chemistry for his work on the structure of haemoglobin. Perutz published a paper on pykrete in the Journal of Glaciology in 1948.

Last year, in a change of career direction, I moved from meteorological research to software engineering on a sea ice model. As part of my familiarisation with the new field, I thought it would be a good idea to carry out some experiments on the substances being modelled. The first experiment was to investigate the difference between fresh water ice and salt water ice. I made samples of both in plastic pots that originally contained desserts from a supermarket (dimensions: 45 mm diameter at bottom, 70 mm at top, height 88 mm, but only filled to 66 mm for the experiment). The salt water ice contained enough table salt to cover the bottom of the pot to a depth of 1–2 mm before adding the water. Both samples were frozen in a domestic freezer for over 24 h, and then taken out and attacked from the top with a blunt-ended table knife. The knife didn’t penetrate the fresh ice, but just sent up some ice chips. It did penetrate the salt ice, which had a mushier texture.

It was at this point that I remembered Pyke and pykrete, and decided to make some for myself. A good place to start an investigation of pykrete is the web page of Peter Goodeve, which takes a critical look at some of the myths that have grown up about the substance. It also contains links to other sources (some of which perpetuate the myths).

Sources differ over whether the magic ingredient in pykrete is wood pulp, wood powder, sawdust, or wood chips. I had none of these available, but I did have a bag of what described itself as Oatbran & Wheatbran Porridge Oats, so I improvised with that. In one of the pots I mixed dry porridge with just enough water to cover it. I filled the other pot with plain water to the same depth, which was about 30 mm. After freezing both samples, I turned them out of their pots and hit them with a hammer. The plain ice shattered after one blow. The porridge ice survived three blows, only denting. This substance was definitely tougher than plain ice.

This experiment with frozen porridge left a couple of things to be desired. Firstly, the additive wasn’t one of the classic pykrete additives. Secondly, the way in which the amount of additive was determined was rather crude. Perutz reports good results with 4–14% wood pulp.

Recently I was able to obtain some fine sawdust, and decided to repeat the experiment using this and other additives. As well as sawdust and porridge, I followed Goodeve’s suggestion of reverse engineering wood pulp by using torn up newspaper. Rather than tearing up the newspaper (actually three pages from the LRB) I cut it into tiny pieces a few millimetres across. If doing this yourself, allow at least two hours.

I used 20 g of each additive to 200 ml of water. One quarter of the mixture was used to make small samples as in the previous experiment, and the rest was used to make larger samples in another type of dessert pot (sample dimensions: 60 mm diameter at bottom, 77 mm at top, height 40 mm). On making the mixtures, it became clear that the additive settling to the bottom was going to be a problem and also that the experiment last year had used much more than 10% porridge. To guard against settling, I took the mixtures out of the freezer and stirred them every half hour for the first three and a half hours. The following figures show the large samples before and after being hit with a hammer.

Figure 1. Samples of plain ice and the three flavours of pykrete beside their additives. Top left: plain ice. Top right: sawdust. Bottom left: porridge. Bottom right: newspaper.

Figure 2. The results of hitting the samples with a hammer. Top left: the plain ice split after two blows. Top right: the sawdust pykrete survived six blows with little damage. Bottom left: the porridge pykrete split after five blows. Bottom right: the newspaper pykrete survived six blows.

Results from the small samples were similar. The plain ice shattered after one blow, sending fragments flying across the room. The porridge pykrete split after two blows. The sawdust and newspaper pykretes survived three blows.

Conclusion: Sawdust pykrete and newspaper pykrete are tougher than plain ice. Porridge pykrete at the same density is intermediate in strength, but at higher densities is impressive.

Acknowledgements

The author thanks Debbie Turner and Ian Shankland for providing the sawdust.

References

Perutz, M. F., 1948: A description of the iceberg aircraft carrier and the bearing of the mechanical properties of frozen wood pulp upon some problems of glacier flow. J. Glaciol., 1, 95–104, https://doi.org/10.3189/S0022143000007796.

Posted in Climate, Cryosphere, History of Science

Can You Guess The Ingredients Of A Cake?

By: Amos Lawless

“Mmm this cake is lovely, what’s in it?” “Try to guess!” How often have we had that response from a friend or colleague who is proud of the cake they have just baked? And we usually try to guess the main ingredients – “I think there must be ginger or cinnamon. And can I taste lemon?”. But what if that friend persisted and asked you to try to guess all the ingredients – how many eggs they have used, how many grams of sugar are in the cake and how much butter it contains? Maybe you’d think they’d gone a bit crazy! Surely it is impossible to work out all the ingredients just by tasting it? It may sound unreasonable, but this is effectively what we try to do each day to interpret satellite measurements for our weather forecasts.

Weather satellites, besides giving us the nice pictures that we see on television, provide a wealth of other information about the atmosphere. Satellites actually measure the radiation emitted from the atmosphere at different frequencies, and these measurements depend on the properties of that part of the atmosphere that the satellite is looking at, such as its temperature, humidity and winds. It is as if these “ingredients” of the atmosphere are brought together into a “cake” that the satellite can taste. But what we are really interested in knowing is these ingredients. So how can we split the satellite measurement back into its atmospheric ingredients?

Thankfully we have a mathematical technique for doing this, which we call data assimilation. Each satellite instrument can measure at many different frequencies (as if it has many “taste buds” sensitive to different ingredients), so by combining measurements from different satellites in an intelligent way, as well as other more conventional measurements made on the ground, data assimilation helps us to build up a complete picture of the atmosphere all around the globe. This is done every day as part of modern weather forecasting, since knowing what the atmosphere is like now is essential if we are to make accurate forecasts. Most data assimilation techniques are based on finding an optimal combination of what we think is the current state of the atmosphere and our measurements, taking into account the precision of the different pieces of information we have. Writing down the theory of how to do this is fairly easy, but putting it into practice is usually much harder.
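The “optimal combination” weighted by precision can be sketched for a single variable. Each estimate is weighted by the inverse of its error variance, so the more precise source dominates. The temperatures and variances below are invented for illustration, not taken from any real system:

```python
def assimilate(background, obs, var_b, var_o):
    """Precision-weighted combination of a model background and an observation."""
    w_b = 1.0 / var_b   # precision (inverse error variance) of the background
    w_o = 1.0 / var_o   # precision of the observation
    analysis = (w_b * background + w_o * obs) / (w_b + w_o)
    var_a = 1.0 / (w_b + w_o)   # the analysis is more precise than either input
    return analysis, var_a

# Model background of 14.0 C (error variance 4.0) and an observation of
# 15.0 C (error variance 1.0): the more precise observation gets more weight.
analysis, var_a = assimilate(14.0, 15.0, var_b=4.0, var_o=1.0)
print(analysis, var_a)  # 14.8 0.8
```

Operational systems do the same thing with millions of variables and observations at once, which is why the matrix versions of these weights are so expensive to compute.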

Scientists of the Data Assimilation Research Centre (DARC) at the University of Reading work on a variety of problems related to data assimilation, from developing new approaches to applying it in practice. Each year, jointly with the National Centre for Earth Observation (NCEO), we organise a training course for scientists from around the world to learn about the theory of data assimilation and how to apply it in practice. Lectures from DARC scientists are combined with computer practical exercises, so that participants can learn the theory of data assimilation and get a feel for how different methods perform in practice. Normally the course is held in person, but this year there was the challenge of whether it was possible to hold it online. So it was that at the start of May our first ever training course on data assimilation took place using Microsoft Teams. Joining were 29 scientists from the UK, Belgium, Bulgaria, Denmark, Germany, Greece, Italy, Spain and the USA, working in universities, research institutes and meteorological forecasting centres.

Figure 1: Lecture by DARC scientist Dr Javier Amezcua

So how did we do it? By now we are already used to giving and listening to talks online, so the lecture part of the course was fairly straightforward. However, an important aspect of a course such as this is that it is interactive, with the possibility to ask questions. Thankfully the chat function worked well here, with participants putting questions in the chat continually and other DARC scientists responding if it wasn’t necessary to interrupt the lecture. Then computer practical exercises took place in breakout rooms, with groups of three participants working together. And during the breaks informal discussions took place using Gather.Town (a very impressive tool that I have only just discovered), including use of a virtual whiteboard to discuss further the mathematics. So what did the participants say about the online delivery? Comments included “I think the format worked really well”, “the arrangements for the remote delivery of the course were excellent”, “I think the practicals were organised well with lecturers rotating and coming to different rooms. That made me feel like I was in a classroom with having constant access to help”. Running this course certainly taught us a lot about how to teach data assimilation online, with lots of lessons learnt for the future. But everybody also realised that there are limitations to such a format. Hopefully next year we will be able to run the course in person again, with the opportunity for more informal discussions over coffee … and plenty of cake!

Figure 2: Online group photo of some of the lecturers and participants.

References

Data Assimilation Research Centre (n.d.), What is data assimilation? https://research.reading.ac.uk/met-darc/aboutus/what-is-data-assimilation/

Data Assimilation Research Centre (2019). Online lecture notes from 2019 training course.
https://research.reading.ac.uk/met-darc/training/ecmwf2019/

Lawless, A.S. (2013), Variational data assimilation for very large environmental problems. In Large Scale Inverse Problems: Computational Methods and Applications in the Earth Sciences, Eds. Cullen, M.J.P., Freitag, M.A., Kindermann, S., Scheichl, R., Radon Series on Computational and Applied Mathematics 13. De Gruyter, pp. 55–90.

Nichols, N.K. (2009), Mathematical concepts of data assimilation. Preprint MPS_2009-04. Department of Mathematics, University of Reading.
http://www.reading.ac.uk/nmsruntime/saveasdialog.aspx?lID=48408&sID=90309

 

Posted in data assimilation, earth observation, Teaching & Learning

Data Assimilation Improves Space Weather Forecasting Skill

By: Matthew Lang

Over the past few years, I have been working on using data assimilation methodologies that are prevalent in meteorology to improve forecasts of space weather events (Lang et al. 2017; Lang and Owens 2019). Data assimilation does this by incorporating observations from spacecraft orbiting the Sun into numerical solar wind models, allowing estimates of the solar wind to be updated. These updated solar wind conditions are then used to drive a solar wind model that produces forecasts of the solar wind at Earth. I have shown that over the lifetime of the STEREO-B spacecraft (2007–2014), data assimilation is able to reduce errors in solar wind forecasts by about 31% compared to forecasts performed without it (Lang et al. 2020). Furthermore, these data-assimilated forecasts can compensate for systematic errors in forecasts produced from in-situ observations alone.

Space weather is the study of the changing environmental conditions in near-Earth space and its impacts on humans and our technologies, both in space and on Earth. One of the major drivers of space weather events is the solar wind, the constant outflow of plasma (the fourth state of matter that can be thought of as a hot, highly magnetised gas) from the Sun’s surface. The solar wind fills the solar system with particles and magnetic field and is constantly bombarding the Earth’s magnetic field.

Coronal Mass Ejections (CMEs) are huge eruptions of plasma from the Sun’s atmosphere that can travel from the Sun to Earth, through the solar wind, in as little as 18 hours and can drive the most severe space weather events. These include depletion of a part of the ionosphere that is responsible for bouncing radio signals around the planet, hence hampering long-distance communication systems.

Figure 1: Transformer damage from a CME that caused a blackout in Quebec in 1989.

Another major impact on human technologies is that the solar wind and CMEs drive changes to the Earth’s magnetic field, inducing electrical currents in the Earth’s atmosphere that have the potential to overload power systems causing transformer fires and widespread blackouts (this occurred in Quebec in 1989 (see Figure 1) and Sweden in 2003). Most of the impacts of a severe space weather event can be mitigated against if accurate forecasts are available. And that’s where data assimilation comes in.

Data assimilation is the combination of information from forecasts and observations of a system to produce an optimal estimate for the true state of that system. It is an invaluable tool in many aspects of modern life, with applications ranging from course correction during the Apollo Moon landing missions to satellite navigation in areas of poor GPS coverage and oil reservoir modelling. The most notable application for this blog, however, is its use in numerical weather prediction, where it is a necessary step for producing more accurate starting points for weather forecasts. This reduces the impact of the “butterfly effect”, where a small change can lead to a vastly different outcome in the future (in the famous hypothetical example, the titular butterfly flaps its wings in Japan, leading to a tornado forming in the USA). By ensuring that weather forecasts are started as close to the truth as possible, the resulting forecasts will be more accurate over longer periods.

For consecutive 27-day periods (the time taken for the Sun to rotate once at its equator, relative to the Earth) between 2007 and the end of 2014, a solar wind model called MAS (Magnetohydrodynamics Around a Sphere) (Linker et al. 1999) was used to generate a prediction of the solar wind conditions, which I call the prior solar wind. Data assimilation is then performed using data from the STEREO-A, STEREO-B and ACE spacecraft to generate a new set of solar wind conditions, which I shall refer to as the posterior solar wind. The STEREO spacecraft orbit the Sun at approximately the same radial distance as Earth; however, STEREO-A drifts ahead of Earth’s position and STEREO-B behind it, each separating from Earth by about 22° per year. The ACE spacecraft is in near-Earth space, between the Earth and the Sun. The prior and posterior solar winds are then input into the simplified solar wind model HUXt (Owens et al. 2020), which was developed at the University of Reading, to produce forecasts for the subsequent 27 days. Finally, the prior and posterior forecasts were compared with a forecast from the STEREO-B spacecraft (the closest in-situ observation of the solar wind behind the Earth during this time), generated by assuming that the solar wind speed observed at STEREO-B will occur at Earth with a time lag defined by the angular separation of the spacecraft behind Earth and the rotational speed of the Sun.
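The corotation forecast’s time lag follows from simple geometry: a solar wind structure rotating with the Sun takes a fixed fraction of a rotation to sweep from the trailing spacecraft’s longitude around to Earth’s. A minimal sketch, using the 27-day rotation period from the text and an illustrative separation angle (not the actual STEREO-B separation during the study):

```python
# Synodic solar rotation period at the equator, as seen from Earth (days).
SYNODIC_ROTATION_DAYS = 27.0

def corotation_lag_days(separation_deg):
    """Days for solar wind seen at a spacecraft trailing Earth by
    `separation_deg` of longitude to rotate around to Earth's longitude."""
    return SYNODIC_ROTATION_DAYS * separation_deg / 360.0

# A spacecraft trailing Earth by 44 degrees (hypothetical value) provides
# observations that become an Earth forecast roughly 3.3 days ahead.
print(round(corotation_lag_days(44.0), 2))  # 3.3
```

This is also why the lead time of a corotation forecast grows as the trailing spacecraft drifts further behind Earth over its mission lifetime.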

Figure 2: Plot showing the Root Mean Squared Errors (± one standard error) of the prior (blue), posterior (red) and STEREO-B corotation (orange) mean 27-day forecasts of the solar wind speed over the lifetime of STEREO-B.

The results of these forecasts are summarised in Figure 2, where the mean 27-day solar wind speed forecasts from the prior, posterior and STEREO-B corotation are shown over the lifetime of STEREO-B. The posterior and corotation forecasts have lower Root Mean Squared Errors (RMSEs) than the prior forecasts, showing that both are substantial improvements over the prior forecast at all lead times. It is understandable that the STEREO-B corotation and posterior forecasts are similar, as both use the observations from the STEREO-B spacecraft in their forecast.

Figure 3: Solar wind speed forecasts using the HUXt model, where the Sun is in the centre and Earth is in the same location as the ACE spacecraft (black circle). The left panel is initialised from the MAS model without data assimilation; the right panel is initialised from a data assimilation analysis, in which STEREO (black triangles) and ACE observations have been assimilated. A coronal mass ejection (CME) initialised with the same characteristics and released from the Sun at the same time was propagated through the two ambient solar winds, yielding very different evolutions of the CME.

A major difference between the STEREO-B corotation and the posterior forecast, however, is that the corotation produces a forecast at a single point, as opposed to the posterior forecast, which produces a forecast at every point in the model domain (at all radii and longitudes of interest). This is an especially useful feature, as accurate specification of the solar wind can influence how CMEs evolve on their way to Earth. Figure 3 shows two CMEs initialised with the same properties (obtained from Barnard et al. 2020); the left one is propagated through a ‘prior’ solar wind that has not had data assimilation performed on it, compared to the ‘posterior’ on the right, which does include data assimilation. The CME evolution is changed greatly by the different ambient solar winds: with the data-assimilated solar wind the CME arrives 19 minutes earlier than it was observed at Earth, compared to 41 hours late in the solar wind without data assimilation. By comparison, for the same CME, the Met Office’s operational solar wind model predicted arrival at Earth 10 hours before it was observed (Barnard et al. 2020). This shows that there is great potential in this field for data assimilation to improve forecasts of not only the solar wind, but also the more hazardous coronal mass ejections.

References

Barnard, L., M. J. Owens, C. J. Scott, and C. A. de Koning, 2020: Ensemble CME Modeling Constrained by Heliospheric Imager Observations. AGU Adv., 1, https://doi.org/10.1029/2020av000214.

Lang, M., and M. J. Owens, 2019: A Variational Approach to Data Assimilation in the Solar Wind. Sp. Weather, 17, 59–83, https://doi.org/10.1029/2018SW001857.

——, P. Browne, P. J. van Leeuwen, and M. Owens, 2017: Data Assimilation in the Solar Wind: Challenges and First Results. Sp. Weather, 15, 1490–1510, https://doi.org/10.1002/2017SW001681.

——, J. Witherington, H. Turner, M. Owens, and P. Riley, 2020: Improving solar wind forecasting using Data Assimilation. http://arxiv.org/abs/2012.06362.

Linker, J. A., and Coauthors, 1999: Magnetohydrodynamic modeling of the solar corona during Whole Sun Month. J. Geophys. Res. Sp. Phys., 104, 9809–9830, https://doi.org/10.1029/1998JA900159.

Owens, M., and Coauthors, 2020: A Computationally Efficient, Time-Dependent Model of the Solar Wind for Use as a Surrogate to Three-Dimensional Numerical Magnetohydrodynamic Simulations. Sol. Phys., 295, 43, https://doi.org/10.1007/s11207-020-01605-3.

Posted in Climate, data assimilation, Weather forecasting

Cold Winter Weather: Despite or Because of Global Warming?

By: Marlene Kretschmer

This year’s winter was cold. There was heavy snowfall across the UK, Europe and parts of the United States including Texas. This severe weather came with significant societal and economic impacts.

Every time cold extremes like this occur, one can almost predict the media headlines.  On the one hand, dubious media will use a regional cold snap to sow doubt about human-made global warming by deliberately misunderstanding the difference between weather and climate. In a similarly absurd manner, other newspapers will state that climate change was responsible for the cold snap. In between, there are debates among scientists about the role of climate change in causing cold extremes. This is where it gets complicated and, hence, interesting.

Climate change manifests itself in different ways. While the increase of CO2 in the atmosphere leads to warmer temperatures globally, there may be indirect mechanisms causing opposite effects regionally. In recent years, researchers have hypothesised that the melting of Arctic sea ice – a direct result of global warming – favours winter cold extremes in the Northern Hemisphere mid-latitudes. In particular, it has been suggested that the decline in Barents and Kara sea ice weakens the stratospheric polar vortex, a band of fast-blowing westerly winds circling the Arctic during winter at approximately 15–50 km altitude. Weak phases of the vortex are linked to cold winter weather in Eurasia and North America. In other words, it was proposed that climate change indirectly leads to colder weather. The polar vortex this year was extremely weak, and therefore likely the culprit behind the cold weather. But are Arctic changes also making these weak vortex phases more likely?

Figure 1: Schematic overview of the different plausible causal mechanisms making it difficult to quantify the influence of autumn Barents and Kara sea ice concentrations (BK-SIC) on the winter stratospheric polar vortex (SPV); sea level pressure over the Ural Mountains (Ural-SLP) and over the North Pacific (NP-SLP), lower-stratospheric poleward eddy heat flux (vT), North Pacific sea ice concentrations (NP-SIC) and El Niño–Southern Oscillation/Madden–Julian Oscillation (ENSO/MJO). The arrows represent assumed causal relationships. (Taken from Kretschmer et al, 2020)

The scientific debate regarding a causal role of Arctic sea ice loss is controversial (see e.g. Cohen et al. 2020, Screen et al. 2018). Scientists face a dilemma. In observational data, a statistically significant signal has been detected. But given the large natural variations in climate data and the different possible mechanisms, which are difficult to disentangle, it is hard to tell whether this signal reflects a causal influence (see also Fig. 1). This is further compounded by partly opposing results from climate model simulations. So far, all that can be said with confidence is that the question of whether the decline of Arctic sea ice is causing a weakening of the polar vortex cannot be answered conclusively.

But should we ignore the potential risk the decline of the Arctic holds for our future weather and climate, just because the current data do not allow a clear statement? The short answer is: No!

We explore this aspect in our latest study (see Kretschmer et al. 2020). In contrast to previous studies, which examined whether the decrease in sea ice causes a weakening of the polar vortex (and thus severe winter weather), we pose a different question. We ask: Assuming there is a causal influence of sea ice loss, what does this imply?

To address this question we use different climate model simulations of the next 100 years. All climate projections agree that sea ice will continue to melt as climate change progresses. This is a sad but unsurprising fact highlighting the need to evaluate possible consequences of a changing Arctic. Based on the model simulation data and using methods from causal inference, we further conclude that the causal effect of Arctic sea ice on the polar vortex is, if it exists, plausibly only very small. However, given that the decrease of sea ice will be huge, this small effect can have large implications. In fact, the climate models project a weakening of the polar vortex as long as the autumn sea ice in the Barents and Kara Sea melts. Whilst this is no definitive proof for a causal influence of sea ice loss, it is consistent with the initial hypothesis. Moreover, we find that once all sea ice is gone, the vortex strengthens again, suggesting there are other, poorly understood mechanisms by which global warming affects the polar vortex and thereby our weather in the mid-latitudes.

More generally, our study calls for more focus on understanding plausible climate-change related risks. Absolute statements about the regional effects of global warming are often not possible, given the complexity of the climate system and often contradictory climate predictions. This forces decision makers to act under large uncertainties. It is therefore necessary for climate scientists to evaluate different causal possibilities (such as an influence of the sea ice loss on the polar vortex) to gain a better understanding of regional climate risks. This also requires the use of different statistical tools and techniques – some of which we apply and discuss in our study.

The next time a cold snap hits Europe the same oversimplistic media headlines can be expected. Hopefully, however, the scientific debate will then have shifted towards a more conditional risk-based understanding of the plausible impacts of the changing Arctic.

References:

Cohen, J., Zhang, X., Francis, J. et al. Divergent consensuses on Arctic amplification influence on midlatitude severe winter weather. Nat. Clim. Chang. 10, 20–29 (2020). https://doi.org/10.1038/s41558-019-0662-y

Screen, J.A., Deser, C., Smith, D.M. et al. Consistency and discrepancy in the atmospheric response to Arctic sea-ice loss across climate models. Nature Geosci 11, 155–163 (2018). https://doi.org/10.1038/s41561-018-0059-y

Kretschmer, M., Zappa, G., and Shepherd, T. G., 2020: The role of Barents–Kara sea ice loss in projected polar vortex changes, Weather Clim. Dynam., 1, 715–730, https://doi.org/10.5194/wcd-1-715-2020.

 

Posted in Arctic, Climate, Climate change, Cryosphere, Polar