Childhood white Christmases: nostalgia or reality?

By Inna Polichtchouk

Nearly every Christmas, I travel back to Finland in the hope of celebrating Christmas Eve in well-below-freezing temperatures, surrounded by a plethora of snow. My childhood memory of this magical day begins with a cross-country skiing trip through a forest of frozen, sparkling trees, the low midday sun gently thawing the icicles that my freezing breath has formed on my eyelashes. The skiing trip is followed by a heart- and bone-warming sauna and a plunge into a soft snowdrift to even out the body temperature. Over the past decade, however, this nostalgic childhood memory of a white Christmas has begun to fade, replaced by a new one in which the skiing trip is a mere walk in the rain and the snowdrift a mud puddle. Is this childhood memory real? I decided to investigate.

Figure 1 shows snow depth at the Helsinki-Vantaa airport weather station, averaged over Christmas Eve and Christmas Day, from 1960 onwards. Having lived in and around the capital area, I chose this weather station to refresh and test my memory; other weather stations in and around Helsinki show similar measurements. The snowless Christmases are circled. To be consistent with the Finnish Meteorological Institute (FMI) definition, “snowless” is defined as a snow depth below 1 cm. It is clear that, of the 16 snowless Christmases shown, 50% have occurred since the turn of the millennium, after the first 15 years of my life. Before 2000, I lived through only three snowless Christmases. Since I moved out of Helsinki in 2005, a snowless Christmas appears to be more the norm than an anomaly. In particular, the 21st-century snowless Christmases have mostly been wet and warm, as seen in Figure 2. Given the data, I hence reserve the right to claim that my childhood was filled with white Christmases.
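For anyone who would like to repeat the check against their own memories, here is a minimal sketch of the Figure 1 diagnosis. It is an illustration rather than FMI code, and the file name and column names are assumptions.

```python
# A minimal sketch of the Figure 1 diagnosis (not FMI code); the file name
# and the column names "date" and "snow_depth_cm" are assumptions.
import pandas as pd

obs = pd.read_csv("helsinki_vantaa_snow.csv",
                  parse_dates=["date"], index_col="date")

# Snow depth averaged over 24-25 December of each year
xmas = obs[(obs.index.month == 12) & (obs.index.day.isin([24, 25]))]
depth = xmas["snow_depth_cm"].groupby(xmas.index.year).mean()

# FMI definition: "snowless" means snow depth below 1 cm
snowless = depth[depth < 1.0]
print(f"{len(snowless)} snowless Christmases, "
      f"{(snowless.index >= 2000).sum()} of them since 2000")
```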

2017 01 12 Inna Polichtchouk Figure 1

Figure 1. Snow depth (cm) at the Helsinki-Vantaa airport weather station, averaged over 24-25 December. Snowless Christmases are circled. “Snowless” is defined as having snow depth below 1 cm. Weather station data are available from the Finnish Meteorological Institute (FMI) and also at http://suja.kapsi.fi/fmi-suomi.php.

2017 01 12 Inna Polichtchouk Figure 2a

2017 01 12 Inna Polichtchouk Figure 2b

Figure 2. (upper) Daily accumulated precipitation (mm) and (lower) mean daily temperature (°C) at the Helsinki-Vantaa airport weather station on 24 December. Snowless Christmases are circled.

Does the lack of snow at Christmas in Helsinki reflect a lack of precipitation, or is it just a harsh reality of a warming climate? Figure 3 shows the December temperature and precipitation anomalies for all the years with a snowless Christmas. Indeed, it appears that the snowless Christmases are mainly due to anomalously warm December temperatures.

2017 01 12 Inna Polichtchouk Figure 3a

2017 01 12 Inna Polichtchouk Figure 3b

Figure 3. December anomalies (from the 1959-2015 mean for the Helsinki-Vantaa airport weather station) of (upper) precipitation (%) and (lower) mean daily temperature (°C). Only years with snowless Christmases are shown.

This trend towards snowless Christmases in Helsinki is, of course, likely a manifestation of internal variability. Nevertheless, I am seriously contemplating celebrating Christmas a month later. Since 1960, average daily January temperatures have remained below freezing, and even the smallest January precipitation total of 8.1 mm (in 1972) would produce at least 8 cm of snow cover* – enough to recreate that childhood memory of a white Christmas!

* An approximate rule of thumb is that 1 mm of rainfall produces 1 cm of snow at near zero temperature.

Posted in Climate

Geoengineering – how could we detect its cooling effect?

By Eunice Lo

Sulphate aerosol injection (SAI) is one of the geoengineering proposals that aim to reduce future surface temperature rise in case ambitious carbon dioxide mitigation targets cannot be met.  Climate model simulations suggest that by injecting 5 Tg of sulphur dioxide gas (SO2) into the stratosphere every year, global surface cooling would be observed within a few years of implementation.  However, temperature fluctuations also occur naturally in the climate system.  How could we detect the cooling signal of SAI amidst internal climate variability and temperature changes driven by other external forcings?

The answer to this is optimal fingerprinting (Allen and Stott, 2003), a technique which has been used extensively to detect and attribute climate warming to human activities.  Assuming a scenario (G4; Kravitz et al., 2011) in which 5 Tg yr-1 of SO2 is injected into the stratosphere from 2020 to 2070 on top of a mid-range warming scenario called RCP4.5, we first estimate the climate system’s internal variability and the temperature ‘fingerprints’ of the geoengineering aerosols and of greenhouse gases separately, and then compare observations to these fingerprints using total least squares regression.  Since there are no real-world observations of geoengineering, we cross-compare simulations from different climate models.  This gives us 44 comparisons in total, and for each of them we estimate the number of years that would be needed to detect the cooling signal of SAI robustly in global-mean near-surface air temperature.
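To make the regression step concrete, here is a minimal total-least-squares sketch on synthetic data. It is not the study’s code: real optimal fingerprinting additionally whitens both series with an estimate of internal variability and regresses several fingerprints (e.g. SAI and greenhouse gases) simultaneously.

```python
# A minimal total-least-squares (TLS) sketch on synthetic data, illustrating
# only the regression step of optimal fingerprinting. The trend and noise
# amplitudes are arbitrary stand-ins for model output.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(50)
signal = -0.03 * years                           # idealised SAI cooling (K)
x = signal + 0.1 * rng.standard_normal(50)       # model-derived fingerprint
y = signal + 0.1 * rng.standard_normal(50)       # pseudo-observations

# TLS allows for noise in both x and y: the fitted line is orthogonal to the
# right singular vector of [x, y] with the smallest singular value.
_, _, vt = np.linalg.svd(np.column_stack([x, y]), full_matrices=False)
v1, v2 = vt[-1]
beta = -v1 / v2
print(f"estimated scaling factor beta = {beta:.2f}")
```

In a detection study, the signal counts as detected once the confidence interval on a scaling factor like beta excludes zero.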

Figure 1 (upper) shows the distribution of the estimated time horizons over which the SAI cooling signal would be detected at the 10% significance level in these 44 comparisons.  In 29 of them, the cooling signal would be detected within the first 10 years of SAI implementation.  This means we would not only be able to separate the cooling effect of SAI from the climate system’s internal variability and from temperature changes driven by greenhouse gases, but we would be able to do so early in SAI deployment.

2017 01 04 Eunice Lo - dist_TfC1_nomul

2017 01 04 Eunice Lo - dist_BgC1_nomul

Figure 1: Distribution of the estimated detection horizons for the SAI fingerprint using (upper graph) the conventional bi-variate method and (lower graph) the non-stationary detection method.

The above results are tested by applying a variant of optimal fingerprinting to the same problem.  This new method assumes a non-stationary background climate that is mainly forced by greenhouse gases, and attempts to detect the cooling effect of SAI against this warming background using regression (Bürger and Cubasch, 2015).  Figure 1 (lower) shows the distribution of the detection horizons estimated with the new method in the same 44 comparisons: 35 comparisons would require 10 years or fewer for the cooling signal to be robustly detected.  This is a slight improvement over the results found with the conventional method, but the two distributions are very similar.

To conclude, in a future 5 Tg yr-1 SAI scenario we would be able to separate, and thus detect, the cooling signal of sulphate aerosol geoengineering from internal climate variability and greenhouse-gas-driven warming in global-mean temperature within 10 years of SAI deployment.  This could be achieved with either the conventional optimal fingerprinting method or the new, non-stationary detection method, provided that the climate data are adequately filtered.  Research on the effects of different data filtering techniques on geoengineering detectability is not included in this blog post; please refer to the article cited at the top for more details.

NOTE: How feasible is a 5 Tg yr-1 SAI scenario? Robock et al. (2009) estimated the cost of lofting 1 Tg yr-1 of SO2 into the stratosphere with existing aircraft at several billion US dollars per year. Even scaled to 5 Tg yr-1, this is still not a lot compared to the gross world product. There are practical issues to be addressed even if existing aircraft were to be used for SAI, but the deciding factor in whether to implement sulphate aerosol geoengineering should be its potential benefits and side effects, on both the climate system and society.

REFERENCES

Allen, M. R., and P. A. Stott, 2003. Estimating signal amplitudes in optimal fingerprinting, Part I: Theory. Climate Dynamics, 21(5-6), 477-491.

Bürger, G., and U. Cubasch, 2015. The detectability of climate engineering. Journal of Geophysical Research: Atmospheres, 120(22).

Kravitz, B., et al., 2011. The Geoengineering Model Intercomparison Project (GeoMIP). Atmospheric Science Letters, 12(2), 162-167.

Robock, A., et al., 2009. Benefits, risks, and costs of stratospheric geoengineering. Geophysical Research Letters, 36(19).

Posted in Aerosols, Climate, Climate change, Climate modelling, Geoengineering

Lakes from space

By Laura Carrea

For the first time, satellite technology has been used to make a census of global inland water cover: 117 million lakes, reservoirs and wetlands of area >0.002 km2 were found, summing to a total area of 5.0 × 10^6 km2, which corresponds to 3.7% of Earth’s non-glaciated land surface [1]. This was not merely an academic exercise, as inland water surface area is one of the factors that determine CO2 evasion from inland waters [2]. Increasingly, lakes are considered to play an important role in global biogeochemical cycling: they have been found to be an important source of atmospheric carbon dioxide and methane [2], [3], two important greenhouse gases, and also to be disproportionately important carbon sinks via carbon burial in lake sediments [4].

It is not only the number of lakes covering the Earth’s surface that is of global importance, but also their spatial extent. A digital map is needed to help distinguish one lake from another, and to distinguish lakes from other water bodies such as rivers and the sea. A combined effort within the European Climate Change Initiative (CCI) has produced a global water-body map [5] in the form of an ‘image’ (Figure 1) which specifies where there is water and to which water body each pixel belongs. This exercise again uses satellite data, in contrast to earlier comprehensive efforts that generated similar datasets, although not at the same level of detail, from a variety of existing maps, data and other information [6]. Efforts of this kind are limited by the spatial resolution of the satellite data and by the impossibility of systematically classifying all inland waters, given their sheer number.

2016 12 08 Laura Careea Fig 1 (817 x 708)

Figure 1. Extract of the global water-body map, for the area around Lake Winnipeg in Canada. Pixels belonging to the same lake share the same colour; white corresponds to ‘land’ and black to ‘other inland water’. Each of the other colours corresponds to a specific classified lake (source: [5]).
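The census logic itself is simple once such a map exists. The toy sketch below is my own illustration, not the CCI processing chain: the identifiers, the pixel size and the choice of 0 for land are all assumptions.

```python
# A toy version of the census idea: in a classified water-body map each
# pixel carries a water-body identifier (here 0 = land, an assumption of
# this sketch), so counting pixels per identifier gives lake areas.
import numpy as np

pixel_area_km2 = 0.09                  # e.g. 300 m x 300 m pixels (assumed)
water_map = np.array([[0, 1, 1, 0],
                      [0, 1, 0, 2],
                      [3, 0, 2, 2]])   # tiny stand-in for the global map

ids, counts = np.unique(water_map[water_map > 0], return_counts=True)
for lake_id, n_pix in zip(ids, counts):
    print(f"lake {lake_id}: {n_pix * pixel_area_km2:.2f} km2")
print(f"{ids.size} water bodies, total "
      f"{counts.sum() * pixel_area_km2:.2f} km2")
```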

The correct classification of an open water surface or a mixed vegetation area is key to the study of lake processes using satellite data. Satellite technology represents a powerful tool for assessing global lake characteristics that are important for studying other processes that occur in lakes.

Lake ecosystems and the biodiversity they support are important components of the global biosphere. But their stability is threatened by climate change and anthropogenic disturbances [7].

2016 12 08 Laura Careea Fig 2

Figure 2. Lake Tanganyika from Envisat (source: http://www.globolakes.ac.uk/), where the effects of warming have been studied and documented [15].

The University of Reading, together with other institutions in the UK, is attempting to measure and explain the responses of lakes to environmental drivers at a global scale with the aid of satellite technology, within the NERC-funded Globolakes project.

Lakes are fragile systems that are sensitive to many pressures, such as nutrient enrichment, climate change and hydrological modification, making them important ‘sentinels’ of environmental perturbation. According to the Globolakes experts, evidence suggests that climate change might increase the spread of harmful cyanobacterial blooms [8], [9], one of many possible adverse impacts of a changing climate. Many studies have shown that lake surface water temperature (LSWT) and the timing of spring phytoplankton blooms are related to meteorological signals. Because lakes integrate and amplify meteorological signals in both space and time, they act as useful sentinels of climate change [10].

LSWT is considered an important parameter reflecting stratification [11] and mixing, which are among the main physical processes occurring in lakes [7], [10]. Changes in temperature and, in turn, stratification influence the ecosystem directly (through differential population responses) and indirectly (via dynamic effects on nutrient distribution). Lake Tanganyika (Figure 2) is an example where warming has reduced the exchange rates between shallow and deep water (mixing), showing that warming has influenced the ecosystem [15].

However, current knowledge of global thermal lake behaviour is incomplete, and past studies have reported temperature trends from either in situ or satellite data alone. Recently, a major effort analysing global LSWT trends from both in situ and satellite data was published [13]; however, the results are for summer, and for lake centres only.

Regarding the availability of satellite data, the ARCLake project has generated accurate and consistent spatially resolved LSWT time series for more than 250 large lakes globally from 1991 to 2010 [12]. However, some of the small and shallow lakes, which may respond differently to climate change [16], were not included in the lake selection [13].

Within the Globolakes project, a set of 1000 lakes has been selected [14] to provide a collection of water bodies that spans a wide range of ecological settings and characteristics but is also suitable for remote sensing methods.

The University of Reading is contributing to the Globolakes project by generating accurate and consistent LSWT time series of high spatial resolution for the 1000 selected lakes.

REFERENCES

[1] Verpoorter C., Kutser T., Seekell D.A., Tranvik L.J. (2014) A global inventory of lakes based on high-resolution satellite imagery, Geophys. Res. Lett., 40, 517–521

[2] Raymond, P. A., et al. (2013) Global carbon dioxide emissions from inland waters, Nature, 503, 355–359

[3] Bastviken D., Tranvik L. J., Downing J. A., Crill P. M., Enrich-Prast A. (2011) Freshwater methane emissions offset the continental carbon sink, Science, 331, 50

[4] Dean W.E.W., Gorham E. (1998), Magnitude and significance of carbon burial in lakes, reservoirs, and peatlands, Geology, 26, 535–538

[5] Carrea L., Embury O., and Merchant C.J.  (2015) Datasets related to inland water for limnology and remote sensing applications: distance-to-land, distance-to-water, water-body identifier and lake-centre co-ordinates, Geoscience Data Journal, 2(2), 83-97

[6] Lehner B., Döll P. (2004) Development and validation of a global database of lakes, reservoirs and wetlands, J. Hydrol., 296, 1–22

[7] Schmid M., Hunziker S., Wüest A. (2014) Lake surface temperatures in a changing climate: a global sensitivity analysis, Climatic Change, 124, 310-315

[8] Paerl H.W., Huisman J. (2008) Blooms like it hot, Science, 320, 57-58

[9] Groetsch P.M.M., Simis S.G.H., Eleveld M.A., Peters S.W.M. (2014) Cyanobacterial bloom detection based on coherence between ferrybox observations, Journal of Marine Systems, 140, 50-58

[10] Adrian R., O’Reilly C.M., Zagarese H. et al (2009) Lakes as sentinels of climate change. Limnol Oceanogr 54, 2283–2297

[11] Woolway R.I., Maberly S.C., Jones I.D., Feuchtmayr H. (2014) A novel method for estimating the onset of thermal stratification in lakes from surface water measurements, Water Resources Research, 50, 5131-5140

[12] MacCallum, S. N., and C. J. Merchant (2012) Surface water temperature observations of large lakes by optimal estimation, Can J. Remote Sens., 38, 25–45

[13] O’Reilly C.M., et al (2015) Rapid and highly variable warming of lake surface waters around the globe, Geophys. Res. Lett., 42, 10773–10781

[14] Politi E., MacCallum S., et al. (2016) Selection of a network of large lakes and reservoirs suitable for global environmental change analysis using Earth Observation, International Journal of Remote Sensing, 37, 3042-3060

[15] Verburg P.,Hecky R.E., Kling H., (2003) Ecological consequences of a century of warming in Lake Tanganyika, Science 301, 505–507

[16] Winslow, L. A., Read J. S., Hansen G. J. A., and Hanson P. C. (2015) Small lakes show muted climate change signal in deepwater temperatures, Geophys. Res. Lett., 42, 355–361,

Posted in earth observation, Hydrology, land use, Remote sensing

How TAMSAT have been supporting African people for over 35 years

By Ross Maidment

The University of Reading’s TAMSAT group (www.tamsat.org.uk) have helped pioneer the use of satellite imagery for rainfall estimation across Africa since the group was first established in the early 1980s. Thanks to some bright and innovative minds back in the day, it was quickly realised that frequent thermal infrared satellite images covering the whole African continent (e.g. Figure 1) could readily be used to observe the cold tops of rain-bearing convective cloud systems and, in turn, to produce much-needed information on where rain has likely fallen and how much. Given the high dependence on rainfall of many socially and economically important activities across sub-Saharan Africa (chiefly agriculture), and the severe lack of raingauges across much of the continent, this satellite-based alternative proved an extremely useful resource for monitoring rainfall conditions and forewarning of impending water shocks, such as drought or flooding.

2016 12 01 Ross Maidment - Figure_1

Figure 1. Meteosat thermal infrared image from 29 November 2016, 1800 UTC (Source: www.eumetsat.int). White scenes denote cold cloud tops, while black/dark grey scenes denote the warmer land or sea surface.

More than three and a half decades on, the TAMSAT group still provide operational rainfall estimates (e.g. Figure 2) for all of Africa, using a simple yet effective rainfall estimation algorithm based on cold cloud duration (CCD) fields derived from the satellite imagery. Over the years, many African meteorological services and other agencies and organisations have depended on these data for a range of applications, such as flood and drought monitoring, famine early warning and weather index-based insurance schemes (e.g. Figure 3), helping millions of people to manage the highly variable rainfall climate that characterises much of the African continent.
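The idea behind a CCD-type estimate can be sketched in a few lines. The snippet below is schematic only: the brightness-temperature threshold and the calibration coefficients are illustrative stand-ins, not the operational TAMSAT values.

```python
# A schematic CCD-type rainfall estimate (not the operational TAMSAT
# algorithm); threshold and calibration numbers are purely illustrative.
import numpy as np

tb = np.array([210.0, 225.0, 205.0, 240.0, 215.0, 230.0])  # cloud-top K, hourly
threshold_k = 220.0   # colder than this is taken as raining convective cloud
a0, a1 = 0.0, 3.0     # local calibration: mm of rain per hour of cold cloud

ccd_hours = float(np.sum(tb < threshold_k))   # cold cloud duration
rainfall_mm = max(a0 + a1 * ccd_hours, 0.0)
print(f"CCD = {ccd_hours:.0f} h -> estimated rainfall = {rainfall_mm:.1f} mm")
```

In practice the threshold and coefficients are calibrated against raingauge data region by region, which is where the long, consistent archive pays off.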

2016 12 01 Ross Maidment - Figure_2

Figure 2. The TAMSAT seasonal rainfall anomaly for September-November 2015 with respect to the 1983-2012 climatology (Source: www.tamsat.org.uk). The seasonal anomaly, derived from 10-day total rainfall estimates, shows the impact of the 2015 El Niño event on African rainfall (namely, a wetter East Africa and drier Southern Africa).

2016 12 01 Ross Maidment - Figure_3 (752 x 450)

Figure 3. A weather index-based insurance workshop in Zambia, where the use of TAMSAT data in insurance products is being discussed (Photo credit: Agrotosh Mookerjee).

In an era of ever more satellite-derived rainfall products, many based on new and highly sophisticated sensors, it would be reasonable to assume that the TAMSAT data, being based on a relatively simple method, may no longer be as useful. However, the strength of the TAMSAT data lies in its longevity and consistent estimation algorithm. For many applications, a short rainfall time series makes it very difficult to assess the severity of unexpected changes in rainfall, because the average or climatological conditions are not well known. The long time series of the TAMSAT archive (since 1983), together with an operational system that produces new estimates on a regular basis, makes the TAMSAT rainfall dataset a highly valuable resource for both climate-based operational activities (e.g. Black et al., 2016) and climate research (e.g. Maidment et al., 2015).

So what next for TAMSAT? Over the last 12 months, the TAMSAT group have invested huge effort in overhauling their estimation algorithm, and in doing so have minimised several of the characteristic problems of the previous algorithm. In addition, with the help of collaborators at the International Research Institute at Columbia University, the group have developed novel techniques to provide rainfall estimates with uncertainty information (which, surprisingly, many rainfall datasets do not issue) and to merge auxiliary information (such as raingauge measurements) with the satellite data, improving the estimation of rainfall intensities over short time periods. These products are planned to run operationally alongside TAMSAT’s primary rainfall product during 2017, and within a rainfall monitoring platform currently deployed in several countries across West and East Africa.

The activities described here, amongst others, ensure that TAMSAT deliver both skilful and reliable rainfall products, much needed in a region of the world that is deprived of adequate rainfall information and, at the same time, challenged by marked climate variability and by human-induced climate change, which is expected to exacerbate current conditions in the near future.

References

Black, E., E. Tarnavsky, R. Maidment, H. Greatrex, A. Mookerjee, T. Quaife, and M. Brown, 2016: The Use of Remotely Sensed Rainfall for Managing Drought Risk: A Case Study of Weather Index Insurance in Zambia. Remote Sens., 8, 342. http://www.mdpi.com/2072-4292/8/4/342/htm

Maidment, R. I., R. P. Allan, and E. Black, 2015: Recent observed and simulated changes in precipitation over Africa. Geophys. Res. Lett., 42, 2015GL065765. http://dx.doi.org/10.1002/2015GL065765

Posted in Africa, Climate, drought, earth observation, Remote sensing

Flying through the Indian monsoon

By Andy Turner

Forecasting the monsoon in India continues to be a challenge for scientists, both for the season ahead and long into the future. The monsoon provides 80% of the country’s annual rainfall and secures the food supply for more than a billion people.

For years, scientists have studied climate models from all over the world to understand why they do not represent the monsoon well. But new observations are needed to help understand the processes involved, including how the land surface affects the timing of the onset and cessation of the rains, and how the deep convective clouds of the tropics work together with the monsoon winds that cover a much larger region.

2016 11 25 Andy Turner - monsoon_seasons (329 x 792)

Figure 1. The summer months feature much stronger rains than winter over India (coloured blue over the land) as winds blow onshore from the Indian Ocean. Also shown are sea-surface temperatures. Notice the heavy rainfall on the west coast, caused by the influence of the Western Ghats mountains.

So this year, with the support of funding from the Natural Environment Research Council (NERC) in the UK and India’s Ministry of Earth Sciences, that is what we started with our INCOMPASS project.

The University of Reading and the Indian Institute of Science in Bengaluru (Bangalore) led a team from more than 10 universities, research institutes and operational forecasting centres to observe the monsoon in India from the air and on the ground.

After years of planning, this May we flew the NERC-owned Atmospheric Research Aircraft, a BAe-146 four-engine jet operated by the NCAS Facility for Airborne Atmospheric Measurement (FAAM), to India as part of one of the largest observational campaigns that NERC has ever funded.

We based ourselves in two locations: the northern city of Lucknow, central to some of India’s most fertile land in the basin of the river Ganges, and Bengaluru in the southern peninsula, midway between the heavy rains of the Western Ghats mountains and the drier land of the east coast. The aircraft returned to the UK in mid-July.

In the north we captured the monsoon onset, taking measurements as the monsoon progressed from east to west over the plains. How do the transitions between wet and dry soils affect the development of monsoon storms? How do the temperature and humidity change as we approach the Thar Desert in north-western India? By analysing our results over the next few years, we’ll find out.

In the south we gave more attention to the heavy rainfall over the Western Ghats mountains. How does this rainfall change during the day? What does the atmospheric boundary layer look like to the west of the mountains, over the Arabian Sea from where the moisture originates? Just as in the north, and as explained in our Planet Earth article, several of our flights took place only a few hundred feet above the ground, giving a spectacular, if bumpy, view of the landscape.

A research aircraft has the advantage of being able to cover large distances and measure large weather systems from different angles. But fuel and aircraft support are expensive, and require a huge commitment from engineers, flight operations staff and instrument scientists at FAAM and the Met Office. To accompany the aircraft measurements, the INCOMPASS project also put in place a series of flux towers, installed by colleagues at the Centre for Ecology & Hydrology (CEH), at locations across India (Figure 2).

2016 11 25 Andy Turner - kanpur_flux_tower

Figure 2. Towers like this one installed at IIT Kanpur by INCOMPASS colleagues at CEH will offer long-term measurements of fluxes of heat and moisture passed from the surface to the atmosphere. Photo (c) Andy Turner 2016.

Flux towers are vital for measuring the turbulent fluxes of heat and moisture passing from the surface to the atmosphere. The information collected in 2016, and hopefully for years to come, will help develop and improve the JULES land model, a vital component of the weather and climate models used by the Met Office. INCOMPASS also gave us the opportunity to use the Department of Meteorology’s radiosonde equipment. We based this at IIT Kanpur in northern India, close to our Lucknow airport base, and launched more than 100 balloons during the monsoon rains of July.

2016 11 25 Andy Turner - balloon_launch (1068 x 712)

Figure 3. Scientists in India prepare to launch a radiosonde (weather balloon). Photo (c) Andy Turner 2016.

But what happens next? The team of scientists and students will spend the next few years analysing the vast datasets collected. With the Met Office and the NCAS Computational Modelling Services, we will run model experiments at a variety of high resolutions to compare with our data for the 2016 monsoon. During the field campaign we ran forecasts for India at 4 km resolution, but in the work to come we hope to perform tests below the kilometre scale, to see how well we can re-forecast some of the storms within the monsoon.

Posted in Climate, Climate modelling, Monsoons, Numerical modelling

From kilobytes to petabytes for global change research: take the skills survey!

By Vicky Lucas
Institute for Environmental Analytics

If you deal with megabytes of environmental sample data, gigabytes of sensor data, terabytes of model data or petabytes of remote sensing data, then I’d like you to take a survey.  If you create, look after, analyse, publish on or manage datasets for global change, then I’d like to find out the necessary and emerging skills you need.

Global change research and development monitor, analyse and simulate every aspect of environmental change, from climate to biodiversity to geochemistry to human attitudes and actions on the world.  For global change research to flourish, a range of skills is necessary, increasingly so in the broad area of data-intensive digital skills and interdisciplinary work.

2016 11 18 Vicky Lucas Fig 1

Through this survey I would like to find out about the existing training programmes and opportunities that you use and value, as well as the skills that are essential in your everyday work or that would make you more efficient or more effective.  Go to the survey (closes 22 November).

I work for the Belmont Forum, a group of the world’s major and emerging funders, from the National Science Foundation in the USA, to FAPESP in Brazil, to the Japan Science and Technology Agency, and everywhere in between.  The Belmont Forum aims to accelerate the delivery of research.  As part of its e-Infrastructure and Data Management project, I focus on capacity building: improving workforce skills and knowledge to enable global change research to thrive.

If you, from anywhere in the world, have wrestled a spreadsheet, frowned at R or Python, filled your hard disk or delighted as you kicked off a month-long model run, then I’d really appreciate 10 minutes of your time to generate a few kilobytes of survey data of my own.

Vicky Lucas
Human Dimensions Champion, Belmont Forum e-Infrastructures and Data Management Project

Background on the Belmont Forum

2016 11 18 Vicky Lucas Fig 2.jpg

The Belmont Forum is a group of the world’s major and emerging funders of global environmental change research. It aims to accelerate delivery of the environmental research needed to remove critical barriers to sustainability by aligning and mobilizing international resources. It pursues the goals set in the Belmont Challenge by adding value to existing national investments and supporting international partnerships in interdisciplinary and trans-disciplinary scientific endeavours. You can also read about the full Belmont Challenge.

Belmont Forum Data Skills and Training Survey:
https://docs.google.com/forms/d/e/1FAIpQLScOKsodT4OF5lMjDpL_sq_FrNDVeK4TB7AYrlMZxGPG__8Thw/viewform

Posted in Academia, Climate, Climate change, Climate modelling, Numerical modelling, Remote sensing, Teaching & Learning

When meteorology altered the course of history (or maybe not)

By Bob Plant

The Battle of Milvian Bridge was fought on 28 October in the year 312 CE. The atmospheric conditions there on that day may have had a critical influence on the course of human history ever since. It’s a defensible opinion. Or they may not have been all that important: that’s a defensible opinion too. On the other hand, perhaps nothing in the least interesting happened, at least nothing of a meteorological nature. Again, that’s entirely plausible. This is a very longstanding and very much ongoing controversy. I’ll try to explain it …

2016 11 10 Bob Plant - Figure 1

Figure 1. A painting of the Battle of Milvian Bridge by Giulio Romano

By the third century, the Roman empire had become difficult to control, with civil war becoming increasingly common and internal conflicts increasingly destructive. The emperor Diocletian had tried to stabilize matters by formally dividing the running of the Eastern and Western halves of the empire, each with its own emperor and junior emperor. This worked fairly well: the frontiers were strengthened and the tax system better organized. However, Diocletian’s abdication due to illness in 305 precipitated yet another succession struggle and civil war.

The ruthless chancer who eventually emerged victorious from this particular mess was the emperor Constantine, who began his bid in 306 in York and completed it by 324. Along the way, the battle of the Milvian Bridge in 312 was fought on the outskirts of Rome against the army of Maxentius, himself a recent usurper but supposedly recognized by Constantine as his superior and the emperor of the Western half of the empire. We’ll come back to the battle in a moment, but first we should emphasize why Constantine’s victory matters. He went on to found the city of Constantinople (modern Istanbul) and shifted the imperial capital there. This was important in cementing the division into eastern and western empires which did so much to shape the subsequent history of Europe and the Near East. But even more far-reaching was that he instigated Christianity as the state religion. This proceeded piecemeal, starting with the relaxation and removal of the persecutions and proscriptions of the latter part of Diocletian’s time (e.g. the Edict of Milan in 313), but ultimately establishing distinct legal and political advantages for Christians. The consequences of those changes have been enormous.

It’s not clear whether Constantine’s actions on religion and the state were motivated by his own political calculations, by his genuine religious convictions, or by some scrambled mixture of the two. Take a guess anywhere along that spectrum and it would not take long to find reputable historians making strongly expressed arguments in support of it. Nonetheless, he was baptized on his deathbed, and consistently professed to be a believer well before that, so it is safe to assume that he was at least a partial convert. An important but deeply controversial question for historians is when and how that conversion happened. And that brings us back to Milvian Bridge.

On the night of the battle Constantine apparently experienced a miraculous dream and around noon on the day itself a miraculous vision. A dream is somewhat difficult to verify or falsify of course, and not really our interest here. The vision was of “a cross of light in the heavens, above the sun, and bearing the inscription, CONQUER BY THIS”. The vision has been considered by many as being pivotal in his conversion, and in helping him to inspire the troops to victory on the day. What are we to make of this? 

We should note where this account actually comes from. It appears in the writings of Eusebius around two decades later, and apparently his source is that he was told so “long afterwards” by none other than the emperor himself. Now, Eusebius is not exactly considered the most reliable of writers on various matters for various reasons, and it is not difficult to find reasons to be cautious. The event is conspicuously absent in an account by Lactantius, for example, despite the fact that there would have been potentially thousands of witnesses to it associated with a full-scale battle on the edge of a major city. On the other hand, why invent something if there are potentially thousands of witnesses who might contradict it? Indeed many historians since, however credulous about miracles, have accepted that there may just be something in it: i.e., supposing that there may have been some natural atmospheric phenomenon, just possibly viewed with, shall we say … a little licence. This line of argument goes back several hundred years itself and is far from settled. To give a flavour of these sober and dispassionate scholarly debates, here is a quote from a very lengthy footnote in Potter’s (2013) biography of the emperor, “For further support of Weiss’s view [claiming a solar halo] see Barnes (2011), though I should note that refusal to accept Weiss’s view does not necessarily indicate an attachment to the Nazi party as is implied in his discussion.”

The meteorological explanation usually put forward in modern historical articles is that it may have been a “sun dog”. Figure 2 is a typical picture put forward to support the notion, taken from the Wikipedia page about the battle:

2016 11 10 Bob Plant - Figure 2

Figure 2. Two ‘sun dogs’ (parhelia). Source: Wikipedia.

Perhaps it does look like a plausible explanation, allowing for, shall we say … a little licence. But let’s be a little more careful. Searching online for pictures of sun dogs turns up many beautiful photographs, but relatively few that resemble the description.

What is a sun dog anyway? A highly recommended guide to atmospheric optical phenomena is provided by the atoptics.co.uk website. A sun dog, or parhelion, is rather common and occurs when light is refracted by ice crystals in the shape of hexagonal plates that are suitably oriented, with the hexagonal faces aligned close to the horizontal. For a more vertically extended display, or a “tall sundog”, it helps if the plates are not quite horizontal but wobble slightly as they fall. Too much wobble, however, more than a degree or so, and the halo is lost. Less common, but a little more like the description, is a sun pillar, which requires the ice crystals to be consistently somewhat tilted and the sun to be at a low angle. A rare event would combine both upper and lower pillars with a substantial horizontally extended halo, such as the parhelic circle arising from reflections off the vertical faces of the crystals – creating something which can indeed look like a cross. This is not easily achieved, however, and it may be worth adding that Rome around noon in late October is a most unlikely time and place to be able to catch it. To get a sense of just how delicate the conditions would need to be, there is a fun halo simulator available from atoptics that you can play around with, to see if you can manage to generate something like the image.
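As a rough check on the geometry (a standard optics calculation, not an analysis of the 312 CE event itself): alternate side faces of a hexagonal plate crystal form a 60° prism, and a parhelion appears near the prism’s angle of minimum deviation,

D_min = 2 arcsin( n sin(A/2) ) − A ≈ 22°,

taking apex angle A = 60° and ice refractive index n ≈ 1.31. This is why sun dogs sit roughly 22° either side of the sun, drifting slightly further out as the sun climbs higher.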

There are entire books and countless articles on this event from historians. And for that matter there are entire books on atmospheric optics. If you really want to develop an informed opinion, you have a lot of reading to do! I’ve simply tried to give a short introduction as a non-expert for non-experts. But I thought it may be interesting for the meteorologically-minded to know something of how and why the possible appearance of an atmospheric optical phenomenon has been such a hotly-debated question.

Posted in Atmospheric optics

The value of future observations

By Alison Fowler

The atmosphere and oceans are routinely observed by a myriad of instruments, positioned on board orbiting satellites, aircraft and ships, at surface weather stations, and even on balloons.  The information collected by these instruments can be used to ensure that modelled weather forecasts adhere to reality, through a process known as data assimilation.

2016 11 03 Alison Fowler blog Fig 1

Figure 1:  Data coverage of the AMSU-A instrument, on board 6 different satellites, within a 6 hour time window (copyright ECMWF)

For the observations to be useful it is necessary that:

  • The observations can be compared to the forecast variables (e.g. temperature, humidity and wind)
  • We know the uncertainty in those observations
  • We know the uncertainty in the weather forecast model itself (so we know how much to trust the forecast vs how much to trust the observations)

These fundamentals of data assimilation are continually evolving, as weather models become more sophisticated and address new societal needs, as new instruments are developed, and as computational resources and mathematical techniques advance.

These different aspects of data assimilation were addressed at the fifth annual international symposium on data assimilation, held at the University of Reading during a very hot week in July 2016. The symposium brought together 200 scientists from 15 countries across four continents, and received sponsorship from NCEO, the Met Office and ECMWF.

2016 11 03 Alison Fowler blog Fig 2 (800 x 533)

Figure 2: Participants of the Fifth annual international symposium on data assimilation (photograph copyright (C) Stephen Burt).

The symposium comprised 10 sessions, one of which focused on the particular problem of assessing the value of observations. This is important not only for evaluating which (of the very many) observations are most important for an accurate weather forecast, but also for designing instruments able to further reduce the uncertainty in the forecast. The latter problem is particularly difficult because of the fast pace at which data assimilation systems change: by the time an instrument is operational (possibly a few decades from now), its value may be very different than if the data could be assimilated today.

There are many possible metrics for assessing the value of observations. Some are based on how sensitive the forecast skill is to the observations; others try to quantify the amount of information the observations provide for reducing the uncertainty in our knowledge of the current state of the atmosphere. Computing these metrics before the instrument is built and the data are available relies on accurate estimates of the error characteristics of the instrument and of its relationship to the model variables, and is hence very challenging.
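To give a flavour of the simplest such metric, consider a single direct observation assimilated with a Kalman-type update. The scalar sketch below is my own illustration, with an assumed observation error, not material from the symposium.

```python
# A scalar toy example of one value-of-observation metric: the uncertainty
# reduction from assimilating a single direct observation (H = 1) with a
# Kalman-type update. The error variances are assumptions.
import numpy as np

var_b = 1.0**2   # background (forecast) error variance
var_o = 0.5**2   # assumed error variance of the future instrument

gain = var_b / (var_b + var_o)    # Kalman gain for a direct observation
var_a = (1.0 - gain) * var_b      # analysis (post-assimilation) variance

print(f"variance reduced by {100 * (1 - var_a / var_b):.0f}%")
print(f"information content = {0.5 * np.log2(var_b / var_a):.2f} bits")
```

Even this toy case shows why the estimates are fragile: the metric depends entirely on the assumed instrument error and on how the observation maps onto the model state, both of which are uncertain before launch.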

It is clearly difficult to describe the value of a future observation unequivocally by a single figure. Instead we need to provide insight, through ongoing research, into how the value of observations is sensitive to changes in the ever-evolving data assimilation system. There will be much to discuss at the next symposium!

Posted in Climate, earth observation, Measurements and instrumentation, Numerical modelling, Remote sensing

How can a hurricane near the USA affect the weather in Europe?

By John Methven

It may seem bizarre that processes occurring within clouds near the USA, involving tiny ice crystals and water droplets, can have an influence on high-impact weather events thousands of kilometres away in Europe, and our ability to predict them days in advance. However, this is the fundamental nature of the atmosphere as a chaotic dynamical system. Information is transferred from one region to another in the atmosphere through wave propagation and transport of properties within the air, such as water vapour. Weather systems developing over the North Atlantic and hitting Europe are intimately related to large-amplitude meanders of the jet stream, known as Rossby waves. Characteristic weather patterns grow in concert with the waves, and the jet stream acts as a wave guide, determining the focus of the wave activity at tropopause-level (about 10 km altitude). Rossby wave energy transfers downstream rapidly, amplifying the meanders and the weather events associated with them.
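To make the “downstream transfer” concrete, it is worth quoting the textbook barotropic result (standard theory, not a NAWDEX finding). For a zonal flow U on a beta-plane, a Rossby wave with zonal and meridional wavenumbers k and l has frequency

omega = U k − beta k / (k^2 + l^2),

so the crests move at the phase speed c_x = U − beta / (k^2 + l^2), while the energy moves at the group velocity c_gx = U + beta (k^2 − l^2) / (k^2 + l^2)^2. Because c_gx exceeds c_x, wave energy outruns the individual troughs and ridges, allowing new weather systems to amplify downstream of existing ones.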

Perhaps an even greater stretch for the imagination is the idea that you could direct research aircraft and weather balloons into the jet stream across the Atlantic to understand how a tropical cyclone near the USA can influence a wind storm in Scotland and then flooding in Norway. Nevertheless, this is what was attempted last month in the North Atlantic Waveguide and Downstream Impacts Experiment (NAWDEX). The experiment involved five research aircraft equipped with lidar, radar and dropsondes (measurement devices that fall on small parachutes from the aircraft) for measuring high-resolution cross-sections of winds, temperature and humidity. The aircraft furthest upstream was the NASA Global Hawk UAV (flying from the USA as part of the NOAA SHOUT programme). The German HALO (cloud radar, water vapour lidar and dropsondes) and DLR Falcon (two wind lidars) aircraft were based in Iceland for the whole campaign month (16 September to 16 October this year), as was the French SAFIRE Falcon (cloud radar and lidar). The UK FAAM BAe 146 aircraft (cloud microphysics and dropsondes) joined from East Midlands airport. In addition, more than 300 weather balloons were launched from ground sites spanning from Canada in the west to Norway and Italy in the east, Svalbard in the north and the Azores in the south. Even commercial ships crossing the mid-Atlantic launched balloons for NAWDEX. A scientific experiment on this scale cannot be conducted by one nation. Contributing countries included Germany, France, the UK, Switzerland, the USA, Canada, Iceland and Norway, as well as the met services of the countries that launched weather balloons as part of the EU-funded mechanism EUMETNET (UK, Denmark, Norway, France, Portugal and Italy). International cooperation is achieved through a common purpose and determination, and also with coordination through a working group of the WMO World Weather Research Programme.

2016 10 21 John Methven - BBC _91444006_mediaitem91444005

On board one of the NAWDEX research flights: courtesy BBC News (www.bbc.co.uk/news/science-environment-37508156)

A golden opportunity emerged during the second week of the NAWDEX campaign, as tropical storm Karl moved northwards from the Bahamas and was forecast to interact with the jet stream, with highly uncertain outcomes in terms of high-impact weather for Europe 5-6 days later. The Global Hawk was first on the scene, with comprehensive coverage of dropsondes on the night of 22/23 September.

Tropical storms move slowly, and Karl was sampled again off the east coast of the USA on 24/25 September. Then came a dramatic change as Karl interacted with the jet stream on the 26th, undergoing a process called “extratropical transition”, during which the cyclone also intensified. The German HALO aircraft was able to reach the centre of the storm during this critical stage from its base in Iceland.

Following transition, the jet stream on the southern flank of cyclone Karl became much stronger, and the whole system was stretched out and advanced very rapidly towards the north of Scotland, where it was intercepted by both the UK FAAM aircraft and the German DLR Falcon aircraft, which came together above Torshavn (Faroe Islands) from their bases in East Midlands and Iceland.

Above the Scottish north coast the jet maximum was observed to be 89 m/s (200 mph), which is unusually strong for the time of year and was associated with severe winds at the surface across northern Scotland. I was lucky enough to be on the flight. So was the BBC science correspondent David Shukman, who reported on his experience on BBC TV and the BBC website. The jet streak (a locally intense section of the jet stream) moved into Norway and was followed by two days of persistent heavy rainfall and flooding, as a moist air stream from the mid-Atlantic was drawn northwards to meet the jet stream on the Norwegian coast.

What do we hope to learn from the sequence of research flights?
We will focus on detailed measurements of cloud physical properties and their relation to the structure of the winds and temperature in the vicinity of the jet stream and its evolution over days to weeks.

Recent research has shown that forecast ‘busts’ (where skill is much lower than usual) for Europe share a common precursor 5-6 days beforehand: a distinct Rossby wave pattern with a more prominent ridge (northwards displacement of the jet stream) across the eastern USA. The reasons for these forecast busts are not known, but it is hypothesised that the representation of diabatic (cloud and radiative heating) processes over the USA and Atlantic lowers the predictability in this situation. Diabatic processes create shallow temperature structures either side of the tropopause, tending to enhance tropopause sharpness and the jet stream wind maximum. Recent theory indicates that tropopause sharpness can have a far-reaching influence on Rossby wave propagation and thereby on downstream forecast error. However, sharpness is not well represented in either models or satellite data, owing to poor vertical resolution in the tropopause region. We already know from a first look at the flight data that the tropopause was sharper than represented in the forecasts, but much more scientific investigation is needed to understand why.

The same physical processes that are poorly represented in weather forecast models also constitute a major source of uncertainty in climate model projections and make the prediction of changes in regional precipitation and wind patterns in response to global warming very uncertain. Among them are cloud microphysics, cloud radiative feedbacks, and turbulent boundary layer dynamics, which are parameterized in both weather and climate models. NAWDEX will help us to improve our representation of these processes by furthering our understanding of how the physical processes influence synoptic-scale dynamics, thereby affecting not only mesoscale sensible weather but also large-scale weather regimes. Only once the physical processes are represented well in models, can the excitation and maintenance of large-scale patterns on seasonal timescales by global teleconnections, or downscaling of climate information for the Atlantic/European sector, be tackled with confidence through numerical simulation.

If the idea of taking a research aircraft into the jet stream to discover more about the atmosphere excites you, then you should consider a career in atmospheric physics. There is an ever-expanding range of employment opportunities for those with specific environmental physics skills, as well as more general openings for graduates with the problem-solving and analytical capability that training in physics and mathematics brings. Our Department offers undergraduate degrees in Environmental Physics and Meteorology. If you have already done a first degree (perhaps in Physics, Mathematics or another physical science) you could consider the MSc in Atmosphere, Oceans and Climate, or entering directly onto the PhD programme.

Posted in Environmental physics, Measurements and instrumentation, Numerical modelling, University of Reading, Weather forecasting

Where is the high probability?

By Peter Jan van Leeuwen

To determine the uncertainty in weather and climate forecasts, an ensemble of model runs is used (see Figure 1). Each model run represents a plausible scenario of what the atmosphere or the climate system will do, and each of these scenarios is considered equally likely. The underlying idea is that the model runs describe, in an approximate way, the probability of each possible weather or climate event. So if several model runs in the ensemble cluster together, meaning they forecast a similar weather event, the probability of that weather event is larger than that of a weather event with only one model run nearby, or none at all.

2016 10 21 Peter Jan van Leeuwen Fig 1

Figure 1. An example of an ensemble of weather forecasts, from the Met Office webpage 

Mathematically, we say that each model run represents a probability mass, and the total probability mass of all model runs is equal to 1. The probability mass is the probability density times the volume, where the volume is measured in temperature, velocity, etc. over the set of events we are interested in. The probability density tells us how the probability is distributed over the events. For instance, the probability that the temperature in Reading is between T1 = 14.9 and T2 = 15.1 degrees is approximately

Prob(T1 < T < T2) ≈ p(T = 15) × (T2 − T1)

… where (T2 − T1) is the volume and p(T = 15) is the probability density.

There is an infinite number of possible weather events, each with a well-defined probability density, but we can only afford a small number of model runs, typically 10 to 100. So the question becomes: how should we distribute the model runs to best describe the probability of all these events? Or, in other words, where are the areas of high probability mass?

You might think that we should put at least one model run at the most likely weather event. But, interestingly, that is incorrect. Although the probability density there is highest, the probability mass is small because the volume is small, much smaller than in other parts of the domain with different weather events. How does that work? It all has to do with the high dimension of the system. By that I mean that the weather consists of the temperature in Reading, in London, in Paris, in, well, all the places of the world that are in the model. And not only is temperature important in all these places, but also humidity, rain rate, wind strength and direction, clouds, etc., again at all places in the model. So the number of variables is enormous, typically 1,000 million! And that does strange things to probability.

So where should we look for the high probability mass? The further you move away from the most likely state, the smaller the probability density becomes. But, on the other hand, the volume grows, and it grows quite rapidly. It turns out that an optimum is reached, and the area of maximal probability mass is found at a distance from the most likely state that grows with the number of places the model contains (Reading, London, etc.) times the number of model variables (temperature T, velocity u, etc.) at each of these places. This means that the optimal positions for the model runs are in that volume, far away from the most likely state, as illustrated in Figure 2.
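This concentration of probability mass is easy to check numerically. The sketch below is my own illustration, using a standard Gaussian as the simplest stand-in for a forecast distribution: the typical distance from the peak grows with the number of variables, while the shell containing the mass stays thin.

```python
# A quick numerical check: in a standard Gaussian, as the number of
# variables n grows, nearly all probability mass sits in a thin shell
# far from the most likely point (the origin).
import numpy as np

rng = np.random.default_rng(1)
for n in (1, 10, 100, 10000):
    r = np.linalg.norm(rng.standard_normal((500, n)), axis=1)
    print(f"n = {n:>5}: typical distance from peak = {r.mean():6.1f} "
          f"(~sqrt(n) = {np.sqrt(n):6.1f}), shell width = {r.std():.2f}")
```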

2016 10 21 Peter Jan van Leeuwen Fig 2

Figure 2. The blue curve shows the probability density of each weather event, with the most likely event having the largest value of the blue curve. Note that that curve decreases very rapidly with distance from this peak. The green curve denotes the volume of events that have the same distance from the most likely weather event. That curve grows very rapidly with distance from that most likely event. The red curve is the product of the two, and shows that the probability mass is not at the most likely weather event, but some distance away from that. For real weather models that distance can be huge …

So we find that the model runs should be very far away from the most likely weather event, and within a rather narrow range, it turns out. But what does that mean, and how do we get the models there? Very far away means that the total distance from the most likely weather event, measured over all variables and all places that are in the model, is large. Looking at each variable individually, for example the temperature in Reading, the distance is rather small. So, as shown in Figure 1, the differences are not that large if we concentrate on only a single variable. And how do we get the model runs there? Well, if the initial spread in the model runs is chosen to represent well the uncertainty in our present estimate of the weather, the model, if any good, should do the rest.

So you might ask what all this fuss is about. The answer is that getting the models to start in the right place, so as to obtain the correct uncertainty, is not easy. For that we need data assimilation, the science of combining models and observations. And for data assimilation these distances are crucial. But that will be another blog …

Posted in Climate, Numerical modelling, Weather forecasting