THE BRAVE PROJECT – Annual meeting, January 2017, Ghana

By Galine Yanon – Walker Institute

The overall objective of the BRAVE project is to quantify the impacts of climatic variability and change on groundwater supplies from low-storage aquifers in Africa.

More than 40 institutions from Burkina Faso, Ghana and the UK attended the BRAVE Project annual general meeting, held in Accra, Ghana, 24-26 January 2017. These institutions are direct and indirect partners of the Project. The meeting commenced with discussions around the work packages, WP1 to WP5.

After the opening session and presentation of the agenda, Professor Rosalind Cornforth, PI of the project and Director of the Walker Institute, presented an overview of the UPGro consortium (composed of five projects) and the BRAVE Project, to give participants a better understanding of the ongoing work. This overview was particularly important because some partners were not directly engaged in the different activities of the project.

One of the objectives of BRAVE is to build the capacity of early-career researchers and to ensure that the project benefits the communities engaged through in-country partners (CARE International in Ghana, Christian Aid and Reseau Marp in Burkina Faso). So, after the overview, discussions focused on the different work packages of the Project:

  • WP1, presented by Dr Henny Osbahr: understanding vulnerability in the communities;
  • WP2, presented by Dr Galine Yanon: understanding decision-making pathways, governance structures and institutional influence;
  • WP3 and WP4, presented by David Macdonald: improving our understanding of the hydroclimate, and strategic planning and adaptive capacity; and
  • WP5, presented by Professor Rosalind Cornforth: delivering evidence and demonstrating resilience.

The Program Coordination Group (PCG) across all projects of the Consortium was also presented to participants.

Further information on later stages in the meeting and its outcomes can be found in Galine’s reports on the Walker Institute website.

Posted in Africa, Conferences, Hydrology, Teaching & Learning

Measuring radiation with aircraft

By Peter Hill

In my career as an atmospheric scientist I’ve relied on observational data from a wide range of sources, including satellite imagery, surface measurements, ground-based and satellite-based radar, and aircraft measurements. Last July I had my first opportunity to contribute to the available data when I took part in the aircraft field campaign for the EU-funded DACCIWA (Dynamics-Aerosols-Chemistry-Cloud Interaction in West Africa) project.

The DACCIWA project is investigating pollution in southern West Africa (SWA) and how this affects health and the regional climate. The region is very reliant on agriculture, which is highly sensitive to the amount of rainfall. Any changes in rainfall due to pollution may have important implications for the world’s supply of cocoa, not to mention the livelihoods of millions of people in SWA.

My role in DACCIWA is focused on atmospheric radiation: how sunlight and thermal radiation interact with the atmosphere over SWA. Radiation is important because it is a key component of the atmospheric energy budget (see Figure 1). Consequently, radiation changes can lead to circulation changes, which may in turn affect more obviously societally relevant processes such as precipitation.

2017 01 19 Peter Hill Fig 1

Figure 1: Key terms in the atmospheric energy budget for southern West Africa (defined here as 8°W – 8°E and 5 – 10°N). Units are W m-2. Values shown are June-July means for 2000-2015. Divergence of dry static energy and sensible heating are derived from ERA-Interim. Radiation values (i.e. shortwave SW heating and longwave LW cooling) are from the CERES-EBAF dataset and latent heating is from the TRMM dataset. Adapted from Hill et al., 2016.

Pollution particles (aerosols) reflect and absorb radiation directly, but may also affect radiation by changing cloud properties. Radiation measurements are important to understand the extent to which both occur. These measurements can also be used as an additional check on aerosol and cloud measurements made during the campaign by both the aircraft and by satellites. Radiative transfer is a relatively well understood process. If the measured cloud and aerosol properties are correct, we should be able to predict the measured radiation quite accurately using computer-based models.

The campaign involved three aircraft, two of which were equipped with instruments to measure radiation (in addition to many other instruments). Each had two pyranometers, which measure solar radiation, and two pyrgeometers, which measure thermal radiation. One of each was mounted above the aircraft pointing upwards to measure downwelling radiation and one of each was mounted below the aircraft pointing downwards to measure upwelling radiation. During the campaign, out of a total of 50 flights, seven were made with the primary objective of making radiation measurements. I was lucky enough to fly on the British Antarctic Survey Twin Otter (Figures 2 and 3) on one such flight, which was an exhilarating and surprisingly comfortable experience.
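As a rough illustration of how the four radiometer channels combine, the sketch below derives the broadband albedo and the net radiative flux from paired upward- and downward-facing measurements. The numbers are invented for illustration, not campaign data:

```python
# Sketch: combining the four broadband radiometer channels on one aircraft.
# All values are illustrative, not DACCIWA measurements.

def net_radiation(sw_down, sw_up, lw_down, lw_up):
    """Net radiative flux (W m-2): downwelling minus upwelling, solar plus thermal."""
    return (sw_down - sw_up) + (lw_down - lw_up)

def broadband_albedo(sw_down, sw_up):
    """Fraction of downwelling solar radiation reflected from below the aircraft."""
    return sw_up / sw_down if sw_down > 0 else float("nan")

# Pyranometers measure solar (SW); pyrgeometers measure thermal (LW).
sw_down, sw_up = 850.0, 180.0   # W m-2, illustrative
lw_down, lw_up = 310.0, 450.0   # W m-2, illustrative

print(broadband_albedo(sw_down, sw_up))                # ~0.21
print(net_radiation(sw_down, sw_up, lw_down, lw_up))   # 530.0
```

Comparing quantities like these, predicted by radiative transfer models from the measured cloud and aerosol properties, against the flown measurements is the closure check described above.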

2017 01 19 Peter Hill Fig 2

Figure 2: The British Antarctic Survey Twin Otter aircraft outside the hangar at Gnassingbé Eyadéma airport in Lomé, Togo during the DACCIWA aircraft field campaign.

2017 01 19 Peter Hill Fig 3

Figure 3: Airborne view of Lomé from the Twin Otter aircraft. Note the haze due to pollution.

Together with observations from three highly instrumented field sites, the aircraft campaign has provided a wealth of measurements. These measurements provide an indispensable dataset for understanding pollution, weather and climate in this region. They will underpin exciting scientific research, and scientists across Europe and SWA will be working with this dataset for at least the next two years. Keep up to date with our progress via the DACCIWA newsletter or Twitter feed.


Further details can be found in Hill, P., et al., 2016: A multisatellite climatology of clouds, radiation, and precipitation in southern West Africa and comparison to climate models.

Posted in Aerosols, Africa, Atmospheric chemistry, Climate, Climate change, Climate modelling

Childhood white Christmases: nostalgia or reality?

By Inna Polichtchouk

Nearly every Christmas, I travel back to Finland in the hope of celebrating Christmas Eve in well-below-freezing temperatures, surrounded by a plethora of snow. My childhood memory of this magical day begins with a cross-country skiing trip in the forest amongst frozen sparkling trees, with the low mid-day sun gently thawing the icicles formed on my eyelashes by my sublimed breath. The skiing trip is followed by a heart- and bone-warming sauna and a plunge into a soft snowdrift to even out the body temperature. However, over the past decade this nostalgic childhood memory of a white Christmas has begun to fade, replaced by a new one in which the skiing trip is a mere walk in the rain and the snowdrift a mud puddle. Is this childhood memory real? I decided to investigate.

Figure 1 shows snow depth at the Helsinki-Vantaa airport weather station, averaged over Christmas Eve and Christmas Day, from 1960 onwards. Having lived in and around the capital area, I chose this weather station to refresh (and test) my memory; other weather stations in and around Helsinki show similar measurements. The snowless Christmases are circled. To be consistent with the Finnish Meteorological Institute (FMI) definition, “snowless” means a snow depth below 1 cm. It is clear that of the 16 snowless Christmases shown, 50% have occurred since the turn of the millennium, after the first 15 years of my life. Before 2000, I lived through only three snowless Christmases. Since I moved out of Helsinki in 2005, a snowless Christmas appears to be more the norm than an anomaly. In particular, the 21st-century snowless Christmases have mostly been wet and warm, as seen in Figure 2. Given the data, I hence reserve the right to claim that my childhood was filled with white Christmases.
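The bookkeeping behind Figure 1 is simple to sketch. The snippet below applies the FMI “snowless” threshold to a toy snow-depth record; the depths are invented for illustration, not the Helsinki-Vantaa data:

```python
# Sketch: classifying Christmases as "snowless" using the FMI definition
# (24-25 Dec mean snow depth below 1 cm). Depths below are invented.

SNOWLESS_CM = 1.0  # FMI threshold

def snowless_years(depths_cm):
    """Return, sorted, the years whose Christmas snow depth is below 1 cm."""
    return sorted(y for y, d in depths_cm.items() if d < SNOWLESS_CM)

depths = {1975: 12.0, 1988: 0.0, 1997: 25.0,
          2006: 0.5, 2008: 0.0, 2011: 0.2, 2014: 7.0}

snowless = snowless_years(depths)
since_2000 = [y for y in snowless if y >= 2000]

print(snowless)                          # [1988, 2006, 2008, 2011]
print(len(since_2000) / len(snowless))   # fraction since the millennium: 0.75
```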

2017 01 12 Inna Polichtchouk Figure 1

Figure 1. Snow depth (cm) at the Helsinki-Vantaa airport weather station, averaged over 24-25 December. Snowless Christmases are circled. “Snowless” is defined as a snow depth below 1 cm. Weather station data are available from the Finnish Meteorological Institute (FMI).


2017 01 12 Inna Polichtchouk Figure 2a

2017 01 12 Inna Polichtchouk Figure 2b

Figure 2. (upper) Daily accumulated precipitation (mm) and (lower) mean daily temperature (°C) at the Helsinki-Vantaa airport weather station on 24 December. Snowless Christmases are circled.

Does the lack of snow at Christmas in Helsinki reflect a lack of precipitation, or is it just a harsh reality of a warming climate? Figure 3 shows the temperature and precipitation anomalies for the Decembers of all years with snowless Christmases. Indeed, it appears that the snowless Christmases are mainly due to anomalously warm December temperatures.

2017 01 12 Inna Polichtchouk Figure 3a

2017 01 12 Inna Polichtchouk Figure 3b

Figure 3. December anomalies (from the 1959-2015 mean for the Helsinki-Vantaa airport weather station) of (upper) precipitation (%) and (lower) mean daily temperature (°C). Only years with snowless Christmases are shown.

This trend towards snowless Christmases in Helsinki is, of course, likely a manifestation of internal variability. However, I am seriously contemplating celebrating Christmas a month later. Since 1960, average daily January temperatures have been below freezing, and even the smallest January total precipitation on record, 8.1 mm (in 1972), would produce at least 8 cm of snow cover* – enough to recreate that childhood memory of a white Christmas!

* An approximate rule of thumb is that 1 mm of rainfall produces 1 cm of snow at near zero temperature.

Posted in Climate

Geoengineering – how could we detect its cooling effect?

By Eunice Lo

Sulphate aerosol injection (SAI) is one of the geoengineering proposals that aim to reduce future surface temperature rise in case ambitious carbon dioxide mitigation targets cannot be met.  Climate model simulations suggest that by injecting 5 Tg of sulphur dioxide gas (SO2) into the stratosphere every year, global surface cooling would be observed within a few years of implementation.  However, temperature fluctuations also occur naturally in the climate system.  How could we detect the cooling signal of SAI amidst internal climate variability and temperature changes driven by other external forcings?

The answer is optimal fingerprinting (Allen and Stott, 2003), a technique which has been used extensively to detect and attribute climate warming to human activities.  Assuming a scenario (G4; Kravitz et al., 2011) in which 5 Tg yr-1 of SO2 is injected into the stratosphere from 2020 to 2070 on top of a mid-range warming scenario called RCP4.5, we first estimate the climate system’s internal variability and the temperature ‘fingerprints’ of the geoengineering aerosols and greenhouse gases separately, and then compare observations to these fingerprints using total least squares regression.  Since there are no real-world observations of geoengineering, we cross-compare simulations from different climate models.  This gives us 44 comparisons in total, and for each we estimate the number of years that would be needed to robustly detect the cooling signal of SAI in global-mean near-surface air temperature.
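The regression step can be sketched as follows. This is a generic total least squares fit on synthetic data, not the study’s actual code or model output; it illustrates how scaling factors on two fingerprints (e.g. SAI cooling and greenhouse warming) are estimated when the fingerprints as well as the observations contain noise:

```python
# Sketch of the regression step in optimal fingerprinting: total least
# squares via the SVD of the augmented matrix [X | y]. Synthetic data.
import numpy as np

def tls_scaling_factors(X, y):
    """Total least squares fit of y ~ X @ beta (noise allowed in X and y)."""
    n, p = X.shape
    _, _, vt = np.linalg.svd(np.column_stack([X, y]))
    v = vt.T                      # columns are right singular vectors
    return -v[:p, p] / v[p, p]    # from the smallest singular vector

rng = np.random.default_rng(0)
X_true = rng.standard_normal((200, 2))            # two noise-free "fingerprints"
beta_true = np.array([1.0, 0.8])                  # true scaling factors
X = X_true + 0.05 * rng.standard_normal(X_true.shape)      # noisy fingerprints
y = X_true @ beta_true + 0.05 * rng.standard_normal(200)   # noisy "observations"

print(tls_scaling_factors(X, y))   # close to [1.0, 0.8]
```

In the real analysis a detection is claimed when the confidence interval on the SAI scaling factor excludes zero; the sketch shows only the point estimate.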

Figure 1 (upper) shows the distribution of the estimated time horizons over which the SAI cooling signal would be detected at the 10% significance level in these 44 comparisons.  In 29 of them, the cooling signal would be detected during the first 10 years of SAI implementation.  This means we would not only be able to separate the cooling effect of SAI from the climate system’s internal variability and from temperature changes driven by greenhouse gases, but we would be able to do so early in SAI deployment.

2017 01 04 Eunice Lo - dist_TfC1_nomul

2017 01 04 Eunice Lo - dist_BgC1_nomul

Figure 1: Distribution of the estimated detection horizons for the SAI fingerprint using (upper graph) the conventional bi-variate method and (lower graph) the non-stationary detection method.

The above results are tested by applying a variant of optimal fingerprinting to the same problem.  This new method assumes a non-stationary background climate that is mainly forced by greenhouse gases, and attempts to detect the cooling effect of SAI against the warming background using regression (Bürger and Cubasch, 2015).  Figure 1 (lower) shows the distribution of the detection horizons estimated using the new method in the same 44 comparisons: 35 comparisons would require 10 years or fewer for the cooling signal to be robustly detected.  This is a slight improvement on the results found with the conventional method, but the two distributions are very similar.

To conclude, in a future 5 Tg yr-1 SAI scenario we would be able to separate, and thus detect, the cooling signal of sulphate aerosol geoengineering from internal climate variability and greenhouse-gas-driven warming in global-mean temperature within 10 years of SAI deployment.  This could be achieved with either the conventional optimal fingerprinting method or the new, non-stationary detection method, provided that the climate data are adequately filtered.  Research on the effects of different data filtering techniques on geoengineering detectability is not included in this blog post; please refer to the article cited at the top for more details.

NOTE: How feasible is a 5 Tg yr-1 SAI scenario? Robock et al. (2009) estimated the cost of lofting 1 Tg yr-1 of SO2 into the stratosphere with existing aircraft at several billion U.S. dollars per year. Even scaled to 5 Tg yr-1, this is not a lot compared to the gross world product. There are practical issues to be addressed even if existing aircraft were to be used for SAI, but the deciding factor in whether or not to implement sulphate aerosol geoengineering should be its potential benefits and side effects, on both the climate system and society.


Allen, M. R., and P. A. Stott, 2003: Estimating signal amplitudes in optimal fingerprinting, Part I: Theory. Climate Dynamics, 21, 477-491.

Bürger, G., and U. Cubasch, 2015: The detectability of climate engineering. Journal of Geophysical Research: Atmospheres, 120(22).

Kravitz, B., et al., 2011: The Geoengineering Model Intercomparison Project (GeoMIP). Atmospheric Science Letters, 12(2), 162-167.

Robock, A., et al., 2009: Benefits, risks, and costs of stratospheric geoengineering. Geophysical Research Letters, 36(19).

Posted in Aerosols, Climate, Climate change, Climate modelling, Geoengineering

Lakes from space

By Laura Carrea

For the first time, satellite technology has been used to make a census of global inland water cover. Some 117 million lakes, reservoirs and wetlands of area >0.002 km2 have been found, summing to a total area of 5.0 × 10^6 km2, which corresponds to 3.7% of Earth’s non-glaciated land surface [1]. This was not merely an academic exercise, as inland water surface area is one of the factors that determine inland water CO2 evasion [2]. Increasingly, lakes are considered to play an important role in global biogeochemical cycling: they have been found to be an important source of atmospheric carbon dioxide and methane [2], [3], two important greenhouse gases, and also to be disproportionately important carbon sinks via carbon burial in lake sediments [4].

It is not only the number of lakes covering the Earth’s surface that is of global importance, but also their individual extents. A digital map is needed to distinguish one lake from another, and to distinguish lakes from other water bodies such as rivers and the sea. A combined effort from the European Climate Change Initiative (CCI) has produced a global water-bodies map [5] in the form of an ‘image’ (Figure 1) which specifies where there is water and to which water body it belongs. This exercise again uses satellite data, unlike earlier comprehensive efforts, which generated similar datasets (although not to the same level of detail) from a variety of existing maps, data and other information rather than from satellite data [6]. Clearly, efforts of this kind are often limited by the spatial resolution of the satellite data and by the impossibility of systematically classifying all inland waters, given their sheer number.

2016 12 08 Laura Careea Fig 1 (817 x 708)

Figure 1. Extract of the global water-body map. The area is around Lake Winnipeg in Canada. Each pixel belonging to the same lake has the same colour. White corresponds to ‘land’ and black to ‘other inland water’. Each of the other colours corresponds to a specific classified lake (source: [5]).

The correct classification of an open water surface or a mixed vegetation area is key to the study of lake processes using satellite data. Satellite technology represents a powerful tool for assessing global lake characteristics that are important for studying other processes that occur in lakes.
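The idea of a per-pixel water-body identifier can be illustrated with a minimal connected-component labelling of a binary water mask. This is a simplified stand-in for the CCI processing, not the actual algorithm:

```python
# Sketch: assign each water pixel a label so that contiguous water pixels
# share one identifier, i.e. belong to one "lake". 4-connected flood fill.
from collections import deque

def label_lakes(mask):
    """Label 4-connected regions of truthy cells; 0 means land."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and labels[r][c] == 0:
                current += 1                    # start a new water body
                labels[r][c] = current
                queue = deque([(r, c)])
                while queue:                    # breadth-first flood fill
                    i, j = queue.popleft()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and mask[ni][nj] and labels[ni][nj] == 0):
                            labels[ni][nj] = current
                            queue.append((ni, nj))
    return labels

water = [[1, 1, 0, 0],
         [0, 1, 0, 1],
         [0, 0, 0, 1]]
for row in label_lakes(water):
    print(row)
# [1, 1, 0, 0]
# [0, 1, 0, 2]
# [0, 0, 0, 2]
```

The real map must additionally separate lakes from rivers and the sea, and cope with mixed land-water pixels at the satellite’s resolution.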

Lake ecosystems and the biodiversity they support are important components of the global biosphere. But their stability is threatened by climate change and anthropogenic disturbances [7].

2016 12 08 Laura Careea Fig 2

Figure 2. Lake Tanganyika from Envisat, where the effect of warming has been studied and documented [15].

The University of Reading, together with other institutions in the UK, is attempting to measure and explain the responses of lakes to environmental drivers at a global scale, with the aid of satellite technology, within the NERC-funded Globolakes project.

Lakes are fragile systems that are sensitive to many pressures, such as nutrient enrichment, climate change and hydrological modification, making them important ‘sentinels’ of environmental perturbation. According to the Globolakes experts, evidence suggests that climate change might increase the spread of harmful cyanobacterial blooms [8], [9], one of many possible adverse impacts of a changing climate. Many studies have shown that lake surface water temperature (LSWT) and the timing of spring phytoplankton blooms are related to meteorological signals. Generally, lakes are able to integrate and amplify meteorological signals in space and time, so that they act as useful sentinels of climate change [10].

LSWT is considered an important parameter reflecting stratification [11] and mixing, which are among the main physical processes occurring in lakes [7], [10]. Changes in temperature and, in turn, stratification influence the ecosystem directly (through differential population responses) and indirectly (via dynamic effects on nutrient distribution). Lake Tanganyika (Figure 2) is an example where warming has reduced the exchange rates between shallow and deep water (mixing), showing that warming has influenced the ecosystem [15].

However, current knowledge of global thermal lake behaviour is incomplete, and past studies have reported temperature trends from either in-situ or satellite data alone. Recently, a major effort to analyse global LSWT trends from both in-situ and satellite data was published [13]; however, the results are for summer, and for lake centres only.

Regarding the availability of satellite data, the ARCLake project has generated accurate and consistent spatially resolved LSWT time series for more than 250 large lakes globally from 1991 to 2010 [12]. However, some of the small and shallow lakes, which may respond differently to climate change [16], were not included in the selection [13].

Within the Globolakes project, a set of 1000 lakes has been selected [14] to give a collection of water bodies that spans a wide range of ecological settings and characteristics but is also suitable for remote sensing methods.

The University of Reading is contributing to the Globolakes project by generating accurate and consistent LSWT time series of high spatial resolution for the selected 1000 lakes.


[1] Verpoorter C., Kutser T., Seekell D.A., Tranvik L.J. (2014) A global inventory of lakes based on high-resolution satellite imagery, Geophys. Res. Lett., 40, 517–521

[2] Raymond, P. A., et al. (2013) Global carbon dioxide emissions from inland waters, Nature, 503, 355–359

[3] Bastviken D., Tranvik L. J., Downing J. A., Crill P. M., Enrich-Prast A. (2011) Freshwater methane emissions offset the continental carbon sink, Science, 331, 50

[4] Dean W.E.W., Gorham E. (1998), Magnitude and significance of carbon burial in lakes, reservoirs, and peatlands, Geology, 26, 535–538

[5] Carrea L., Embury O., and Merchant C.J.  (2015) Datasets related to inland water for limnology and remote sensing applications: distance-to-land, distance-to-water, water-body identifier and lake-centre co-ordinates, Geoscience Data Journal, 2(2), 83-97

[6] Lehner B., Döll P. (2004) Development and validation of a global database of lakes, reservoirs and wetlands, J. Hydrol., 296, 1–22

[7] Schmid M., Hunziker S., Wüest A. (2014) Lake surface temperatures in a changing climate: a global sensitivity analysis, Climatic Change, 124, 310-315

[8] Paerl H.W., Huisman J. (2008) Blooms like it hot, Science, 320, 57-58

[9] Groetsch P.M.M., Simis S.G.H., Eleveld M.A., Peters S.W.M. (2014) Cyanobacterial bloom detection based on coherence between ferrybox observations, Journal of Marine Systems, 140, 50-58

[10] Adrian R., O’Reilly C.M., Zagarese H. et al (2009) Lakes as sentinels of climate change. Limnol Oceanogr 54, 2283–2297

[11] Woolway R.I., Maberly S.C., Jones I.D., Feuchtmayr H. (2014) A novel method for estimating the onset of thermal stratification in lakes from surface water measurements, Water Resources Research, 50, 5131-5140

[12] MacCallum, S. N., and C. J. Merchant (2012) Surface water temperature observations of large lakes by optimal estimation, Can J. Remote Sens., 38, 25–45

[13] O’Reilly C.M., et al (2015) Rapid and highly variable warming of lake surface waters around the globe, Geophys. Res. Lett., 42, 10773–10781

[14] Politi E., MacCallum S., et al. (2016) Selection of a network of large lakes and reservoirs suitable for global environmental change analysis using Earth Observation, International Journal of Remote Sensing, 37, 3042-3060

[15] Verburg P.,Hecky R.E., Kling H., (2003) Ecological consequences of a century of warming in Lake Tanganyika, Science 301, 505–507

[16] Winslow, L. A., Read J. S., Hansen G. J. A., and Hanson P. C. (2015) Small lakes show muted climate change signal in deepwater temperatures, Geophys. Res. Lett., 42, 355–361,

Posted in earth observation, Hydrology, land use, Remote sensing

How TAMSAT have been supporting African people for over 35 years

By Ross Maidment

The University of Reading’s TAMSAT group have helped pioneer the use of satellite imagery in rainfall estimation across Africa since the early 1980s, when the group was first established. Thanks to some bright and innovative minds back in the day, it was quickly realised that the frequent thermal infrared satellite images providing full coverage of the African continent (e.g. Figure 1) could readily be used to observe the cold tops of rain-bearing convective cloud systems and, in turn, to produce much-needed information on where rain has likely fallen and how much. Given the high dependence on rainfall of many socially and economically important activities across sub-Saharan Africa (namely those in agriculture) and the severe lack of raingauges across much of the continent, this satellite-based alternative proved an extremely useful resource for monitoring rainfall conditions and giving forewarning of impending water shocks, such as drought or flooding.

2016 12 01 Ross Maidment - Figure_1

Figure 1. Meteosat thermal infrared image from 29 November 2016, 1800 UTC. White scenes denote cold cloud tops, while black/dark grey scenes denote the warmer land or sea surface.

More than three and a half decades on, the TAMSAT group still provide operational rainfall estimates (e.g. Figure 2) for all of Africa using a simple yet effective rainfall estimation algorithm based on cold cloud duration (CCD) fields derived from the satellite imagery. Over the years, many African meteorological services and other agencies and organisations have depended on these data for a range of applications, such as flood and drought monitoring, famine early warning and weather index-based insurance schemes (e.g. Figure 3), helping millions of people to manage the highly variable rainfall climate that characterises much of the African continent.
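A CCD-style estimate can be sketched in a few lines: count how long each pixel’s cloud top stays colder than a calibration threshold, then map that duration to rainfall linearly. The threshold and coefficients below are illustrative placeholders, not TAMSAT’s calibrated values:

```python
# Sketch of a cold cloud duration (CCD) rainfall estimate.
# Threshold and calibration coefficients are illustrative only.

THRESHOLD_K = 233.0   # cloud-top brightness temperature threshold (~ -40 degC)
A, B = 3.0, -2.0      # mm per cold-cloud hour, and offset; illustrative

def cold_cloud_duration(brightness_temps_k, hours_per_image=1.0):
    """Hours a pixel spends colder than the threshold."""
    return sum(hours_per_image for t in brightness_temps_k if t < THRESHOLD_K)

def rainfall_estimate(brightness_temps_k):
    """Linear CCD-to-rainfall calibration, floored at zero."""
    ccd = cold_cloud_duration(brightness_temps_k)
    return max(0.0, A * ccd + B)

# One pixel's hourly thermal-infrared time series (K), illustrative:
temps = [290, 285, 240, 228, 225, 231, 250, 290]
print(rainfall_estimate(temps))   # 3 cold hours -> 7.0 mm
```

In practice the coefficients are calibrated regionally and seasonally against raingauge data, which is where the long, consistent archive earns its keep.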

2016 12 01 Ross Maidment - Figure_2

Figure 2. The TAMSAT seasonal rainfall anomaly for September-November 2015 with respect to the 1983-2012 climatology. The seasonal anomaly, derived from 10-day total rainfall estimates, shows the impact of the 2015 El Niño event on African rainfall (namely, a wetter East Africa and drier Southern Africa).


2016 12 01 Ross Maidment - Figure_3 (752 x 450)

Figure 3. A weather index-based insurance workshop in Zambia where the use of TAMSAT data in insurance products is being discussed (Photo credit: Agrotosh Mookerjee).

In an era in which there is an increasing number of satellite-derived rainfall products, many based on new and highly sophisticated sensors, it would be reasonable to assume that the TAMSAT data, based on a relatively simple method, may no longer be as useful. However, the strength of the TAMSAT data lies in their longevity and consistent estimation algorithm. For many applications, a short rainfall time series makes it very difficult to assess the severity of unexpected changes in rainfall, because the average or climatological conditions are not well known. The long time series of the TAMSAT archive (since 1983), together with an operational system that creates new estimates on a regular basis, makes the TAMSAT rainfall dataset a highly valuable resource for both climate-based operational activities (e.g. Black et al., 2016) and climate research (e.g. Maidment et al., 2015).
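To illustrate why a long record matters: an anomaly like that in Figure 2 is only meaningful relative to a climatology, and the longer the record, the better the climatology is known. A minimal sketch, with invented seasonal totals rather than TAMSAT data:

```python
# Sketch: expressing one season's rainfall relative to a climatology.
# All totals are invented for illustration.
from statistics import mean, stdev

def seasonal_anomaly(season_total_mm, climatology_mm):
    """Anomaly of one season's total against a multi-year climatology."""
    clim_mean = mean(climatology_mm)
    return {
        "absolute_mm": season_total_mm - clim_mean,
        "percent_of_normal": 100.0 * season_total_mm / clim_mean,
        "standardised": (season_total_mm - clim_mean) / stdev(climatology_mm),
    }

climatology = [420, 380, 450, 400, 390, 410, 430, 405, 395, 420]  # mm, invented
print(seasonal_anomaly(300.0, climatology))   # a strongly dry season
```

With only a few years of climatology, the mean and spread used here would be poorly constrained, and the standardised anomaly correspondingly unreliable.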

So what next for TAMSAT? During the last 12 months, the TAMSAT group have invested huge effort in overhauling their estimation algorithm, and in doing so have minimised several of the characteristic problems associated with the previous algorithm. In addition, with the help of collaborators at the International Research Institute at Columbia University, the group have developed novel techniques to provide uncertainty estimates for the rainfall (which, surprisingly, many rainfall datasets do not issue) and to merge auxiliary information (such as raingauge measurements) with the satellite data to improve the estimation of rainfall intensities over short time periods. It is planned that these products will run operationally alongside TAMSAT’s primary rainfall product during 2017, and also within a rainfall monitoring platform currently deployed in several countries across West and East Africa.

These activities, amongst others, ensure that TAMSAT delivers both skilful and reliable rainfall products that are much needed in a region of the world that is deprived of adequate rainfall information and, at the same time, is challenged by marked climate variability and by human-induced climate change, which is expected to exacerbate current conditions in the near future.


Black, E., E. Tarnavsky, R. Maidment, H. Greatrex, A. Mookerjee, T. Quaife, and M. Brown, 2016: The Use of Remotely Sensed Rainfall for Managing Drought Risk: A Case Study of Weather Index Insurance in Zambia. Remote Sens., 8, 342.

Maidment, R. I., R. P. Allan, and E. Black, 2015: Recent observed and simulated changes in precipitation over Africa. Geophys. Res. Lett., 42, 2015GL065765.

Posted in Africa, Climate, drought, earth observation, Remote sensing

Flying through the Indian monsoon

By Andy Turner

Forecasting the monsoon in India, both for the season ahead and long into the future, continues to be a challenge for scientists. The monsoon provides 80% of the country’s annual rainfall and secures the food supply for more than a billion people.

For years scientists have studied climate models from all over the world to understand why they don’t represent the monsoon well. But new observations are needed to help understand the processes involved, including how the land surface affects the timing of the onset and cessation of the rains, and how the deep convective clouds of the tropics work together with the monsoon winds that cover a much larger region.

2016 11 25 Andy Turner - monsoon_seasons (329 x 792)

Figure 1. The summer months feature much stronger rains than winter over India (coloured blue over the land) as winds blow onshore from the Indian Ocean. Also shown are sea-surface temperatures. Notice the heavy rainfall on the west coast, caused by the influence of the Western Ghats mountains.

So this year, with the support of funding from the Natural Environment Research Council in the UK and India’s Ministry of Earth Sciences, that is what we started with our INCOMPASS project.

The University of Reading and the Indian Institute of Science in Bengaluru (Bangalore) led a team of more than 10 universities, research institutes and operational forecasting centres to observe the monsoon in India from the air and on the ground.

After years of planning, this May we flew the NERC-owned Atmospheric Research Aircraft, a BAe-146 four-engine jet operated by the NCAS Facility for Airborne Atmospheric Measurement to India, as part of one of the largest observational campaigns that NERC has ever funded.

We based ourselves in two locations: the northern city of Lucknow, central to some of India’s most fertile land in the basin of the River Ganges, and Bengaluru in the southern peninsula, equidistant between the heavy rains of the Western Ghats mountains and the drier land on the east coast. The aircraft returned to the UK in mid-July.

In the north we captured the monsoon onset, taking measurements as the monsoon progressed from east to west over the plains. How do the transitions between wet and dry soils affect the development of monsoon storms? How do the temperature and humidity change as we approach the Thar Desert in north-western India? By analysing our results over the next few years, we’ll find out.

In the south we gave more attention to the heavy rainfall over the Western Ghats mountains. How does this rainfall change during the day? What does the atmospheric boundary layer look like to the west of the mountains, over the Arabian Sea from where the moisture originates? Just as in the north, and as explained in our Planet Earth article, several of our flights took place only a few hundred feet above the ground, giving a spectacular, if bumpy, view of the landscape.

A research aircraft has the advantage of being able to cover large distances and measure large weather systems from different angles. But fuel and aircraft support are expensive and require a huge commitment from engineers, flight operations staff and instrument scientists at FAAM and the Met Office. To accompany these measurements, the INCOMPASS project also put in place a series of flux towers (installed by colleagues at CEH) at locations across India (Figure 2).

2016 11 25 Andy Turner - kanpur_flux_tower

Figure 2. Towers like this one installed at IIT Kanpur by INCOMPASS colleagues at CEH will offer long-term measurements of fluxes of heat and moisture passed from the surface to the atmosphere. Photo (c) Andy Turner 2016.

Flux towers are vital for measuring turbulent fluxes of heat and moisture as they are transferred from the surface to the atmosphere. The information collected in 2016, and hopefully for years to come, will help develop and improve the JULES land model, a vital component of the weather and climate models used by the Met Office. INCOMPASS also gave us the opportunity to use the Department of Meteorology radiosonde equipment. We based this at IIT Kanpur in northern India, close to our Lucknow airport base, and launched more than 100 balloons during the monsoon rains of July.


Figure 3. Scientists in India prepare to launch a radiosonde (weather balloon). Photo (c) Andy Turner 2016.
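The calculation those flux towers perform is the eddy covariance method: the turbulent flux is the time-averaged product of the fluctuations of vertical wind and the transported quantity. A minimal sketch with synthetic numbers (the data and the imposed correlation are invented for illustration, not INCOMPASS measurements):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 20 Hz tower record: 30 minutes of vertical wind w (m/s) and
# temperature T (degC), correlated so that warm air tends to move upward.
n = 20 * 60 * 30
updraft = rng.normal(0.0, 0.3, n)
w = updraft
T = 30.0 + 0.5 * updraft + rng.normal(0.0, 0.2, n)

rho = 1.1    # near-surface air density (kg/m^3), a typical value
cp = 1005.0  # specific heat of air at constant pressure (J/kg/K)

# Eddy covariance: the flux is the mean product of the fluctuations
w_prime = w - w.mean()
T_prime = T - T.mean()
H = rho * cp * np.mean(w_prime * T_prime)  # sensible heat flux (W/m^2)
print(f"Sensible heat flux: {H:.0f} W/m^2")
```

A positive flux means the surface is heating the air above it. Real tower processing adds steps such as coordinate rotation, despiking and density corrections before the covariance is trusted.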

But what happens next? The team of scientists and students will spend the next few years analysing the vast datasets collected. With the Met Office and the NCAS Computational Modelling Services, we will be running model experiments at a variety of high resolutions to compare with our data for the 2016 monsoon. During the field campaign, we ran forecasts for India at 4 km resolution, but in the work to come we hope to perform tests below the kilometre scale to see how well we can re-forecast some of the storms in the monsoon.



Posted in Climate, Climate modelling, Monsoons, Numerical modelling

From kilobytes to petabytes for global change research: take the skills survey!

By Vicky Lucas
Institute for Environmental Analytics

If you deal with megabytes of environmental sample data, gigabytes of sensor data, terabytes of model data or petabytes of remote sensing data, then I’d like you to take a survey. If you create, look after, analyse, publish on or manage datasets for global change, then I’d like to find out what necessary and emerging skills you need.

Global change research and development monitor, analyse and simulate every aspect of environmental change, from climate to biodiversity to geochemistry to human attitudes and actions on the world. For global change research to flourish, a range of skills is necessary, increasingly so in the broad area of data-intensive digital skills and interdisciplinary work.


Through this survey I would like to find out about existing training programmes and opportunities that you use and value, as well as the skills that are essential in your everyday work or that would make you more efficient or more effective.  Go to survey (closes 22 November).

I work for the Belmont Forum, which is a group of the world’s major and emerging funders of global change research, from the National Science Foundation in the USA to FAPESP in Brazil to the Japan Science and Technology Agency, and everywhere in between. The Belmont Forum aims to accelerate the delivery of research. As part of the e-Infrastructure and Data Management project, I am focussed on capacity building, improving the workforce skills and knowledge to enable global change research to thrive.

If you, from anywhere in the world, have wrestled a spreadsheet, frowned at R or Python, filled your hard disk or delighted as you kicked off a month-long model run, then I’d really appreciate 10 minutes of your time to generate a few kilobytes of survey data of my own.

Vicky Lucas
Human Dimensions Champion, Belmont Forum e-Infrastructures and Data Management Project

Background on the Belmont Forum


The Belmont Forum is a group of the world’s major and emerging funders of global environmental change research. It aims to accelerate delivery of the environmental research needed to remove critical barriers to sustainability by aligning and mobilizing international resources. It pursues the goals set in the Belmont Challenge by adding value to existing national investments and supporting international partnerships in interdisciplinary and trans-disciplinary scientific endeavours. You can also read about the full Belmont Challenge.


Posted in Academia, Climate, Climate change, Climate modelling, Numerical modelling, Remote sensing, Teaching & Learning

When meteorology altered the course of history (or maybe not)

By Bob Plant

The Battle of Milvian Bridge was fought on 28 October in the year 312 CE. The atmospheric conditions there on that day may have had a critical influence on the course of human history ever since. It’s a defensible opinion. Or they may not have been all that important: that’s a defensible opinion too. On the other hand, perhaps nothing in the least interesting happened, at least nothing of a meteorological nature. Again, that’s entirely plausible. This is a very longstanding and very much ongoing controversy. I’ll try to explain it …


Figure 1. A painting of the Battle of Milvian Bridge by Giulio Romano.

The Roman empire by the third century had become difficult to control, with civil war becoming increasingly common and internal conflicts becoming increasingly destructive. The emperor Diocletian had tried to stabilize matters by formally dividing the running of the Eastern and Western halves of the empire, each run by its own emperor and junior emperor. This worked fairly well; the frontiers were strengthened and the tax system better organized. However, Diocletian’s abdication due to illness in 305 precipitated yet another succession struggle and civil war.

The ruthless chancer who eventually emerged victorious from this particular mess was the emperor Constantine, who began his bid in 306 in York and completed it by 324. Along the way, the battle of the Milvian Bridge in 312 was fought on the outskirts of Rome against the army of Maxentius, himself a recent usurper but supposedly recognized by Constantine as his superior and the emperor for the Western half of the empire. We’ll come back to the battle in a moment but first we should emphasize why Constantine’s victory matters. He went on to found the city of Constantinople (modern Istanbul) and shifted the imperial capital there. This was important in cementing the division into the eastern and western empires which did so much to shape the subsequent history of Europe and western Asia. But even more far-reaching was that he set Christianity on the path to becoming the state religion. This proceeded piecemeal, starting by relaxing and removing the persecutions and proscriptions of the latter part of Diocletian’s time (e.g. the Edict of Milan in 313) but ultimately establishing distinct legal and political advantages for Christians. The consequences of those changes have been enormous.

It’s not clear whether Constantine’s actions on religion and the state were motivated by his own political calculations, by his genuine religious convictions or by some scrambled mixture of the two. If you were to take a guess anywhere along that spectrum it would not take long to find reputable historians making strongly-expressed arguments in support of it. Nonetheless, he was baptized on his death bed, and consistently professed to be a believer well before that, so it is safe to assume that he was at least a partial convert. An important but deeply controversial question for historians is when and how that conversion happened. And that brings us back to Milvian Bridge.

On the night of the battle Constantine apparently experienced a miraculous dream and around noon on the day itself a miraculous vision. A dream is somewhat difficult to verify or falsify of course, and not really our interest here. The vision was of “a cross of light in the heavens, above the sun, and bearing the inscription, CONQUER BY THIS”. The vision has been considered by many as being pivotal in his conversion, and in helping him to inspire the troops to victory on the day. What are we to make of this? 

We should note where this account actually comes from. It appears in the writings of Eusebius around two decades later, and apparently his source is that he was told so “long afterwards” by none other than the emperor himself. Now, Eusebius is not exactly considered the most reliable of writers on various matters for various reasons, and it is not difficult to find reasons to be cautious. The event is conspicuously absent in an account by Lactantius, for example, despite the fact that there would have been potentially thousands of witnesses to it associated with a full-scale battle on the edge of a major city. On the other hand, why invent something if there are potentially thousands of witnesses who might contradict it? Indeed many historians since, however sceptical about miracles, have accepted that there may just be something in it: i.e., that there may have been some natural atmospheric phenomenon, just possibly viewed with, shall we say … a little licence. This line of argument goes back several hundred years itself and is far from settled. To give a flavour of these sober and dispassionate scholarly debates, here is a quote from a very lengthy footnote in Potter’s (2013) biography of the emperor: “For further support of Weiss’s view [claiming a solar halo] see Barnes (2011), though I should note that refusal to accept Weiss’s view does not necessarily indicate an attachment to the Nazi party as is implied in his discussion.”

The meteorological explanation usually put forward in modern historical articles is that it may have been a “sun dog”. Figure 2 is a very typical picture put forward to support the notion, taken from the Wikipedia page about the battle:


Figure 2. Two ‘sun dogs’ (parhelia). Source: Wikipedia.

Perhaps it does look like a plausible explanation, allowing for, shall we say … a little licence. But let’s be a little more careful. If we search online for pictures of sun dogs we can easily find many beautiful photographs, but relatively few that might resemble the description.

What is a sun dog anyway? A highly-recommended guide to atmospheric optical phenomena is provided by the Atmospheric Optics website (atoptics). A sun dog, or parhelion, is rather common and occurs when light is refracted by ice crystals in the shape of hexagonal plates that are suitably oriented, with the hexagonal faces aligned close to the horizontal. For a more vertically-extended display, or a “tall sun dog”, it helps if the plates are not quite horizontal but wobble slightly as they fall. Too much wobble, however, more than a degree or so, and the halo is lost. Less common, but a little more like the description, is a sun pillar, which requires the ice crystals to be consistently somewhat tilted and a low sun angle. A rare event would combine both upper and lower pillars with a substantial horizontally-extended halo such as the parhelic circle, which arises from reflections from the vertical faces of the crystals – creating something that can indeed look like a cross. This is not easily achieved, however, and it may be worth adding that Rome around noon in late October is a most unlikely time and place to be able to catch it. To get a sense of just how delicate the conditions would need to be, there is a fun halo simulator available from atoptics that you can play around with to see if you can manage to generate something like the image.
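The geometry behind sun dogs and the common halo comes down to standard prism optics: light passing through alternate side faces of a hexagonal ice crystal sees an effective 60° prism, and the minimum deviation of that prism sets the angular distance from the sun. A quick check of that number (a textbook formula, not taken from the historical sources above):

```python
import math

# Minimum deviation through a prism of apex angle A and refractive index n:
#   D = 2 * asin(n * sin(A / 2)) - A
# Alternate side faces of a hexagonal ice crystal form a 60-degree prism,
# which is what puts the common halo at a radius of about 22 degrees.
n_ice = 1.31          # refractive index of ice for visible light (approx.)
A = math.radians(60)  # effective prism angle

D = 2 * math.asin(n_ice * math.sin(A / 2)) - A
print(f"Minimum deviation: {math.degrees(D):.1f} degrees")  # about 22 degrees
```

With the sun on the horizon the sun dogs sit at roughly this angle from it; as the sun climbs, the skewed ray paths through the plates push them further out, which is one reason a noon display is so demanding.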

There are entire books and countless articles on this event from historians. And for that matter there are entire books on atmospheric optics. If you really want to develop an informed opinion, you have a lot of reading to do! I’ve simply tried to give a short introduction as a non-expert for non-experts. But I thought it may be interesting for the meteorologically-minded to know something of how and why the possible appearance of an atmospheric optical phenomenon has been such a hotly-debated question.

Posted in Atmospheric optics

The value of future observations

By Alison Fowler

The atmosphere and oceans are being routinely observed by a myriad of instruments. These instruments are positioned on board orbiting satellites, aircraft and ships, surface weather stations, and even balloons.  The information collected by these instruments can be used to ensure that modelled weather forecasts adhere to reality using a process known as data assimilation.


Figure 1:  Data coverage of the AMSU-A instrument, on board 6 different satellites, within a 6 hour time window (copyright ECMWF)

For the observations to be useful it is necessary that:

  • The observations can be compared to the forecast variables (e.g. temperature, humidity and wind)
  • We know the uncertainty in those observations
  • We know the uncertainty in the weather forecast model itself (so we know how much to trust the forecast vs how much to trust the observations)

These fundamentals of data assimilation are continually evolving as weather models become more sophisticated and address new societal needs, new instruments are developed, and computational resources and mathematical techniques advance.
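At its core, the blend of forecast and observation described above is a weighted average, with the weights set by the two error variances. A toy scalar version (the numbers are invented; operational systems do the equivalent for hundreds of millions of variables at once):

```python
# Toy scalar data assimilation: the "best linear unbiased" update that
# underlies methods such as the Kalman filter.
def assimilate(forecast, obs, var_f, var_o):
    """Blend a forecast with an observation; return analysis and its variance."""
    gain = var_f / (var_f + var_o)             # more weight on obs if var_o is small
    analysis = forecast + gain * (obs - forecast)
    var_a = (1.0 - gain) * var_f               # analysis is more certain than either input
    return analysis, var_a

# Hypothetical numbers: a 20 degC forecast (error variance 4) corrected
# by a 22 degC observation (error variance 1).
analysis, var_a = assimilate(20.0, 22.0, 4.0, 1.0)
print(f"analysis={analysis:.2f} variance={var_a:.2f}")  # analysis=21.60 variance=0.80
```

The analysis lands closer to whichever input is trusted more, and its error variance is smaller than either input's, which is the whole point of combining them.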

These different aspects of data assimilation were addressed at the fifth annual international symposium on data assimilation held at the University of Reading during a very hot week in July 2016. This symposium brought together 200 scientists from 15 countries spread across four different continents and received sponsorship from NCEO, the Met Office and ECMWF.


Figure 2: Participants of the Fifth annual international symposium on data assimilation (photograph copyright (C) Stephen Burt).

This symposium comprised 10 different sessions, one of which focused on the particular problem of assessing the value of observations. This is important not only for evaluating which (of the very many) observations are most important for providing an accurate weather forecast, but also for designing instruments able to further reduce the uncertainty in the forecast. This latter problem is particularly difficult due to the fast pace at which data assimilation systems are changing, which means that by the time the instrument is operational (possibly in a few decades’ time) its value may be very different from what it would be if the data could be assimilated today.

There are many possible metrics for assessing the value of observations. Some are based on how sensitive the forecast skill is to the value of the observations, others try to quantify the amount of information in the observations for reducing the uncertainty in our knowledge of the current state of the atmosphere. Computing these metrics before the instrument is built and the data is available relies on accurate estimates of the error characteristics of the instrument and its relationship to the model variables and, hence, is very challenging.
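One of the simplest such metrics is the fraction of forecast error variance that an observation would remove, which can be estimated before an instrument exists if its error variance can be predicted. A scalar illustration with hypothetical instrument numbers:

```python
# Fraction of forecast error variance removed by assimilating one
# observation with error variance var_o (scalar case, where the
# inverse variances, or precisions, simply add).
def variance_reduction(var_f, var_o):
    var_a = 1.0 / (1.0 / var_f + 1.0 / var_o)  # analysis error variance
    return 1.0 - var_a / var_f

# Two hypothetical future instruments observing the same quantity:
accurate = variance_reduction(var_f=4.0, var_o=0.5)  # low-noise design
noisy = variance_reduction(var_f=4.0, var_o=8.0)     # high-noise design
print(f"{accurate:.2f} vs {noisy:.2f}")  # 0.89 vs 0.33
```

In practice the observation enters through a (possibly nonlinear) observation operator, and its errors may be correlated with those of other observations, which is exactly why such estimates are so sensitive to the ever-changing assimilation system.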

It is clearly difficult to describe the value of a future observation unequivocally by a single figure. Instead we need to provide insight, through ongoing research, into how the value of observations is sensitive to changes in the ever-evolving data assimilation system. There will be much to discuss at the next symposium!

Posted in Climate, earth observation, Measurements and instrumentation, Numerical modelling, Remote sensing