Better research in flash flooding urgently needed for ASEAN countries

By Dr. Albert Chen (University of Exeter)
16th February 2017


The FFIR researcher Dr Albert Chen from the Centre for Water Systems, University of Exeter, was invited by the APEC Climate Center (APCC) to present at the APCC-ASEAN Disaster Management Symposium.

The event was held on 9-10 February 2017 in Jakarta, Indonesia, aiming to encourage dialogue between scientists and practitioners and to bridge the gap between science and policy in disaster risk reduction and management. Over 50 delegates from 14 countries, mostly government officials, attended the symposium and shared their knowledge.

Dr Chen shared work from the ongoing NERC FFIR programme and discussed potential future research to help policy makers. The audience identified flash flooding as a key area where better science and technology are urgently needed to support decision making in hazard mitigation. Research outcomes from the FFIR programme will benefit ASEAN countries in building flood forecasting capacity, which will in turn enhance early warning and reduce flood damage.

The challenges of using “big data” in Numerical Weather Prediction: meteorological observations from air traffic control reports.

By Dr. Sarah Dance (University of Reading)
19th December 2016

Many would say that Numerical Weather Prediction has been using “big data” for decades. Routine forecasts are produced using computational models with billions of variables, and tens of millions of observations, several times a day. Most of these observations come from scientifically designed observing networks, such as satellite instruments, weather radar and carefully sited weather stations. However, urban areas also present rich sources of data that have not yet been fully explored or exploited (e.g., citizen science, smartphones, the internet of things), and these could provide significant benefits when forecasting on small scales, at low cost.

In scientific surface networks, point observations are often sited away from buildings, in locations intended to be broadly representative of larger areas rather than to reflect local urban conditions. These observations lend themselves more naturally to comparison with discretized models, whose grid lengths may be much larger than the size of a building. For datasets of opportunity, a key problem is to understand the effects of the urban environment on the observations, so that uncertainties can be properly attributed and proper quality-control procedures established. Furthermore, there are complex issues surrounding use of the data, such as personal privacy and data ownership, that must be overcome.



For the rest of this article we focus on one dataset of opportunity arising from air traffic control radar reports. Mode Selective Enhanced Surveillance (Mode-S EHS) is used by Air Traffic Management to retrieve routine reports on an aircraft’s state vector at a high temporal frequency (every 4 to 12 seconds). The state vector consists of true airspeed, magnetic heading, ground speed, ground heading, altitude and Mach number. Mode-S EHS reports can be used to derive estimates of the ambient air temperature and horizontal wind at the aircraft’s location. These derived observations have the potential to give weather information on fine spatial and temporal scales, especially in the vicinity of airports, where there are millions of reports per day. For example, high-frequency reporting of vertical profiles of temperature and wind may provide extra information for use in numerical weather prediction that would have particular value in the forecasting of hazardous weather. While some of the problems of understanding and using datasets of opportunity are circumvented (the effects of buildings are less relevant to flying aircraft), all measurements during aircraft turns and other manoeuvres have to be discarded. Furthermore, the reports are transmitted in small data packets, with limited precision, with the result that the uncertainty in the derived meteorological observations is very large, particularly at lower altitudes. For more information see

Mirza, A. K., Ballard, S. P., Dance, S. L., Maisey, P., Rooney, G. G. and Stone, E. K. (2016), Comparison of aircraft-derived observations with in situ research aircraft measurements. Q.J.R. Meteorol. Soc., 142: 2949–2967. doi:10.1002/qj.2864
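The derivation itself is standard aeronautics: true airspeed and Mach number together fix the local speed of sound, and hence the ambient temperature, while the wind is the vector difference between the aircraft's ground velocity and its air velocity. A minimal sketch of the idea (function names are illustrative; it ignores details such as the magnetic-to-true heading correction and the coarse quantisation of the transmitted values that drives the large uncertainties mentioned above):

```python
import math

GAMMA = 1.4      # ratio of specific heats for dry air
R_AIR = 287.05   # specific gas constant for dry air (J kg^-1 K^-1)

def temperature_from_mode_s(tas, mach):
    """Ambient temperature (K) from true airspeed (m/s) and Mach number.

    Since TAS = M * a and the speed of sound a = sqrt(GAMMA * R_AIR * T),
    we get T = TAS^2 / (M^2 * GAMMA * R_AIR).
    """
    return tas ** 2 / (mach ** 2 * GAMMA * R_AIR)

def wind_from_mode_s(tas, heading_deg, ground_speed, track_deg):
    """Horizontal wind (east, north components, m/s) as the vector
    difference between the ground velocity and the air velocity."""
    air_e = tas * math.sin(math.radians(heading_deg))
    air_n = tas * math.cos(math.radians(heading_deg))
    gnd_e = ground_speed * math.sin(math.radians(track_deg))
    gnd_n = ground_speed * math.cos(math.radians(track_deg))
    return gnd_e - air_e, gnd_n - air_n
```

Because Mach number appears squared in the denominator, small quantisation errors in the reported Mach inflate the derived temperature error, which is one reason the low-altitude (low Mach) observations are so uncertain.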


By Dr. Rob Thompson (University of Reading)
25th November 2016

This week we held a conference for the FFIR programme, with meetings for the SINATRA and FRANC projects and the kick-off meeting for TENDERLY, all with lots of discussion. On Wednesday afternoon we took a short aside to launch a weather balloon and for me to give a tour of the University of Reading Atmospheric Observatory. The observatory is home to many instruments, many of which have their data displayed live on the website linked above. The instruments deserve a blog of their own, so today I’ll talk about our radiosonde launch and the fascinating profile it sent back.


First, what is a radiosonde? The radiosonde is the small box of instrumentation on the long string below the weather balloon. The “sonde” measures temperature, humidity, pressure and GPS position (to tell us about the winds); it also has a port for adding other sensors (such as ozone, turbulence and electrical charge). The package is carried into the atmosphere by a helium balloon and can reach as high as 40km, though this one only made it to 16.8km – still well into the stratosphere. We arrived as the balloon was nearly fully inflated, and it was ready for launch after just a few minutes, expertly done by our technicians, especially the experienced hands of Ian Read.


The walk over from the Meadow Suite was rather nice; we were fortunate that it wasn’t particularly cold (about 9C), with a bit of sun and not much wind… and, interestingly, a selection of cloud levels: at least three were clear, and I suspected there were in fact two levels of lower cumulus cloud, though it was hard to tell by eye. The launch went off without a hitch and we could watch her ascend – Chris Skinner tweeted my favourite video.

We watched the sonde for a few minutes and then began the tour of the observatory, before observing the data coming in live. At this point we could already see the two low cumulus cloud layers (I was right!) and the very dry air from the anticyclone to the north of us, as seen in the synoptic chart.



We then returned to the Meadow Suite to continue the meeting, having had a break and some much-needed fresh air. The poster session and programme advisory board overlapped, so while the posters were viewed I received the full data from Ian, processed it and hand-plotted it on a tephigram. Tephigrams (T-phi grams: skewed plots of temperature against potential temperature) are an excellent way to present profiles through the atmosphere. They look horribly complicated, but with two lines on a 2D chart a huge amount of information is delivered, and complicated maths can be done just by following lines on the plot… an amazing invention. I did some basic analysis and that’s what you see here.
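For readers who haven't met potential temperature: it is the temperature an air parcel would have if brought dry-adiabatically to a reference pressure (usually 1000 hPa), which is why it makes a natural second axis for profile plots. A one-function sketch (a hypothetical helper using the standard dry-air exponent R/cp ≈ 0.2854):

```python
def potential_temperature(t_kelvin, p_hpa, p0_hpa=1000.0):
    """Temperature (K) an air parcel would have if brought dry-adiabatically
    to the reference pressure p0: theta = T * (p0 / p) ** (R / cp)."""
    KAPPA = 0.2854  # R / cp for dry air
    return t_kelvin * (p0_hpa / p_hpa) ** KAPPA
```

A parcel at 800 hPa and 270 K, for example, has a potential temperature near 288 K, so lines of constant theta on the tephigram trace out dry-adiabatic ascent and descent.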


So I was right: four layers of cloud, and a dry slot from the anticyclone that had descended from about 500 hPa to 800 hPa, becoming very dry (6% RH) through that descent. It really is a fascinating case, with three distinct temperature inversions and more apparent changes of air mass too. As a final plot, here’s the cloud radar vertical view from Chilbolton; it’s about 50 km south-west of Reading, but had very similar conditions.


You can see here that there were high clouds during the morning that were descending and thinning; by 13:00 that cloud was just a thin layer and likely becoming patchy, but there are thin higher clouds at about 8km seen both earlier and later – likely what we saw as the high cirrus, which also appears on the ascent. Chilbolton seems not to have had the low clouds Reading did, though they are present at 17:00, so perhaps that simply shows they were not overhead.

Overall, it was a fascinating time to launch the sonde, and several people thanked me for the tour and launch. I hope everyone enjoyed it, and the change of scenery from the meeting too.

How can we predict the future when we don’t fully understand the past?

By David Archer (Visiting Fellow, Newcastle University and JBA Trust)
27th October 2016

Over the last four years, I have been compiling chronologies of flash floods caused by intense rainfall, the associated occurrence of hail, and their results in terms of drowning, deaths by lightning, destruction of houses and bridges, erosion of hillsides and valleys, and flooding of property. The main focus of SINATRA has been on Northeast England, Cumbria and Southwest England, but chronologies are now almost complete for Lancashire and Yorkshire; an additional, less comprehensive chronology has been prepared for the rest of Britain. The source material has mainly been the online British Newspaper Archive, with its 15 million searchable pages, but a wide range of documentary sources has also been used. Given the rapid growth of published newspapers in the mid nineteenth century, the records can be considered comprehensive since at least 1850.

In compiling this chronology, event by event, I was struck by the variability of occurrence by year and by decade, which did not fit with the concept of more intense rainfall in a world warming with climate change (Kendon et al. 2014). The most frequent and really damaging flash floods tended to concentrate in the late nineteenth and early twentieth centuries, and there were fewer events in many of the later decades of the twentieth century. Figure 1 shows the decadal chronology for Northeast and Southwest England (Archer et al., 2016).


Figure 1 Time series of flash floods by decade from 1800 to 2010 divided by severity for (a) Northeast England and (b) Southwest England (insets show mapped areas covered by time series).

My first reaction to these findings was: can I explain them away? Are these patterns of change the result of variable reporting of such events, or have they been the result of changing catchment conditions? On the first, I am convinced that, except during WWII when such reporting was prohibited, such severe events would be reported and described in the press. With respect to catchment changes, assessing the relative magnitude of historical pluvial floods is the most problematic. Urban growth has increased impermeable area (likely to increase flood risk) but sub-surface drainage has been improved (likely to decrease flood risk). However, in extreme events such as those described, where the rainfall intensity is far in excess of the design capacity of drainage systems, sewers are surcharged and surface flows exceed gully capacity, in both historical and recent events. A fuller discussion can be found in Archer et al (2016).

The chronology has also assembled a time series showing the decadal variability of large hail in Southwest England and Northeast England (Fig. 2), which shows a similar time distribution to flash floods. It is probable that the less frequent reporting in recent decades of hail causing serious breakage of glass is due to the increased strength of standard glass panes, but the decline in other reports of large hail must reflect a real decline in occurrence. A similar pattern is reported for the whole of England, with decadal declines from a maximum around the turn of the 19th/20th century to a minimum occurrence in the 1970s (Webb et al. 2009).


Figure 2 Number of occurrences of large hail with and without reported extensive glass breakage for Southwest and Northeast England.

Chronologies of historical flash floods and occurrence of large hail for Northeast and Southwest England indicate strong natural variability, with the second half of the twentieth century showing the lowest frequency of such events. Unless we can explain the sources of such variability and incorporate them in models to project future incidence, we run the risk of serious underestimation, even without the expected increase in risk due to rising temperatures.



Archer (in press) Hail – historical evidence for influence on flooding, Circulation

Archer, D.R., Parkin, G. and Fowler, H.J. (in press, 2016) Assessing long term flash flooding frequency using historical information, Hydrology Research. doi: 10.2166/nh.2016.031

Kendon, E.J., Roberts, N.M., Fowler, H.J., Roberts, M.J., Chan, S.C. and Senior, C.A. (2014) Heavier summer downpours with climate change revealed by weather forecast resolution model, Nature Climate Change  4, 570–576 doi:10.1038/nclimate2258.

Webb, J.D.C., Elsom, D.M. and Meaden, G.T. (2009) Severe hailstorms in Britain and Ireland, a climatological survey and hazard assessment, Atmospheric Research 93,  587–606.

National flood modelling integration workshop held in Morpeth, Sept 2016

By Dr Geoff Parkin (Newcastle University)
17th October 2016

A workshop on modelling flooding from intense rainfall was held in Morpeth, Northumberland on 20-21 September 2016, with participants from the NERC FRANC, SINATRA and TENDERLY projects as well as local stakeholders with interests in flood risk assessment and response. Morpeth has a long history of flooding, with large events in 1963 following snowmelt, and in 2008 when 1000 properties were affected by a 1:137 year event with a peak flow of 360 m3/s.
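For readers unfamiliar with "1:137 year" notation: a return period of this kind is estimated by fitting an extreme-value distribution to the record of annual maximum flows at a gauge. A minimal, purely illustrative sketch using a Gumbel distribution fitted by the method of moments (the flow values below are made up; a real assessment uses the gauged record and more careful fitting):

```python
import math
import statistics

def gumbel_return_period(annual_maxima, flow):
    """Estimated return period (years) of `flow`, from a Gumbel
    distribution fitted to annual maximum flows by moments."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi   # Gumbel scale parameter
    mu = mean - 0.5772 * beta             # Gumbel location parameter
    cdf = math.exp(-math.exp(-(flow - mu) / beta))
    return 1.0 / (1.0 - cdf)              # T = 1 / annual exceedance probability
```

A classic property falls out of this: the mean annual maximum of a Gumbel distribution has a return period of roughly 2.33 years, which is why that figure appears so often in flood frequency work.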

The aim of the workshop was to develop an integrated modelling strategy to demonstrate end-to-end forecasting capabilities for a single location. This included assessing different modelling approaches for catchment and urban flood modelling; the sensitivity of river/stream flows and inundation in the Wansbeck catchment, local tributaries and town centre to theoretical patterns of convective and frontal storm event movement; and the effects of flooding from multiple sources.


An informative field trip was held on the first day, with attendees inspecting the £27M Morpeth flood alleviation scheme, including new and improved flood barriers in the town, the upstream storage reservoir dam and culverts, and ‘log-catcher’ poles which are designed to prevent impacts of woody debris on infrastructure in the town. This was followed by a visit to the contrasting Dyke Head site in the upper catchment, where a set of Natural Flood Management features have been installed demonstrating an alternative low-cost approach to reducing flood risk.


The main workshop discussions were held on the second day, in the ‘Glass Room’ of the Waterford Lodge Hotel in Morpeth. A structured modelling strategy was agreed, informed by approaches used in the Environment Agency/JBA’s Real-Time Flood Impact Mapping Project. Models used and developed within the research projects, and industry-standard models used by consultancies, are being applied at the full Wansbeck catchment scale, and at very high resolution in urban areas. Simulations are first being run for the 2008 flood event, with comparison against flood depths reconstructed using crowd-sourced information. We will then assess model performance in simulating flooding from multiple sources (fluvial and pluvial) for hypothetical extreme events with different spatial positioning over the area. Evidence from recent floods in Morpeth supports the wider understanding that flooding from rivers and from localised rainfall both have significant impact, but that their combined effects (e.g. when high river levels restrict discharge from storm drain overflows) can be locally complex. The expected outcome of the study is improved understanding of the capabilities of models used in flood response in the UK for simulating catchment and urban processes, specifically with respect to end-to-end modelling of flooding from multiple sources.


The afternoon session focussed on understanding more about the needs of communities and organisations for real-time flood risk information, as the first activity in Work Task 3.2 of the TENDERLY project. Representatives of first-responder organisations (Environment Agency, Northumbrian Water, Northumberland County Council) and flood-affected communities (Morpeth Flood Action Group, Northumberland Community Flood Partnership) provided a range of interesting perspectives on how information is used in the periods leading up to and during flood events. In the TENDERLY project, this will help to inform how to make better use of the methods developed in FRANC and SINATRA and of all sources of information, including improved forecasts of convective as well as frontal rainfall, real-time flood modelling outputs, and crowd-sourced information.

Geoff Parkin, Newcastle University

September Hot Spell Storms To An End

By Dr. Rob Thompson (University of Reading)
22nd September 2016

Last week, Reading was hit by a very memorable storm in the early evening of Thursday the 15th. Then overnight, in the early hours of the 16th, we were hit by another. Both storms were full of lightning, hail and flash flooding: the perfect candidates for an article on a flooding-from-intense-rainfall blog, you might be thinking… me too.

I’ll start with some local facts, figures and reports on the weather. While in Reading the storms were spectacular, the rainfall wasn’t massively high (29.6mm fell between the two storms); however, 72.2mm was recorded in Maidenhead (about 20km east) – the second heaviest daily fall in the area since 1942. Hail in Sindlesham (about 5km south-east of Reading) was reported to be 3.5cm across, and my south Reading home saw hail about 2cm in diameter (I suspect the largest I’ve seen in the UK). The storms marked the end of an unseasonably warm spell in the South-East: the temperature plummeted during the storm, dropping 8 degrees C in just 30 minutes. The first storm was triggered ahead of the cold front moving from the south, that front being the cause of the rather longer second storm. There was a lot of lightning in both storms, reaching 60 strikes per minute over the UK area at about 4am, as the map below shows:

Probably unsurprisingly for such an intense pair of storms, the impacts were felt by the people of the Reading area. Lightning struck a house in Caversham (north Reading), blowing a hole in the roof. But the biggest impact was flash flooding, which affected a number of roads in the area; this photo of Vastern Road (which passes under the railway) shows it filled with water, causing traffic problems near one of the bridges across the Thames, and reports say it became impassable for a while.

The next picture is of the Sainsbury’s car park in Winnersh (very near Sindlesham) which flooded, likely much of this was melted hail!

The Met Office “WOW” (Weather Observations Website, which uses crowd-sourced data) reports “Minor internal property damage and/or minor external damage to property or infrastructure” caused by the hazards “Flood, Hail” in the Winnersh area, though further details are unknown to me at the time of writing.

The storms were quite different, the first very localised, the second much more widespread, but still very variable spatially. We seem to have been lucky the impacts were not more serious.

SINATRA Researcher hacked the GloFAS

By Dr. Albert Chen (University of Exeter)
21st January 2016

The FFIR-SINATRA researcher Dr Albert Chen, at the Centre for Water Systems (CWS), University of Exeter, participated in FloodHack and won first prize.

FloodHack was a hackathon on the Global Flood Awareness System (GloFAS), held on 16 and 17 January 2016 at the European Centre for Medium-Range Weather Forecasts (ECMWF) in Reading, UK. It was organised by ECMWF, supported by the Copernicus Emergency Management Service programme, to look for innovative solutions to the existing challenges that GloFAS faces. More than 40 participants with a wide range of backgrounds (education, computer science, physics, hydrology, geography, etc.) attended and formed five teams to develop solutions.

(Photo courtesy: Silke Zollinger)

Dr Chen teamed up with three software developers, Miss Laura Ludwinski, Mr Sam Griffiths and Mr Paul Barnard from JBA Consulting, physicist Dr Peter Watson from Oxford University, and hydrologist Dr Li-Pen Wang from KU Leuven.

They developed the software LIVE (Logistic and Infrastructure Visual Evaluation) to summarise the detailed flood forecasting information from GloFAS into a ‘Time to respond’ map that gives decision makers a better understanding of the time available to act on flood mitigation. LIVE can also help prioritise resource allocation, so that the areas facing the most urgent flood threat receive immediate attention.
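The LIVE prototype itself is not described in detail here, so the following is only a hypothetical sketch of the "Time to respond" idea: scan a location's forecast flood-exceedance probabilities and report the first lead time at which a warning threshold is crossed, so that locations can be ranked by urgency.

```python
def time_to_respond(exceed_probs, lead_hours, p_threshold=0.5):
    """First forecast lead time (hours) at which the probability of
    exceeding a flood threshold crosses p_threshold, or None if never.

    A location with a small value needs attention first, so sorting
    locations by this number gives a simple response-priority map.
    """
    for t, p in zip(lead_hours, exceed_probs):
        if p >= p_threshold:
            return t
    return None
```

All names and the threshold value here are assumptions for illustration, not the actual LIVE implementation.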

(Photo courtesy: Florian Rathgeber)

The objective and the workflow were determined in the first round of group discussion. Each member of team LIVE then contributed their own skills and knowledge to complete the subtasks, including information collection and extraction, data processing and analysis, and visualisation. Python scripts and QGIS were the main tools the team adopted to develop the solution. After 27 hours of intense collaboration and countless cups of coffee and Red Bull, a prototype of LIVE was completed.

The outcome of each team was presented to all participants on the afternoon of the second day, and judged by a panel consisting of a professional software developer, a telecoms expert, an environment and technology consultant, a web technologist, and a crisis manager. The judging criteria were (1) potential for innovation, (2) relevance/usefulness, (3) technical merit, (4) design/user experience/polish, and (5) “wow” factor.

Five projects were presented:

  • FloodIT, which offers refined flood information based on GloFAS to help local users understand their situation.
  • The (flooded) Italian Job, which analysed big data to determine spatially varied flood warning thresholds for GloFAS.
  • LIVE, which provides “Time to respond” maps to help emergency management.
  • Interception, which adopted GloFAS as an educational platform to raise flood awareness.
  • GloFAQ, which identifies infrastructure at risk of flooding based on GloFAS.

The panel was impressed by team LIVE and their excellent application of the GloFAS data, which can potentially benefit global stakeholders with different needs, and by technology mature enough to make further applications achievable. As a result, team LIVE was announced the winner of FloodHack.

(Photo courtesy: Florian Rathgeber)

Dr Chen thanks his talented teammates, who successfully implemented the LIVE software. The knowledge he gained from the FFIR-SINATRA project proved a valuable input to the team in developing the application. The FloodHack experience will also help the FFIR team to integrate FRANC and SINATRA in the next stage of research.


Designing Convection-permitting Ensemble Forecasts

By David Flack (University of Reading)
18th January 2016

In my previous blog, written in the early stages of my PhD, I talked about the different convective regimes in which flash floods occur. My work has moved on a fair amount since then and I have started to look into ensemble forecasts of convective events. I have spent a fair amount of my PhD working out a design for a convection-permitting ensemble, so I thought I’d write a bit about the process to help show the uncertainties we currently face in predicting thunderstorms.

Now, ensembles (and their uses) have been covered a fair amount in this blog, as have advances in forecasting, where it was mentioned that probabilistic forecasts could be made from well-spread ensembles that take into account the true uncertainties in a forecast. But one of the key questions for convection-permitting ensemble research groups is: how do we represent the uncertainties?

This question has many answers; I suggest here a couple of ways in which we could look at the uncertainties, but first a brief reminder of what an ensemble is and how it works. Traditionally a model would be run with one realisation in a deterministic fashion, and that would be the forecast. However, if we were to nudge (perturb) the starting conditions, model physics or boundary conditions (or all three) of the run, we could (provided the ensemble is well-spread) create equally likely outcomes and hence a probabilistic forecast of whether or not it would rain tomorrow (Fig. 1).


Fig.1: Deterministic and ensemble forecast, dark red crosses show the starting and ending positions of the forecast and the bright red cross shows the truth. The dotted lines show the path of the forecasts and the red circles indicate the range of starting positions and possible forecasts.

So how can we try to take into account the uncertainties? Well to start to answer that we need to know what could be uncertain about a forecast, three things come to mind immediately – the location, the timing and the intensity of rainfall.

How can we take some of these things into account? Well, I mentioned earlier that we could change the initial and boundary conditions of the model. This could be done by time lagging, in which we look at previous forecasts and create an ensemble over the times that they all cover (Fig. 2). This may give an idea of when the convection could actually occur, and may even go some way to changing the position and/or intensity of the event.


Fig. 2: Time-Lagged ensemble schematic. For a forecast initiated at 00, 01 and 02 GMT we can create an ensemble for 2 – 4 GMT based on the data points shown between the two black lines.
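The pooling step in Fig. 2 can be sketched in a few lines: for each valid time in the overlap window, collect the rainfall values from every forecast run that covers it. The data layout here is a made-up illustration, not any operational format:

```python
def time_lagged_ensemble(runs, valid_window):
    """Pool successive forecast runs into an ensemble per valid time.

    runs: {init_hour: {valid_hour: rain_mm}} for each model run
    Returns {valid_hour: [rain_mm from each run covering that hour]}
    """
    ensemble = {t: [] for t in valid_window}
    for run in runs.values():
        for t in valid_window:
            if t in run:
                ensemble[t].append(run[t])
    return ensemble
```

Three runs initialised an hour apart then give three "members" at each hour they all cover, from which a rain probability can be estimated at no extra computational cost.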

We could also tweak the model physics slightly. We could do this by adding a field of random numbers into the model every so often and running these new numbers through the model; by using different parameterisations; or by using aspects of the behaviour of the event we are trying to forecast, such as adding stochastic noise into a process that is stochastic in nature.
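One very simple analogue of this kind of stochastic-physics perturbation (purely illustrative; operational schemes are far more sophisticated and physically constrained) is multiplying a model field by small random numbers:

```python
import random

def perturb_field(field, amplitude=0.01, seed=None):
    """Return a copy of a 2-D model field with multiplicative random
    noise of the given relative amplitude applied to every grid value."""
    rng = random.Random(seed)
    return [[v * (1.0 + rng.uniform(-amplitude, amplitude)) for v in row]
            for row in field]
```

Running the model once per perturbed copy yields a set of equally plausible realisations, which is exactly the raw material for the probabilistic forecasts described above.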

These suggestions are just a couple of ways of taking into account uncertainty. I am by no means claiming these are the best ways or the only ways, and they certainly do not take into account the full uncertainty in the atmosphere, but at least it’s a start in the right direction. However, these types of differences do produce different realisations of the atmosphere and hence different forecasts for rainfall events, so can be used in giving us probabilistic forecasts of flash floods and other events.

The next thing to concern us is how we actually interpret, communicate and verify probabilistic forecasts, but that is a completely different topic which I will not cover in this blog. However, to give you a clue: it takes more than one forecast to verify a probability.
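To give a flavour of that verification topic anyway: the Brier score, one standard measure for probabilistic forecasts of a binary event (rain/no rain), is just the mean squared difference between forecast probabilities and observed outcomes, and it only becomes meaningful when accumulated over many forecast-outcome pairs:

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and the
    observed binary outcomes (0 = event did not occur, 1 = it did).
    0 is a perfect score; a constant 0.5 forecast always scores 0.25."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)
```

This is why a single forecast cannot verify a probability: a 70% rain forecast followed by a dry day is not "wrong" on its own; only the score over many such cases tells you whether the probabilities were reliable.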

Flooding from Storm Desmond in Northumberland and Cumbria

By Dr Geoff Parkin (Newcastle University)
14th December 2015

The Sinatra Flood Action Team (FloAT) has been in action following the extensive flooding from Storm Desmond and the rain that both preceded and followed it.

The headline figure was the new highest ever UK 24-hour rainfall, with 341.4mm of rainfall reported for Honister Pass in Cumbria in the period up to 1800 GMT on 5 December 2015, and a new 48-hour record of 405mm also being recorded at Thirlmere in Cumbria [1]. Many records were also broken in river levels across the north. The combination of extreme rainfall and river levels and flows has led to significant geomorphological changes, which FloAT have been recording and surveying using dGPS and terrestrial laser scanning equipment.

The storm centre tracked to the north of the British Isles, with a long weather front remaining almost static across Scotland, bringing strong winds and persistent rainfall over a wide area of northern England and southern Scotland. The most intense rainfall was to the west of the Pennines, caused by strong orographic uplift: warm moist air was forced upwards by the hills, cooling and condensing its moisture into the droplets that formed the rainfall. To the east of the Pennines the opposite happened: the air mass (with reduced moisture) flowed downwards and warmed, which allowed it to retain its remaining moisture and suppressed rainfall. For Storm Desmond, this rain-shadow effect was particularly noticeable, with negligible amounts of rainfall recorded near the east coast.

In north-east England, preliminary analyses (to be confirmed) of rainfall and river flows have been made by the Environment Agency (see attached factsheet, provided with permission by the EA). These show clearly the record-breaking rainfall totals across the Pennines, with accumulations over 24 hours and above estimated to be well in excess of 1 in 100 year return periods (note that even these may be under-estimated, due to the high winds), and the rapid drop in rainfall totals towards the east.

In the Tyne catchment, river levels exceeded the previous maxima within the periods of record for the North Tyne and the main Tyne, including at the lowest non-tidal gauging station at Bywell (61 years of record). Comparison of the peak levels against historical records (as reported by David Archer) shows that levels in this event exceeded those of the 1815 flood by about 0.4m, but still did not reach those of the great flood of 1771, which caused the collapse of the original Tyne Bridge and resulted in several deaths [2]. A wider analysis of historical information (to be reported in a paper focussing on flash floods for Hydrology Research, currently under review) has shown the value of this type of historical evidence in understanding flood-rich and flood-poor periods.

Debris washed against the 19th century bridge across the River Tyne at Ovingham in Northumberland (Photo: Andy Russell).

There has been extensive damage to bridges across the whole area affected during this event. Along the River Tyne, the 19th century bridge at Ovingham had just been opened on the Thursday before the storm, following 18 months of refurbishment. Extensive build-up of debris against the scaffolding has caused it to be closed to traffic until its structural integrity can be checked. Debris during floods can cause structural damage to bridges through direct impact, as well as exacerbating scour of foundations.


Mixed large and small woody debris, brought downstream from riparian areas in the catchment and left on floodplains with thick deposits of silt (Photo: Andy Russell).

In Cumbria, the extensive damage in many cities, towns and villages and the further rainfall this week has meant that the priority has been on safety and recovery, and the FloAT field activities are always careful to recognise this and to carry out the research work in a sensitive and sympathetic way. However, they have been able to visit some of the worst affected areas, and have carried out surveys of wrack lines in Keswick, and of the debris fan developed downstream of Glenridding, where further rainfall during this week has caused more movement of material even during the recovery activities.


Mixed debris blocked against a hedge barrier. This can affect flood flow directions dynamically during events, as indicated by the flattened grass (Photo: Andy Russell)

In Keswick, the River Greta overtopped recently installed flood defences upstream of the A5271 bridge. At this location inundation depths and sediment loads greatly exceeded those of the November 2009 floods. In places, properties were inundated by flow depths of 1m.


Clear-up operation in Glenridding. Excavator is removing large amounts of sediment deposited upstream of the bridge in river channel and village centre (Photo: Andy Russell).

Glenridding was inundated with water and large volumes of coarse-grained sediment during Storm Desmond. A landslide further up the catchment translated into a hyperconcentrated flow, allowing further entrainment of sediment from the channel margins. The A592 bridge became choked with sediment, allowing several metres of sediment aggradation in the village centre, most notably around the Lake District National Park visitor centre. A large sediment fan was deposited downstream of the Glenridding Hotel by flows which burst from the walled main channel.


Large volumes of coarse-grained sediment deposited in the centre of Glenridding (Photo: Andy Russell).

Subsequent analysis of the data collected from this event will improve understanding of the role of debris and geomorphological change during extreme rainfall and flooding, and support our abilities both to model catchment and urban hydrology and to manage their impacts on infrastructure.

Steve Birkinshaw surveying the large debris fan downstream of Glenridding (Photo: Andy Russell). Laser scanning the Storm Desmond debris fan at Glenridding (Photo: Andy Russell).

Written by Geoff Parkin, Newcastle University, on behalf of FloAT (Matt Perks, Andy Russell, Andy Large) and David Archer, 14 December 2015.

A New Era in Forecasting

DSC02392 By Prof. Peter Clark (University of Reading)
10th August 2015

How do we measure progress? We often talk about a ‘quantum-leap’ (ironically, the smallest change known to physics), or a ‘step-change’. Both are meant to convey jumping to a new level, rather than just a gradual improvement. When the Met Office made their ‘convection-permitting’ model operational in 2009, they were making such a leap, but I prefer to think of it as entering a new era. We are at the start of a journey, not the end. We are trying to do something we have not done before. Our position on the journey to forecast reliably the convective storms responsible for intense rainfall is akin to where we were for synoptic-scale weather systems such as extra-tropical cyclones in the 1970s, when the idea of using numerical simulations of the physics, running faster than reality, to make predictions became routine.

In those days, our models grossly under-resolved the flows we needed to forecast. I have fond but frustrating memories from the early 1980s of using output from the regional ‘fine-mesh’ (75 km) model to plan research flights. Forecasting 24 hours ahead was a pretty hit and miss affair, especially if one needed the right airmass to be over the area one had clearance to fly in at the right time. The objects of interest (cyclones) were grossly under-resolved, with consequent systematic errors in timing and development. Major ‘busts’ happened, of course, such as the Great Storm of October 1987. Nevertheless, forecasts were useful. Over the last 40 or so years we’ve made steady progress and reached the point where not knowing the synoptic-scale situation quite accurately 2-3 days ahead is a rarity, and we often have a very good picture even further ahead.

None of this progress enabled us to predict the location and intensity of convective storms directly. It helped the forecasting process, of course, by telling us, with increasing accuracy, the broad regions where storms might occur, but no more, for the simple reason that the models were not designed to do so. They were not designed to ‘resolve’ the storms. Indeed, they were explicitly designed to prevent such storms from occurring: they recognise the atmospheric instability that produces convective clouds and remove it, in a way designed to mimic how convective clouds remove the instability.

We often talk about model resolution in the same way as we talk about the resolution (in megapixels) of our digital cameras. This may convey some sense of what is going on, but the idea that there is some underlying ‘full-resolution reality’ over which our model averages is not a helpful way to understand how the model works. We may be trying to predict such an average, but we have nothing to average over, and so have to predict how the average will change in the future knowing only our best estimate of the average now and without any knowledge of the complex flows, such as thunderstorms, happening within our ‘pixels’. This process is known as ‘parametrization’.
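The pixel analogy above can be made concrete with a toy calculation (entirely made-up numbers, not any real model field): block-averaging a storm-scale rainfall field onto a single coarse grid box shows how completely a convective core disappears from the average that a 25 km model carries.

```python
import numpy as np

# Toy illustration: a 25 km x 25 km grid box sampled at 1 km, with
# light rain everywhere and one intense 1 km convective core.
fine = np.full((25, 25), 0.5)   # background rain rate, mm/hr
fine[12, 12] = 80.0             # a single storm-scale peak

# A 25 km model carries only the grid-box average of this field.
grid_box_mean = fine.mean()

print(f"peak rate within the box: {fine.max():.1f} mm/hr")
print(f"grid-box average:         {grid_box_mean:.2f} mm/hr")  # ~0.63 mm/hr
```

The 80 mm/hr core contributes almost nothing to the box average, which is all the coarse model can ever know; the storm itself has to be supplied by parametrization, not by the resolved dynamics.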

A major consequence of this is that our models cannot always behave correctly. In particular, they fail to recognise the interaction between storm flows that can lead to organised and relatively long-lived systems of intense rainfall called ‘mesoscale convective systems’ or MCSs. These represent one of nature’s finest examples of order growing out of chaos.

For example, a small but intense MCS formed from isolated convective showers during the late afternoon and evening of 13th June 2014, propagating slowly south and finally crossing the south coast at about 0500 UTC (Fig. 1). By midnight it had formed the organised system shown in Fig. 2, which compares the radar-derived rainfall rate with forecasts of rainfall rate from the convection-permitting UKV (1.5 km) model and the convection-parametrizing Global (25 km) model. The latter, rather than forming an intense organised system, indicates convective showers which die out as the solar heating wanes.



Figure 1: Radar-derived rainfall from 13-14th June 2014 showing the evolution of an MCS over central Southern England.

So, when we made the decision to design weather forecast models that actually simulate thunderstorms, we entered a new era. This era has many similarities with the early days of numerical weather prediction (NWP). The objects of interest (thunderstorms) are grossly under-resolved, with consequent systematic errors in timing and development (Lean et al., 2008; Stein et al., 2015). Sometimes events are very poorly forecast. Nevertheless, forecasts are useful. We are beginning to learn when to rely on them and when not to.  We should hope and aim for future improvements, including those that will arise from the FRANC project. It is, perhaps, no coincidence that the first new forecast product to emerge from the new forecast system was the ‘Extreme Rainfall Alert’.


Figure 2: Mesoscale Convective System (MCS) over the UK represented by instantaneous rainfall rates (mm hr-1) for 0000 UTC 14th June 2014. (a) Radar-derived rain rates at 1 km resolution; (b) UKV MetUM T+9 forecast started at 1500 UTC 13th June 2014; (c) Global T+12 forecast started at 1200 UTC 13th June 2014.


There is, however, one difference in character we have to learn to appreciate fully. The lifetime of an extra-tropical cyclone is a couple of days; that of a thunderstorm is only a couple of hours. We might realistically expect to be able to follow the lifetime of one thunderstorm in a forecast model, but only if we can detect it early enough and persuade our model to start generating a storm in the right place at the right time. The perhaps 6-hour window in which we might detect a cyclone beginning to develop is equivalent to perhaps 15 minutes in which to see the early stages of a convective cloud.

At present we have remarkably little observational information to enable us to do this. Weather radar is our most powerful tool, and much of FRANC is devoted to making the most of the information we get from it. But most of the information we get from it relates to quite late in the lifecycle of storms. We need other information about clouds and their precursors for a realistic forecast system.

We can have little expectation of forecasting exactly what happens beyond, or even during, the lifetime of a single convective cloud; the knock-on effects of evaporation of cloud and rain, interaction with the surface etc. etc. are far too sensitive to predict exactly. (The structure of the MCS in Fig. 2 is not well-forecast 9 hours ahead). We have to make use of probabilistic forecasts, generally by means of ensembles of forecasts. The key requirement is that such ensembles accurately represent the true uncertainty in a forecast. At the same time, we will have to learn (and then teach others!) how to interpret and use what will, on the face of it, appear to be very low probabilities. There is a huge difference between predicting the probability of a 1 in 100 year event at a given place (e.g. a few streets in a town) and the probability of such an event, say, somewhere within 10 km of that place.
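The point-versus-area distinction above can be illustrated with a small ensemble sketch (synthetic numbers throughout; no resemblance to any operational ensemble is intended): for each member we ask whether an extreme accumulation occurs at one fixed grid cell, and whether it occurs anywhere within roughly 10 km of that cell. The area probability is necessarily at least as large as the point probability, and for localised convection it is usually far larger.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 50-member ensemble of rainfall accumulations (mm) on a
# 1.5 km grid covering a 30 km x 30 km domain.
n_members, nx, ny = 50, 20, 20
threshold = 60.0  # mm: a locally extreme accumulation

# Synthetic members: broad light rain, plus (in some members) one
# intense convective cell dropped at a random grid point.
rain = rng.gamma(shape=2.0, scale=3.0, size=(n_members, nx, ny))
for m in range(n_members):
    if rng.random() < 0.4:  # this member fires a storm somewhere
        i, j = rng.integers(nx), rng.integers(ny)
        rain[m, i, j] += rng.gamma(20.0, 4.0)

# Probability of exceeding the threshold at one fixed grid cell...
p_point = np.mean(rain[:, nx // 2, ny // 2] > threshold)

# ...versus anywhere within ~10 km of it (a block of 1.5 km cells).
half = 6  # 6 cells ~ 9 km each side
block = rain[:, nx // 2 - half : nx // 2 + half + 1,
                ny // 2 - half : ny // 2 + half + 1]
p_area = np.mean(block.max(axis=(1, 2)) > threshold)

print(f"P(exceedance at the point)  = {p_point:.2f}")
print(f"P(exceedance within ~10 km) = {p_area:.2f}")
```

Users of such forecasts therefore see tiny probabilities for "their street" alongside much larger probabilities for "their area", and both are correct: this is exactly the interpretation problem the paragraph above describes.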



Lean, H. W., Clark, P., Dixon, M., Roberts, N. M., Fitch, A., Forbes, R. and Halliwell, C. (2008) Characteristics of high-resolution versions of the Met Office unified model for forecasting convection over the United Kingdom. Monthly Weather Review, 136 (9). pp. 3408-3424. ISSN 0027-0644 doi: 10.1175/2008MWR2332.1

Stein, T., Hogan, R., Clark, P., Halliwell, C., Hanley, K., Lean, H., Nicol, J. and Plant, B. (2015) The DYMECS project: a statistical approach for the evaluation of convective storms in high-resolution NWP models. Bulletin of the American Meteorological Society. ISSN 1520-0477 doi: 10.1175/BAMS-D-13-00279.1