The “size” of the NWP/DA problem

By Javier Amezcua

There is a professor at the University of Reading who likes to say that the Data Assimilation (DA) problem in Numerical Weather Prediction (NWP) is larger than the size of the universe (estimated to contain around 10^80 atoms, give or take). I thought it would be interesting to explain what he means by that and how one arrives at such an assertion. While doing this, I will also discuss some basic mathematical concepts that are involved in the combined NWP/DA problem.

First we have to understand NWP. In 1904 Vilhelm Bjerknes – one of the fathers of meteorology – stated in a seminal paper that predicting the weather is “simply” a physics problem that can be solved mathematically. After all, the atmosphere is a fluid (made of nitrogen, oxygen, argon, water vapour, etc.) that sits on top of a rotating sphere (more or less) in which we happen to live. Motion in this fluid is driven by solar radiation, constrained by the rotation of the Earth (effects such as the Coriolis ‘force’ and Taylor columns come to mind), and subject to boundary conditions (the ocean and land). If we consider the variables velocity (a 3-dimensional variable), air density, humidity, temperature and pressure, then the evolution of the atmosphere is governed by the equations in Figure 1. Although they seem complicated, these expressions come from simple basic principles: conservation of mass, energy and momentum, as well as an equation of state (perhaps this is the least intuitive one).

Figure 1: Equations governing the evolution of the variables in the atmosphere.

The problem does not seem too big so far, right? Seven equations for seven variables (remember that velocity has three components). These variables, however, depend on one time component and three space components (latitude, longitude and height), i.e. these are spatial fields. Six of them are partial differential equations, i.e. they contain derivatives both in space and time.

“Solving” the equations in Figure 1 means performing the time-integration of the seven spatial fields. This is not an easy task. Simplifications can be made (e.g. by considering limits, equilibrium cases, scale analysis), and one can reach some analytical solutions for the simplified problems. However, in the general case (which is the one used in NWP) the equations have to be solved numerically. This requires both spatial and temporal discretisation. For the spatial part, imagine the atmosphere as a hollow shell, and imagine that we divide this shell into small boxes (recall this is a three-dimensional problem). The smaller the boxes, the more accurate our solution.

In a simple latitude-longitude grid, the number of boxes (called gridpoints) is

Nboxes = Nlatitudes × Nlongitudes × Nlevels.

And since in each of the boxes we have seven variables we would have the following number of effective variables:

Nvariables = Nboxes × 7.

Let’s say we discretise every degree in both longitude and latitude, and we assign 20 vertical levels. This already results in Nvariables ~ 10^7 (in operational centres, which use much finer grids, this number is closer to 10^9). Suddenly we have gone from 7 fields to about 10^7 grid variables! On top of that, these variables evolve in time.
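This back-of-the-envelope count is easy to verify. A short sketch, using the grid sizes from the text above:

```python
# Count of grid variables for a 1-degree latitude-longitude grid
# with 20 vertical levels (the example values from the text).
n_lat, n_lon, n_levels = 180, 360, 20
n_fields = 7  # three velocity components, density, humidity, temperature, pressure

n_boxes = n_lat * n_lon * n_levels
n_variables = n_boxes * n_fields

print(f"N_boxes     = {n_boxes:,}")       # 1,296,000
print(f"N_variables = {n_variables:,}")   # 9,072,000 ~ 10^7
```

Refining the grid to the ~10 km spacing used operationally pushes the count up by roughly two more orders of magnitude.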

Now, let us think of the kind of system that we are dealing with. As I have discussed in an older post in this same blog, the atmosphere belongs to a class of dynamical systems known as chaotic. In these systems, a single evolution does not really mean much after a given lead-time, since tiny perturbations in the initial conditions can yield completely different trajectories. The dream would be to evolve probability density functions and update them with new observations when these are available. This is the ultimate (yet perhaps unattainable) objective of DA.

As a mental exercise, let us think of constructing the pdfs empirically (histograms) for the variables of the atmosphere at a given time. For simplicity let’s say we divide the range of each variable into 10 bins (not very high resolution, but this will do). For each bin there is a frequency count. For two variables we would have 10^2 bins, and it would look something like Figure 2. Can you imagine a histogram in three or more dimensions? (I cannot).
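A bivariate histogram like Figure 2 takes a few lines to build. A minimal sketch, assuming two correlated Gaussian variables as stand-ins for atmospheric quantities:

```python
import numpy as np

# Empirical bivariate histogram: 10 bins per variable, so 10^2 = 100
# frequency counts for just two variables. The Gaussian samples here
# are illustrative stand-ins for two atmospheric variables.
rng = np.random.default_rng(42)
cov = [[1.0, 0.6], [0.6, 1.0]]  # assume the two variables are correlated
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=10_000).T

counts, xedges, yedges = np.histogram2d(x, y, bins=10)

print(counts.shape)        # (10, 10): one frequency count per bin
print(int(counts.sum()))   # 10000: every sample falls into some bin
```

Each extra variable multiplies the number of bins by 10, which is exactly where the trouble starts.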

Figure 2: A 2-dimensional (bi-variate) histogram (centre) with 1-dimensional histograms in the marginal axes. (Source: Python documentation)

Now, let us go back to our problem. How many frequency counts do we need to store in order to have an empirical probabilistic view of the atmosphere? Nbins = 10^Nvariables. If we go back to our calculations we would get: Nbins = 10^10,000,000 … which is colossal! If we stored a number in each and every atom of the universe, we would still have a huge quantity of numbers without a place to be stored. This is what the original assertion refers to. A more precise way of saying it would be: “an empirical probabilistic representation of the discretised state of the atmosphere is bigger than the size of the universe”.
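Because the numbers are too large to write out, the comparison is easiest in logarithms. A sketch of the counting argument, using the figures from the text:

```python
from math import log10

# Number of histogram bins vs atoms in the observable universe.
n_variables = 10**7        # discretised atmospheric variables (from the text)
bins_per_variable = 10

# N_bins = 10^N_variables, so log10(N_bins) = N_variables * log10(10).
log10_n_bins = n_variables * log10(bins_per_variable)  # 10,000,000
log10_n_atoms = 80                                     # universe ~ 10^80 atoms

print(f"N_bins  = 10^{log10_n_bins:.0f}")
print(f"N_atoms = 10^{log10_n_atoms}")
print(log10_n_bins > log10_n_atoms)  # True, by a factor of 10^125,000 in the exponent alone
```

The exponent of Nbins is itself a seven-digit number, while the universe only offers an exponent of 80.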

You may be thinking: can the problem be simplified? Perhaps. For instance, if the distribution of the variables is a joint multi-variate Gaussian (a common assumption), then this distribution is totally determined by a mean vector and a covariance matrix. They are still massive entities: the vector has 10^9 elements and the covariance matrix has 10^18 elements. Imagine storing these elements and doing computations with them … well, this is what we do in supercomputers. But this approximation (Gaussianity) is sometimes out of its depth, so we are always looking for ways to improve our probabilistic representation.
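Even the Gaussian simplification is heavy. A rough storage estimate, assuming the operational state size above and standard 8-byte floating-point numbers:

```python
# Storage cost of a Gaussian representation of the atmospheric state,
# assuming n ~ 10^9 state variables and 8 bytes per floating-point value.
n = 10**9
bytes_per_float = 8

mean_bytes = n * bytes_per_float      # the mean vector: n elements
cov_bytes = n * n * bytes_per_float   # the full covariance matrix: n^2 elements

print(f"mean vector : {mean_bytes / 1e9:.0f} GB")   # ~8 GB
print(f"covariance  : {cov_bytes / 1e18:.0f} EB")   # ~8 exabytes
```

The mean fits on a laptop disk; the full covariance does not fit anywhere, which is why operational DA systems never store it explicitly and instead work with low-rank or modelled approximations.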
