High-speed mathematics: reducing the computation time for weather forecasting

By Sarah Dance

Several times a day, around 10 million observations of the atmosphere are processed by operational weather services to produce the next weather forecast. At the University of Reading, we have been using mathematics to understand and control the amount of computer time taken in the forecasting process.

Why?
In numerical weather prediction, heterogeneous observations are weighted according to their uncertainty to create our best estimate of the current state (winds, pressures, temperatures, moisture) across the globe. This process is called data assimilation. A computer model then solves equations based on physical laws to calculate the forecast from a few minutes to several days ahead. The amount of computer time taken in the data assimilation process is very important: a weather forecast that arrives after the weather has already happened is pretty useless!
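
For readers who like to see a formula, the weighting can be written down explicitly. In the usual textbook (variational) formulation of data assimilation, which this post does not spell out, the best estimate is the state x that minimises a cost function of the form

    J(x) = \tfrac{1}{2}\,(x - x_b)^{\mathrm{T}} B^{-1} (x - x_b)
         + \tfrac{1}{2}\,(y - H(x))^{\mathrm{T}} R^{-1} (y - H(x))

where x_b is the previous short-range forecast (the "background"), y is the vector of observations, H maps a model state to the observed quantities, and B and R are the error covariance matrices that do the weighting for the background and the observations respectively.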

How?
The data assimilation process uses weighting matrices that describe our knowledge of the uncertainty in the observations. We have shown how both the sensitivity of the data assimilation solution and the speed of the computer code in finding that solution depend on the mathematical properties of the weighting matrices.
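
To give a feel for why those mathematical properties matter for speed, here is a small illustrative sketch in Python (a toy problem, nothing like the operational code or the problem sizes involved in real forecasting): an iterative solver of the kind used in data assimilation needs far more iterations when a weighting matrix has widely spread eigenvalues, i.e. a large condition number, than when its eigenvalues are clustered.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200  # toy problem size (operational systems have ~10^8 unknowns)

    def spd_matrix(eigenvalues):
        """Symmetric positive definite matrix with prescribed eigenvalues."""
        q, _ = np.linalg.qr(rng.standard_normal((n, n)))
        return q @ np.diag(eigenvalues) @ q.T

    def cg_iterations(A, b, tol=1e-8, maxiter=5000):
        """Plain conjugate gradient; return the number of iterations to converge."""
        x = np.zeros_like(b)
        r = b - A @ x
        p = r.copy()
        rs = r @ r
        for k in range(1, maxiter + 1):
            Ap = A @ p
            alpha = rs / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol * np.linalg.norm(b):
                return k
            p = r + (rs_new / rs) * p
            rs = rs_new
        return maxiter

    # Two toy weighting matrices: eigenvalues clustered vs. widely spread out.
    R_good = spd_matrix(np.linspace(1.0, 2.0, n))    # condition number 2
    R_bad  = spd_matrix(np.linspace(1e-4, 2.0, n))   # condition number 2e4

    b = rng.standard_normal(n)
    for name, R in [("clustered eigenvalues", R_good),
                    ("spread-out eigenvalues", R_bad)]:
        print(f"{name}: condition number {np.linalg.cond(R):.1e}, "
              f"CG iterations {cg_iterations(R, b)}")

The clustered case should converge in a handful of iterations, while the spread-out case needs far more: the same kind of slow-down matters at operational scale.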

What now?
Observation uncertainty cannot be measured and must be estimated in a statistical sense. However, these estimated matrices may be noisy and require “cleaning up” before they can be used practically. Our results could be used to inform this clean-up process and, in turn, reduce the computational time taken for data assimilation.
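
As a purely illustrative example of what “cleaning up” can involve, one simple recipe is to raise the smallest eigenvalues of the estimated covariance matrix so that its condition number stays below a chosen threshold. The Python sketch below is a generic version of that idea, not the procedure used by any particular forecasting centre:

    import numpy as np

    def recondition(C, max_cond):
        """Raise the smallest eigenvalues of a covariance estimate C so that
        its condition number does not exceed max_cond."""
        eigvals, eigvecs = np.linalg.eigh(C)       # C is symmetric
        floor = eigvals.max() / max_cond           # smallest eigenvalue allowed
        eigvals_clipped = np.maximum(eigvals, floor)
        return eigvecs @ np.diag(eigvals_clipped) @ eigvecs.T

    # Example: a noisy sample covariance estimated from too few samples.
    rng = np.random.default_rng(1)
    samples = rng.standard_normal((60, 50))        # 60 samples of a 50-dimensional error
    C_noisy = np.cov(samples, rowvar=False)

    C_clean = recondition(C_noisy, max_cond=100.0)
    print("condition number before:", np.linalg.cond(C_noisy))
    print("condition number after: ", np.linalg.cond(C_clean))

Capping the condition number in this way trades a little fidelity in the estimated statistics for a much better behaved, and therefore faster, calculation; results like those in the reference below help to make that trade-off precise.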

Reference
Tabeart, J. M., Dance, S. L., Haben, S., Lawless, A., Nichols, N. and Waller, J., 2018. The conditioning of least squares problems in variational data assimilation. Numerical Linear Algebra with Applications. (In Press)
