The secret to better weather forecasts. Data assimilation has been key to improving weather forecasts over the last 20 years, and yet it's not a well-known technique. The weather forecasts that we use every day come from a three-dimensional computer model of the atmosphere. This simulated atmosphere is described by computer code and obeys the laws of physics, stepping forward in time to move virtual air around a virtual globe and predict future weather. The better the starting conditions are described, the better the forecast, and it is this 'initial value' problem which data assimilation tackles. As weather forecasts run, they tend to drift from reality.
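The drift the narration describes can be seen with a toy model. The sketch below is not from the course: it uses the classic Lorenz-63 system as a stand-in for a real atmospheric model (and assumes NumPy is available), stepping two forecasts forward from almost identical initial values and printing how quickly they diverge.

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Tendency of the Lorenz-63 toy 'atmosphere' (a stand-in for a weather model)."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def step(state, dt=0.01):
    """Advance the toy model one time step (fourth-order Runge-Kutta)."""
    k1 = lorenz(state)
    k2 = lorenz(state + 0.5 * dt * k1)
    k3 = lorenz(state + 0.5 * dt * k2)
    k4 = lorenz(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

truth = np.array([1.0, 1.0, 1.0])              # the 'real' atmosphere
forecast = truth + np.array([1e-3, 0.0, 0.0])  # almost, but not quite, the right start

for n in range(1, 1501):
    truth = step(truth)
    forecast = step(forecast)
    if n % 500 == 0:
        print(f"step {n:4d}: forecast error = {np.linalg.norm(forecast - truth):.3f}")
```

A tiny error in the starting values grows as the model runs, which is why the initial values matter so much.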

Today's forecast for tomorrow should be more accurate than today's forecast for next week. The main way to correct this drift is to run a new forecast. Starting the new forecast includes the most recent observations, which pulls the initial conditions back nearer to reality. We improve the next forecast by resetting the initial values. Millions of new observations are included each time a weather forecast is run. The observations come from a range of sources, including buoys, airports, aircraft and weather radar, and satellites are vital for global coverage, especially over the vast and mostly data-sparse oceans. The role of data assimilation is to combine the observations with the weather forecast model in a way that optimises the information from both.

Data assimilation is a mathematical technique that provides the weather model with the best starting values for the next time it runs. The weather model already has a starting point, carried over from the previous time it ran, but that solution isn't perfect. It wasn't perfect in the first place, and over just a few hours it has diverged further from reality. Data assimilation inserts the new observations into the model, and it does this carefully, mathematically persuading the model to accept the new view of the world provided by those current observations. Both the model and the observations have some imprecision, and real-life observations of the weather are quite sparse across the world.
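At its simplest, this careful blending weights the model value and the observation by how imprecise each is. The sketch below is a minimal illustration of that idea at a single point, with made-up numbers rather than anything operational: the more uncertain the model background, the further the blended 'analysis' moves towards the observation.

```python
def analysis(background, observation, background_var, obs_var):
    """Blend a model value and an observation, weighting each by its error variance."""
    gain = background_var / (background_var + obs_var)    # how much to trust the observation
    value = background + gain * (observation - background)
    variance = (1.0 - gain) * background_var              # the blend is more certain than either input
    return value, variance

# Illustrative numbers only: the model background says 14.0 C at a grid point,
# while a nearby buoy, assumed to have a quarter of the error variance, reports 15.2 C.
value, variance = analysis(background=14.0, observation=15.2,
                           background_var=1.0, obs_var=0.25)
print(f"analysis: {value:.2f} C with error variance {variance:.2f}")
```

With these numbers the analysis lands at about 14.96 C, closer to the more precise observation, and its error variance is smaller than that of either input.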

The skill of data assimilation is to optimise a combination of the two. And this combination has to happen in a short time, to get on with the business of running the weather model itself and getting the forecast out promptly.

Uncovering the hidden talents of data assimilation is an area of research and commercial interest. Data assimilation is a general technique for optimally blending observations into a physical model: a three- or four-dimensional representation of a real system, which may be anything from the weather to an oil field, managing traffic flow, or guiding autonomous vehicles. Data assimilation is the diplomacy and persuasion behind weather forecasts, encouraging the best solution that can be obtained by making the observations and the model work together.

Data assimilation

We’ve talked about the sources of big data and, in Step 1.4, we touched on the needs of weather forecasting: the minute-by-minute big data demands of gathering observations and producing forecasts. A technique which is fundamental to weather forecasting, and is becoming widespread in many other areas of environmental modelling, is data assimilation.

Data assimilation improves predictions from large and complex forecasting models by combining uncertain model predictions with a diverse set of observational data in a dynamic feedback loop. Data assimilation is a formidable big data problem: we need to integrate massive datasets efficiently and accurately, and this is limited by computing power. As the animation describes, the aim of data assimilation is to combine real-world observations with a numerical model. This can’t be done in one step; it requires many iterations of small steps. The more powerful the computer, the faster the steps can be completed and the more complete the convergence of observations and model data. For routine weather forecasts, run several times per day, supercomputers are required both to assimilate around 10 million observations and to run a new forecast for the coming days; the data assimilation and the forecast run together take just a couple of hours.
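To make those 'iterations of small steps' concrete, the sketch below shows one common variational formulation (3D-Var) on a toy problem of three temperatures and two observations. The numbers, the observation operator and the simple gradient-descent minimiser are all illustrative assumptions, not the scheme used operationally: each iteration nudges the state a little closer to the best compromise between the model background and the observations.

```python
import numpy as np

xb = np.array([14.0, 15.0, 13.5])        # background: the model's prior guess at three points
y = np.array([14.6, 13.1])               # two observations
H = np.array([[1.0, 0.0, 0.0],           # observation operator: obs 1 sees point 1 directly,
              [0.0, 0.5, 0.5]])          # obs 2 sees the average of points 2 and 3
B_inv = np.linalg.inv(1.0 * np.eye(3))   # inverse background-error covariance (variance 1.0)
R_inv = np.linalg.inv(0.25 * np.eye(2))  # inverse observation-error covariance (variance 0.25)

def cost(x):
    """J(x): misfit to the background plus misfit to the observations."""
    db, do = x - xb, y - H @ x
    return db @ B_inv @ db + do @ R_inv @ do

def gradient(x):
    return 2.0 * B_inv @ (x - xb) - 2.0 * H.T @ R_inv @ (y - H @ x)

x = xb.copy()                 # start from the background...
for _ in range(100):          # ...and take many small steps downhill
    x -= 0.05 * gradient(x)

print(f"cost at background:    {cost(xb):.2f}")
print(f"cost after iteration:  {cost(x):.2f}")
print("analysis state:", np.round(x, 2))
```

In an operational system the state has hundreds of millions of values and there are millions of observations, which is why the same idea demands a supercomputer and a carefully limited number of iterations.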

Find out more about DARE, a University of Reading project which aims to produce a step-change in the skill of forecasts of urban natural hazards such as floods, snow, ice and heat stress. For a broader perspective on data assimilation research and training opportunities, visit the Data Assimilation Research Centre (DARC) pages.


This video is from the free online course 'Big Data and the Environment' by the University of Reading.