PDAF - THE PARALLEL DATA ASSIMILATION FRAMEWORK: EXPERIENCES WITH KALMAN FILTERING
Data assimilation with advanced algorithms based on the Kalman filter and large-scale numerical models is computationally extremely demanding. This motivates the parallelization of the assimilation problem. In addition, implementing a data assimilation system on the basis of an existing numerical model is complicated by the fact that such models are typically not prepared for use with data assimilation algorithms. To facilitate the implementation of parallel data assimilation systems, the parallel data assimilation framework PDAF has been developed. PDAF allows an existing numerical model to be combined with data assimilation algorithms, such as statistical filters, with minimal changes to the model code. Furthermore, PDAF supports the efficient use of parallel computers by creating a parallel data assimilation system. Here the structure and capabilities of PDAF are discussed, together with the application of filter algorithms based on the Kalman filter. Data assimilation experiments show an excellent parallel performance of PDAF. © 2005 World Scientific Publishing Co. Pte. Ltd.
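The filter algorithms referred to above are ensemble-based variants of the Kalman filter. As a rough illustration of the analysis step such filters perform (a generic stochastic ensemble Kalman filter update, not PDAF's actual implementation — the function name and all variables here are hypothetical), the ensemble of model states can be corrected toward the observations as follows:

```python
import numpy as np

def enkf_analysis(ens, obs, H, R, rng):
    """Stochastic EnKF analysis step (generic sketch, not PDAF code).

    ens : (n, N) array, ensemble of N model states of dimension n
    obs : (m,) observation vector
    H   : (m, n) observation operator (linear, for simplicity)
    R   : (m, m) observation error covariance
    rng : numpy random Generator for observation perturbations
    """
    n, N = ens.shape
    m = obs.size
    # Project the ensemble into observation space
    Hx = H @ ens                                   # (m, N)
    # Ensemble perturbations around the mean
    A = ens - ens.mean(axis=1, keepdims=True)      # (n, N)
    HA = Hx - Hx.mean(axis=1, keepdims=True)       # (m, N)
    # Sample covariances estimated from the ensemble
    PHt = A @ HA.T / (N - 1)                       # cross-covariance (n, m)
    S = HA @ HA.T / (N - 1) + R                    # innovation covariance (m, m)
    # Kalman gain K = P H^T S^{-1}
    K = np.linalg.solve(S, PHt.T).T                # (n, m)
    # Perturb the observations once per member (stochastic EnKF)
    obs_pert = obs[:, None] + rng.multivariate_normal(np.zeros(m), R, N).T
    # Update every ensemble member with the Kalman gain
    return ens + K @ (obs_pert - Hx)
```

With a small observation error covariance, the analysis ensemble mean is drawn close to the observed values while the ensemble spread carries the remaining uncertainty; in a framework like PDAF, such an update would be invoked between model time steps, in parallel over the ensemble.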