Development of High Resolution Global NWP Model

Ken-ichi Kuma

Numerical Prediction Division

Japan Meteorological Agency

1) Issues to be solved

The current JMA global model has a resolution of T213, which corresponds to a mesh size of about 60 km, while the regional model has a 20 km mesh. In order to provide forecasts from the global scale down to the regional scale with a unified model, we need to develop a very high resolution global model (about 20 km mesh). To realize such a high resolution global model, the following issues must be solved:

1 A semi-Lagrangian time integration scheme to improve computational efficiency.

2 Assessment of the computational requirements of a high resolution global spectral model.

3 High resolution global data, such as topography.

4 Physical processes that work properly with a 20 km mesh.

2) Current development

At the Japan Meteorological Agency, the development of a semi-Lagrangian global model is under way. We have completed the shallow water version of the semi-Lagrangian model and are now extending it to a 3D model. The semi-Lagrangian model is expected to be implemented in the operational NWP model in early 2003. In 2006, the eighth-generation supercomputer is expected to be introduced, on which we can run TL999 (triangular truncation at total wavenumber 999 with a linear grid), corresponding to a mesh resolution of 20 km or less.
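The correspondence between spectral truncation and mesh size quoted above can be checked with a simple estimate of the equatorial grid spacing: a quadratic transform grid (TN) uses roughly 3N+1 longitudes and a linear grid (TLN) roughly 2N+1. The following back-of-envelope sketch in Python (not part of the model code) reproduces the approximate figures of 60 km for T213 and 20 km for TL999.

    import math

    EARTH_CIRCUMFERENCE_KM = 2.0 * math.pi * 6371.0   # about 40,030 km

    def mesh_size_km(truncation, grid="quadratic"):
        """Approximate equatorial grid spacing for a given spectral truncation.

        A quadratic transform grid needs roughly 3N+1 longitudes, a linear grid
        roughly 2N+1; operational models round these up to FFT-friendly counts.
        """
        n_lon = 3 * truncation + 1 if grid == "quadratic" else 2 * truncation + 1
        return EARTH_CIRCUMFERENCE_KM / n_lon

    print(f"T213  (quadratic grid): {mesh_size_km(213):.0f} km")            # ~63 km
    print(f"TL999 (linear grid)   : {mesh_size_km(999, 'linear'):.0f} km")  # ~20 km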

Our semi-Lagrangian model will still follow the spectral formulation. Thus, the cost of the Legendre transform and of the data communication among computational nodes for the wave-grid transformation must be assessed in the real computational environment. Since we cannot run TL999 on the current JMA computing system, we use T426 resolution with 40 nodes and compare its performance with that of the operational T213 resolution.
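The reason this cost needs explicit measurement can be seen from an idealized scaling argument (a rough sketch under textbook assumptions, not the benchmark itself): the grid-point work grows with the square of the truncation while the time step shortens in proportion, whereas the direct Legendre transform costs O(N^3) per time step, so its share grows with resolution.

    def relative_cost(n, n_ref=213):
        """Idealized cost factors relative to T213 (scaling only, no constants)."""
        r = n / n_ref
        grid_point = r**2 * r   # r^2 more grid points, r times more (shorter) time steps
        legendre = r**3 * r     # O(N^3) transform per step, r times more steps
        return grid_point, legendre

    for n in (213, 319, 426):
        g, l = relative_cost(n)
        print(f"T{n}: grid-point work x{g:.1f}, Legendre transform x{l:.1f}")

By this estimate the transform cost grows twice as fast as the grid-point work between T213 and T426, which is why it has to be measured in the real computational environment rather than assumed.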

The experiment is conducted on Hitachi's SR8000E, which has 80 nodes with distributed memory (8 Gbytes each). Each node has a peak performance of 9.6 GFLOPS with 8 RISC-based CPUs. Fig. 1 shows how the computational cost increases with horizontal resolution. It should be noted that T213 is computed with 16 nodes, while T319 and T426 are computed with 40 nodes. The T426 model takes 9 times more total computational time than the T213 model. Considering that doubling the horizontal resolution requires 2x2x2 = 8 times more computation, the overhead due to the Legendre transform and communication is satisfactorily small at T426. It is also noted that the actual computational speed of the T213 model is about 16% of the theoretical peak performance, even including the time for I/O (about 2-3% of the total time). This computational performance is notably higher than that of other RISC-based machines.
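For reference, the sustained speed implied by these figures follows directly from simple arithmetic (a consistency check, not an additional measurement):

    peak_per_node_gflops = 9.6   # SR8000E node peak performance
    nodes_t213 = 16              # nodes used for the T213 run
    efficiency = 0.16            # about 16% of theoretical peak

    peak = peak_per_node_gflops * nodes_t213
    sustained = peak * efficiency
    print(f"T213 on {nodes_t213} nodes: peak {peak:.1f} GFLOPS, "
          f"sustained about {sustained:.0f} GFLOPS")   # 153.6 and about 25 GFLOPS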

Fig. 2 shows the ratios of execution time among the major components of the model. As the horizontal resolution increases, the share of the physical processes (PHYSCS) decreases while that of the dynamical processes (TNDNCY) increases. It should be noted that the moist process (GMOIST) includes both the moist physical processes and the wave-grid transformation. TINTGS includes the processes related to the semi-implicit time integration, which are computed in wave space.

High resolution global data include those used for the initial conditions and those used for the boundary conditions. For the former category, we need to collect satellite data at their original resolution. For instance, the ATOVS (Advanced TIROS Operational Vertical Sounder) data have a 17.4 km mesh resolution, which needs to be used for the 20 km mesh model. One of the most important boundary condition datasets is the global topography. For the future high resolution model, we have replaced the Navy 10-arc-minute data with the GTOPO30 30-arc-second data.
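The gain from this replacement is easy to quantify with a back-of-envelope conversion of the angular grid intervals to distance at the equator (the actual east-west spacing shrinks toward the poles):

    import math

    EARTH_RADIUS_KM = 6371.0

    def arcsec_to_km(arcsec):
        """Arc length at the equator corresponding to an angular grid interval."""
        return math.radians(arcsec / 3600.0) * EARTH_RADIUS_KM

    print(f"Navy 10-arc-minute data: {arcsec_to_km(10 * 60):.1f} km")   # ~18.5 km
    print(f"GTOPO30 30-arc-second  : {arcsec_to_km(30):.2f} km")        # ~0.93 km

The 10-arc-minute data are thus comparable to the 20 km target mesh itself, whereas the 30-arc-second data resolve the topography well below the model grid scale.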

Fig. 3 shows the topography and 24-hour precipitation for the T213 and T426 models. The difference in precipitation over the ocean is very small, suggesting the robustness of the cumulus parameterization across different resolutions. On the other hand, over the southern slope of the Tibetan Plateau, the precipitation pattern differs significantly. It is of interest to investigate whether this difference leads to a different general circulation through a feedback mechanism. We need to run the high resolution model over a long time range, which is feasible on the Earth Simulator.

The performance of the physical processes at the higher resolution has not been verified yet. However, our experience with the regional model (about 20 km mesh resolution) suggests possible problems. Our regional NWP model often exaggerates the intensification of extratropical cyclones. It is not clear whether the high resolution itself or the physical processes of the regional model cause the problem. In order to separate the two issues, we are planning to implement the physical processes of the global model in the regional model.

We have also increased the number of model vertical layers from 30 to 40 and raised the model top from 10 hPa to 0.4 hPa. One of the problems arising from this change is the strong wind at the top level, which reduces the time step of integration and thus increases the computation. Fig. 4 shows a histogram of how often each vertical level determines the CFL condition. In more than half of the cases, the highest level determines the CFL condition. This is partly explained by the fact that the wind speed in the upper stratosphere is high. However, in the Southern Hemisphere winter, the wind speed exceeds 250 m/s, which seems unrealistic.
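The effect of the top-level winds on the time step can be illustrated with the advective CFL condition, dt <= dx / |U|. The following sketch uses assumed wind speeds, not model output, with a mesh size close to T213:

    import numpy as np

    dx = 60.0e3   # grid spacing in metres, roughly the T213 mesh

    # Assumed maximum wind speed per level, model top first (illustrative values only).
    umax = np.array([250.0, 150.0, 100.0, 70.0, 50.0, 40.0])   # m/s

    dt_limit = dx / umax               # largest time step each level allows
    level = int(np.argmin(dt_limit))   # level that sets the overall CFL limit
    print(f"CFL-limiting level: {level} (0 = model top), dt <= {dt_limit[level]:.0f} s")
    print(f"Next most restrictive level would allow dt <= {np.sort(dt_limit)[1]:.0f} s")

A 250 m/s wind at the top level thus limits the time step to about 240 s at this mesh size, whereas the next level down would allow about 400 s; a semi-Lagrangian scheme relaxes this advective restriction, which is part of the motivation listed in section 1.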

Fig. 5 shows the monthly mean temperature at 1 hPa in July obtained from an 11-month integration of the T106L40 model. The temperature in the polar night region is unrealistically low (190 K) and close to the radiative equilibrium temperature. This implies that some dynamical processes are missing in the model in the polar night stratosphere.

3) Summary

JMA is planning to develop a very high resolution global model corresponding to a 20 km mesh. The feasibility of a global model with such high resolution must be examined from various aspects. We have already developed a high performance global model, and it needs to be run intensively in a high resolution environment.