ESS Project FY95 Annual Report: Applications Software

Development of an Earth System Model (ESM): Atmosphere/Ocean Dynamics and Tracer Chemistry

Objective: The goal of this project is to develop a state-of-the-art model that describes the coupled global atmosphere-global ocean system, including chemical tracers that are found in, and may be exchanged between, the atmosphere and the oceans. This ESM will be applied to problems of climate change. The ESM code will be highly parallelized both within and among components, have the ability to archive data via a Database Management System (DBMS), and be linked to a system that allows for the remote visualization of model output in real time. The ultimate goal of the project is to have an ESM capable of performing ensembles of century-long simulations to study the impact on climate of increasing concentrations of greenhouse gases in a wall-clock time comparable to that required by the atmospheric general circulation model (AGCM) alone.

Approach: There are four major elements in the ESM: an AGCM, an oceanic general circulation model (OGCM), an atmospheric chemical tracer model (ACTM), and an oceanic chemical tracer model (OCTM). Parallel versions of the AGCM, OGCM, and ACTM have been developed, and the individual codes are being coupled with emphasis on preserving parallelism. A software system for on-line distributed application management and data visualization is also part of the project.

Accomplishments: Improved the peak performance of the AGCM code on a CRAY T3D (256 processors) to 2.6 GFLOPS, with the AGCM/Physics part of the code executing at 3.4 GFLOPS. The peak performance obtained using 144 processors of an IBM SP-2 is approximately the same. The parallel version of the GATOR/ACTM code has achieved a preliminary performance of 0.54 GFLOPS using 32 processors of an IBM SP-2. Communication interface routines have been developed to allow coupling among the AGCM, OGCM, and ACTM. Development of an OCTM that applies the techniques of the GATOR (Gas, Aerosol, Transport, and Radiation Chemistry) ACTM code in an ocean-model environment is nearing completion. A working prototype of a system that archives ESM output via a "data broker" in a standard format and catalogues descriptive information about the output in an object-relational database management system has been completed. Using this prototype, we demonstrated for the first time simultaneous archival storage from a running AGCM directly into a database (February 1995). A performance model that predicts the number of wall-clock seconds required to simulate one day as a function of model parameters, architecture parameters, data layout, and algorithm was designed and used to identify potential bottlenecks, choose among design alternatives, and tune performance.
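An analytical performance model of this kind can be sketched as follows. This is a minimal illustration only: the parameter names, cost terms, and example values below are assumptions for exposition, not the project's actual model.

```python
# Minimal sketch of an analytical performance model: wall-clock seconds
# per simulated day as a sum of computation and communication terms.
# All parameter names, cost formulas, and values are illustrative assumptions.

def seconds_per_simulated_day(
    grid_points,        # total grid points in the model domain
    flops_per_point,    # floating-point operations per grid point per step
    steps_per_day,      # model timesteps per simulated day
    n_procs,            # number of processors
    proc_gflops,        # sustained GFLOPS per processor
    halo_bytes,         # bytes exchanged per processor per step
    bandwidth_mbs,      # interconnect bandwidth per processor, MB/s
    latency_s,          # per-message latency, seconds
    msgs_per_step=4,    # halo-exchange messages per processor per step
):
    compute = grid_points * flops_per_point / (n_procs * proc_gflops * 1e9)
    comm = msgs_per_step * latency_s + halo_bytes / (bandwidth_mbs * 1e6)
    return steps_per_day * (compute + comm)

# Quadrupling the processor count shrinks the compute term but leaves the
# per-step communication cost fixed, so the speedup is sublinear -- the kind
# of bottleneck such a model is designed to expose.
t64 = seconds_per_simulated_day(2.0e6, 5000, 240, 64, 0.02, 2.0e5, 50, 1e-4)
t256 = seconds_per_simulated_day(2.0e6, 5000, 240, 256, 0.02, 2.0e5, 50, 1e-4)
```

A model of this form also makes the effect of data layout explicit, since the halo-exchange volume and message count depend on how the grid is decomposed among processors.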

Demonstrated that improper simulation of the stratus clouds observed off the coast of Peru is a major cause of flaws in coupled atmosphere-ocean GCM simulations. Preliminary results from a detailed intercomparison of simulations with the coupled GCMs at Princeton and at UCLA, which share the same OGCM, indicated an apparent lack of agreement in the equatorial currents between the two simulations and a proxy for observations (National Meteorological Center analyses). Tests of the multiple-cloud-base version of the Arakawa-Schubert parameterization were completed. Preparations are underway to test within the GCM a version of the convection parameterization that uses the fractional entrainment rate, rather than the cloud-top level, as the cloud-type parameter.

Significance: Computer simulations using ESMs address fundamental issues that affect the environment and epitomize the challenge that the Earth sciences pose to computer technology. These models can be used to study nonlinear interactions and feedbacks between components of the atmospheric and oceanic circulations. Examples of problems that may be addressed with ESMs are El Niño-Southern Oscillation events, the role of the oceans in moderating the greenhouse warming effect of carbon dioxide and other gases, and the behavior of the ozone layer.

Status/Plans: Current optimization efforts for the AGCM code center on the high-latitude filtering in the AGCM/Dynamics portion of the code. We are implementing a scheme to balance the load associated with this filtering evenly among the processors. When this is complete, we will address load imbalances in the AGCM/Physics part and single-node optimization. We plan to improve the performance of the chemistry model components on the SP-2 by improving the load balance of the ODE solver. We are investigating the code's performance on other architectures (such as the CRAY T3D). In addition, we are continuing work on coupling GATOR/ACTM to the other components of the ESM, beginning with the AGCM and DBMS. We have determined that the specification language for SUN RPC would suffice as a data description language for the message-passing capabilities of the standard software (i.e., PVM or MPI) used for scientific applications on MPPs. We have constructed a stub compiler, which takes a standard description of these messages and generates a sequence of PVM calls to both encode and decode messages.
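The idea behind such a stub compiler can be sketched as follows. The message-description format, structure names, and generated routine names here are hypothetical illustrations; only the PVM packing primitives referenced in the generated C (pvm_pkint/pvm_upkint, pvm_pkdouble/pvm_upkdouble) are real PVM 3 calls.

```python
# Sketch of a stub compiler: given a message description (an assumed,
# RPC-like list of typed fields), emit C pack/unpack routines built from
# PVM 3 packing primitives. The field list and routine names are
# illustrative; the PVM calls (signature: type*, nitem, stride) are real.

PVM_PACK = {"int": "pvm_pkint", "double": "pvm_pkdouble"}
PVM_UNPACK = {"int": "pvm_upkint", "double": "pvm_upkdouble"}

def generate_stubs(msg_name, fields):
    """fields: list of (c_type, field_name, count) tuples."""
    pack = [f"void pack_{msg_name}(struct {msg_name} *m) {{"]
    unpack = [f"void unpack_{msg_name}(struct {msg_name} *m) {{"]
    for c_type, name, count in fields:
        # Arrays decay to pointers; scalars need an explicit address-of.
        addr = f"m->{name}" if count > 1 else f"&m->{name}"
        pack.append(f"    {PVM_PACK[c_type]}({addr}, {count}, 1);")
        unpack.append(f"    {PVM_UNPACK[c_type]}({addr}, {count}, 1);")
    pack.append("}")
    unpack.append("}")
    return "\n".join(pack), "\n".join(unpack)

# Example: a hypothetical surface-flux message exchanged by the coupler.
pack_src, unpack_src = generate_stubs(
    "sst_flux", [("int", "timestep", 1), ("double", "flux", 4096)]
)
```

Generating the encode and decode routines from a single description keeps the two sides of each coupler message consistent by construction, which is the main benefit over hand-written packing code.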


Figure 1 Caption: Potential vorticity in the middle stratosphere (approximately 30 km altitude) from a simulation of a major stratospheric sudden warming using a stratospheric version of the UCLA parallel AGCM code.

Figure 2 Caption: January mean sea level pressure (millibars) from a simulation with the parallel version of the UCLA AGCM using envelope orography and a new gravity wave drag parameterization.

MPEG Movie (561 Kbytes) Caption: This movie shows the variation in computation times for the atmospheric chemistry at different locations on the globe for an 8 x 4 decomposition of the GATOR/ACTM code on the IBM SP-2. Each column represents the time required by one processor to complete a single model timestep; the sequence covers a period of several simulated days. The load is heaviest in summer and in areas of daylight.




Points of Contact:

Prof. Carlos R. Mechoso
Dr. John D. Farrara
Department of Atmospheric Sciences
University of California, Los Angeles
mechoso@atmos.ucla.edu, 310-825-3057
farrara@atmos.ucla.edu, 310-825-9205

Prof. James Demmel
Computer Science Division and Dept. of Mathematics
University of California, Berkeley
demmel@cs.berkeley.edu
510-643-3586
URL: http://www.cs.berkeley.edu/~demmel

