Grand Challenge Applications and Enabling Scalable Computing Testbed in Support of High Performance Computing:
Title: Fiscal Year 1997 Annual Report
Agreement Number: NCCS5-150

P.M. Lyster, J.W. Larson, W. Sawyer, C.H.Q. Ding1, L.-P. Chang2, R. Ménard3, R. Rood4, S. E. Cohn4

University of Maryland Earth Systems Science Interdisciplinary Center*

Email to lys@dao.gsfc.nasa.gov
http://dao.gsfc.nasa.gov/DAO_people/lys

Additional Affiliations
1. Lawrence Berkeley National Laboratory, Berkeley, CA 94720
2. General Sciences Corporation, a subsidiary of Science Applications International Corporation
3. University of Maryland at Baltimore County, Joint Center for Earth Systems Technology
4. NASA/GSFC Data Assimilation Office
* Changed name from: Joint Center for Earth Systems Science

Abstract:

This is the second annual report for the testbed project that has arisen out of the Cooperative Agreement (NCCS5-150): Grand Challenge Applications and Enabling Scalable Computing Testbed(s) in Support of High Performance Computing. It covers the period October 1, 1996 to September 30, 1997, and should be read as a follow-on to the annual report for 1996. This is a collaborative project involving scientists from the University of Maryland Earth Systems Science Interdisciplinary Center (ESSIC), the NASA/Goddard Space Flight Center Data Assimilation Office (DAO), the Jet Propulsion Laboratory, and Lawrence Berkeley National Laboratory.


Index

1. Overview of Goddard Earth Observing System (GEOS)
2. Scientific Accomplishments
3. Technology Accomplishments
4. Community Contributions
5. Status/Plans
6. Point of contact
7. Caption of Graphic
8. List of publications
9. List of presentations
10. List of other media
11. Training
Appendix A: Summary of GEOS-3 Core System
Appendix B: Web References
General References


1. Overview of Goddard Earth Observing System (GEOS)

The primary objectives of this Grand Challenge HPCC project are to facilitate the migration of NASA's atmospheric Four Dimensional Data Assimilation (4DDA) software to scalable parallel systems and to serve as discerning developers and testers of modern parallel computer technology. Previous presentations of this project have concentrated on the Core computing algorithm -- the model and the analysis -- of 4DDA [e.g., Lyster et al. 1997e]. It is useful to broaden the context and now describe the complete environment in which data assimilation is being carried out at the NASA Data Assimilation Office (DAO).

Atmospheric data assimilation produces accurate gridded datasets that are used by meteorologists for weather analysis and forecasts; it is also used as a tool for climate research. The DAO conducts reanalysis of archived earth-science datasets as well as real-time mission support analysis and forecasts for the climate research community. There are various definitions of data assimilation; the present project broadly describes 4DDA as the combination of a range of observations with physically consistent model forecasts to produce a best estimate of the state of the atmosphere. The DAO's Goddard Earth Observing System (GEOS) Data Assimilation System (DAS) is described in the Algorithm Theoretical Basis Document [DAO, 1996]. The DAO is preparing to move GEOS DAS to distributed-memory parallel computing platforms. The new system will be designated GEOS-3. It will be used for the DAO's normal activities, and it will be an important component of the Mission to Planet Earth (MTPE) Enterprise in the coming years. It is also an important environment for studying new parallel computing technology under HPCC.
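In standard statistical notation (a generic illustration; see the ATBD [DAO, 1996] and Cohn et al. [1997] for the precise formulation used by the DAO), this best estimate, or analysis, can be written

    x^a = x^f + P^f H^T ( H P^f H^T + R )^{-1} ( w^o - H x^f ),

where x^f is the model forecast state, w^o the vector of observations, H the operator that maps the model state to the observation locations, and P^f and R the forecast and observation error covariance matrices. The analysis x^a corrects the forecast toward the observations, weighted by the relative uncertainties of each.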

The end-to-end GEOS DAS is shown schematically in Figure 1. The present DAS, whose Core system (described in Appendix A) runs on multitasking Cray C90 and J90 computers, is fed from the Global Telecommunication System (GTS). Meteorological data, including satellite-retrieved temperature profiles, are delivered to the Goddard Distributed Active Archive Center (DAAC). These data proceed through a data reduction and preprocessing stage before being ingested by the Core system. In the future, new datatypes will be provided via the EOS Data and Information System (EOSDIS) Core System (ECS). The postprocessing stages are: Quality Assurance of Data Sets (QuADS), which inspects the gridded datasets produced by the Core system for I/O errors and general physical consistency; the Adaptive Tuning of Error Statistics System (ATESS), which produces observation and forecast error statistics for the GEOS DAS; and the DAO On-Line Monitoring System (DOLMS), which monitors observation and gridded data streams and makes them available for graphical presentation. There are a number of different modes of operation for the DAS. Briefly, mission support involves real-time, or First-Look, data assimilation and sometimes the production of up to 10-day model forecasts. For current datasets made available from the Goddard DAAC, this mode of operation ingests about 50 megabytes of data per day into the Core system. In the coming year, satellite-retrieved profiles of atmospheric parameters will be produced as part of the DAS preprocessing system, which will increase the data ingest rate to about 1 gigabyte per day. The output analysis (gridded) datasets amount to about 1 gigabyte per day in real-time mode, while the production of model-forecast fields can increase this quantity by over an order of magnitude. Periodically, the DAO conducts reanalysis projects: multi-year analyses whose datasets are then studied and distributed to the climate research community. In this mode of operation, the DAO plans for a production rate of 30 days of assimilation per wall-clock day.
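As a rough consistency check on these figures (an illustration derived from the numbers above, not a separate measurement): at 30 assimilation days per wall-clock day, a 10-year (3650-day) reanalysis occupies

    3650 / 30 ~ 122 wall-clock days,

and at roughly 1 gigabyte of gridded output per assimilation day, such a run produces on the order of 30 gigabytes of analyses per wall-clock day.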

The end-to-end GEOS DAS is a complex operation that is being updated using modern Process Engineering methods [Pressman, 1993]. About 100 people are directly involved in DAO research and operations. The need for Process Engineering has become apparent in a complex environment where issues such as software Configuration Management (CM), version control and maintenance, multiple hardware platforms, and the coordination of distributed personnel are important. The DAO enthusiastically embraced Process Engineering and CM after it became apparent that the organization had reached critical mass in terms of personnel, technology, and management. A measure of this is the baselining of a formal System Development Plan (SDP) under the control of the DAO Scientific Steering Board (SSB) and Configuration Control Board (CCB). The key documents for this are internal (DAO Intranet); however, the GEOS-3 Software Plan (section 4 of the SDP) is referenced here. It describes the structure of the DAO software effort, a brief outline of GEOS-3 timelines, and the testing methodology. The DAO is also subject to a yearly review by its external computing panel [Farrell et al., 1996, 1997].

As part of the DAO's preparation to provide support to MTPE in 1998, and as part of a Grand Challenge project funded by the NASA High Performance Computing and Communications (HPCC) Earth and Space Sciences (ESS) program, a Core system based on the Message Passing Interface (MPI) parallel library is being developed using a Fortran 90 modular approach -- the Goddard Earth Modeling System (GEMS) [DAO, 1997]. Key components of the Core GEOS DAS, including the GCM (in collaboration with Max Suarez at GSFC) and an early version of PSAS, have been parallelized [Ding and Ferraro, 1995; Ding and Ferraro, 1996] (in this report, we refer to this parallel version of the early PSAS as the prototype parallel PSAS). These form the basis for the design of their respective parts of the parallel system. The Core system is being optimized and run on the SGI Origin 2000 systems at Ames Research Center as part of the DAO's research and operations, and on the Cray T3E testbed computer at GSFC as part of the HPCC program.
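As an indication of the flavor of the Fortran 90 modular approach, the sketch below shows how a two-dimensional latitude-longitude domain decomposition might be encapsulated in a module. This is a minimal illustration only; the names (decomp_mod, decomp_init) are hypothetical and are not the actual GEMS interfaces.

      module decomp_mod
!        Minimal sketch of a 2-D latitude-longitude decomposition
!        using MPI.  Assumes MPI_INIT has been called by the driver,
!        and that im and jm divide evenly by npx and npy.
         implicit none
         include 'mpif.h'
         integer :: comm2d                  ! Cartesian communicator
         integer :: ibeg, iend, jbeg, jend  ! local index ranges
      contains
         subroutine decomp_init( im, jm, npx, npy )
            integer, intent(in) :: im, jm    ! global lon/lat dimensions
            integer, intent(in) :: npx, npy  ! process grid dimensions
            integer :: dims(2), coords(2), ierr
            logical :: periods(2)
            dims    = (/ npx, npy /)
            periods = (/ .true., .false. /)  ! periodic in longitude only
            call MPI_CART_CREATE( MPI_COMM_WORLD, 2, dims, periods, &
                                  .true., comm2d, ierr )
            call MPI_CART_GET( comm2d, 2, dims, periods, coords, ierr )
            ibeg = coords(1)*(im/npx) + 1
            iend = ibeg + im/npx - 1
            jbeg = coords(2)*(jm/npy) + 1
            jend = jbeg + jm/npy - 1
         end subroutine decomp_init
      end module decomp_mod

Encapsulating the layout in a module lets the model, the analysis, and the I/O layers share a single description of where each gridpoint lives.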
Further technical discussion of the parallel Core system can be found in the article by Lyster et al. (1997e), to appear in Supercomputing97.

  


Figure 1: The GEOS End-to-end Data Assimilation System.

2. Scientific Accomplishments

The Kalman filter that was developed in the first-round HPCC investigation has undergone extensive scientific validation, and considerable insight into the behavior of this method has been gained [Ménard et al. 1997].

Peter Lyster has carried out a study of the representativeness error for the Kalman filter [Lyster, 1997f]. This is the initial phase of scientific validation of the Lagrangian filter method. Kevin Olson has provided the team with an accurate two-dimensional (latitude-longitude) trajectory code. Considerable research was performed to survey the state of the art in spherical trajectory algorithms and to incorporate modern methods in Olson's code. This code will be the underlying transport algorithm for the Lagrangian filter.
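To indicate the kind of computation involved (a schematic sketch only, not Olson's algorithm; production spherical schemes use great-circle formulas to remain accurate near the poles), the departure point of a trajectory is typically found by a short fixed-point iteration:

      subroutine depart_point( lam_a, phi_a, u, v, dt, a, lam_d, phi_d )
!        Schematic midpoint iteration for the departure point of a
!        trajectory arriving at (lam_a,phi_a); u,v are winds (m/s),
!        a is the Earth radius (m), dt the time step (s).  The winds
!        would normally be interpolated to the trajectory midpoint;
!        here they are held constant for brevity.
         implicit none
         real, intent(in)  :: lam_a, phi_a   ! arrival lon/lat (radians)
         real, intent(in)  :: u, v, dt, a
         real, intent(out) :: lam_d, phi_d   ! departure lon/lat (radians)
         integer :: iter
         lam_d = lam_a
         phi_d = phi_a
         do iter = 1, 3
            lam_d = lam_a - dt*u/( a*cos( 0.5*(phi_a + phi_d) ) )
            phi_d = phi_a - dt*v/a
         end do
      end subroutine depart_point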

3. Technology Accomplishments

HPCC team members contributed significantly to the adoption of modern software methods at the DAO. Peter Lyster is the software manager of GEOS-3, and HPCC members have contributed to the Process Engineering effort at the DAO. This has involved attending classes [Pressman, 1993], writing documents for the DAO Intranet, and designing the parallel Core system. Considerable in-house expertise has been gained in the design of modular algorithms using Fortran 90.

HPCC members have contributed to a number of GEOS-3 documents. These are all referenced, and a summary of useful web references is given at the end of this document.

HPCC team members contributed to the DAO "fast test" that was instrumental in the purchase of the O2K system (160 nodes). This and other related work are summarized in the DAO Benchmark Page. In the coming years these computers at NASA Ames Research Center will be used for production. At first, a multitasking version of GEOS-3 will be used in support of the EOS satellite AM-1 launch in June 1998. By the end of 1998 a full MPI version of GEOS-3 will be validated and in production.

HPCC team members contributed to the DAO's presentations at the external computing panel reviews [Farrell et al., 1996, 1997].

Starting from the Held-Suarez prototype parallel DYCORE plus simple physics, a complete GCM, including full physics, has been built and is undergoing testing. In this context, testing refers to the verification that the functional requirements of the code are met and that the basic scientific properties of the GCM are correct. After this, validation is required to guarantee the scientific correctness of the code for the production system. The validation step for the parallel (MPI) GEOS-3 will not be completed until the middle of 1998.

As the DAO focuses more on stratospheric data assimilation for climate research, the GCM will be run to higher altitudes. Research has shown that algorithmic noise must then be suppressed using a rotation of the prognostic fields at the interface between the dynamics and the physics packages. This could be severely restrictive in a distributed-memory algorithm because the entire field must be remapped between processors (as opposed to the regular parallelization of DYCORE, where only boundary-value gridpoints are communicated to nearest-neighbor processes). A parallel rotation has been prototyped by Will Sawyer. The initial optimization uses SHMEM, and results are shown in the DAO Benchmark Page.
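For contrast, the sketch below shows the inexpensive nearest-neighbor exchange that suffices for the regular DYCORE decomposition (interfaces illustrative; at the poles the neighbor ranks would be MPI_PROC_NULL). The rotation cannot be expressed this way: in general every process needs points from every other process, which calls for a collective redistribution such as MPI_ALLTOALLV or, as in Sawyer's prototype, one-sided SHMEM transfers.

      subroutine halo_exchange( f, iml, jml, north, south, comm, ierr )
!        Schematic exchange of one latitude row with each neighbor.
!        f(:,1) and f(:,jml) are halo rows; rows 2..jml-1 are interior.
         implicit none
         include 'mpif.h'
         integer, intent(in)    :: iml, jml      ! local sizes incl. halo
         integer, intent(in)    :: north, south  ! neighbor ranks
         integer, intent(in)    :: comm
         integer, intent(out)   :: ierr
         real,    intent(inout) :: f(iml,jml)
         integer :: status(MPI_STATUS_SIZE)
!        send the top interior row north, receive the bottom halo
         call MPI_SENDRECV( f(1,jml-1), iml, MPI_REAL, north, 1, &
                            f(1,1),     iml, MPI_REAL, south, 1, &
                            comm, status, ierr )
!        send the bottom interior row south, receive the top halo
         call MPI_SENDRECV( f(1,2),   iml, MPI_REAL, south, 2, &
                            f(1,jml), iml, MPI_REAL, north, 2, &
                            comm, status, ierr )
      end subroutine halo_exchange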

The porting of vector Cray code to both Origin and T3E platforms has forced the DAO to develop flexible software. The benefits of this are considerable: existing bugs in the codes have been unearthed, flexible I/O has been adopted, and a more modular software design has been put in place.

4. Community Contributions

As part of the collaboration with the HPCC project, meetings were held at the Calverton offices of SGI/Cray (attendees: Jim Abeles, Tom Formhals, Peter Lyster, Will Sawyer, Jay Larson, Pam Sotnick). HPCC members attended a course in T3E optimization at Calverton and contributed to discussions on parallel computing.

DAO members are active in the international community. Jay Larson and Peter Lyster gave talks at the ECMWF workshop on the Use of Parallel Processors in Meteorology in December 1996 [Lyster et al. 1996]. Considerable interchange of knowledge has been gained through the DAO's connections with ECMWF, which converted its operational system to MPI on Fujitsu computers two years ago (Geerd Hoffmann, who is a member of the DAO external computing panel, was chief engineer at ECMWF and led their conversion to MPI). The DAO is an associate member of the Real Applications on Parallel Systems (RAPS) consortium, a European initiative in parallel benchmarking.

Team members Jay Larson and Peter Lyster attended the Science Team Meeting in Atlanta during High Performance Computing 97, and Larson gave a technical presentation.

Frequent HPCC lunches have been held at GSFC with Clark Mobarry and Spencer Swift sometimes in attendance.

Jay Larson and Will Sawyer contributed their reviews at an allocations meeting for the distribution of GSFC T3E resources to the wider community.

Peter Lyster held discussions with Miodrag Rancic of GSFC and Eugenia Kalnay of NCEP. The Data Design Document of GEOS-3 [Sawyer et al. 1997a] describes intra-modular interfaces. This document will enable researchers like Rancic to more easily "plug" their applications (in this case, a parallel GCM) into GEOS-3.

5. Status/Plans

Considerable effort has been spent by Will Sawyer to develop the parallel GCM for GEOS-3. This is not a demonstration code, but the full GCM with DYCORE and physics packages. Old plumbing has had to be unearthed and reengineered; this process is almost complete. At the same time, key modules, such as the radiation packages, have been studied by Jim Abeles, and the knowledge (especially single-PE optimization) will be transferred to the DAO with the help of Spencer Swift. The prototype GCM (DYCORE and simple physics) has been run on a 1023-PE T3E-900 in Minnesota by Jim Abeles, achieving 60 gigaflop/s. This code was run at 0.5° x 0.5° horizontal resolution (compared with the 2° x 2.5° of the current production DAO GCM). Accounting for the reduced single-PE performance of the T3E-600 at GSFC, it is expected that the performance of the high-resolution full GCM will lie between 25 and 30 gigaflop/s on the latter machine.

Considerable effort has been made to design the parallel GEOS-3 so that the software of the prototype parallel PSAS can be reused. Its parallel decomposition is being converted to a library for use in the Fortran 90 parallel GEOS-3. The matrix decomposition and load-balancing algorithm will be treated in the same way. The prototype parallel PSAS will be used for the 50 and 100 gigaflop/s benchmarks. It achieved 11 gigaflop/s on the GSFC T3D as part of the earlier 10 gigaflop/s milestone submission, and Jay Larson has benchmarked the algorithm at 33 gigaflop/s on the GSFC T3E. Prototyping on a T3E-900 at NERSC is ongoing by Chris Ding. It is expected that the adoption of a single-precision BLAS library (e.g., HGEMV) will add at least another 30% to the performance.

The GCM and PSAS will achieve 25 gigaflop/s scaling on the GSFC T3E. An external machine, similar to the T3E-900, will be required to achieve the 50 and 100 gigaflop/s milestones. The assessment of the combined flop/s for the GCM and PSAS will be performed at the higher horizontal model resolution (0.5° x 0.5°). This is a synthetic test in the sense that a production or scientific DAS at this resolution will not be used for a number of years (the DAO plans to go to 1° resolution in 1999). A scaling analysis that estimates the relative percentage of time used by the GCM and PSAS in this configuration will have to be performed. At current resolution, the GCM takes about 40% of the time of the Core system (with approximately 80,000 observations).
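A simple model for the combined rate (our illustration; the milestone methodology itself is not specified here) is the time-weighted mean: if the GCM consumes a fraction f of the Core system's wall-clock time at sustained rate R_GCM, and PSAS the remaining 1 - f at rate R_PSAS, then

    R_DAS = f R_GCM + (1 - f) R_PSAS .

With f ~ 0.4 as quoted for the current resolution, neither component can lag far behind the target rate; the fraction f itself must be re-estimated at 0.5° x 0.5° resolution, which is the purpose of the scaling analysis above.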

Scientific tests of the Lagrangian filter will concentrate on validation of a one-dimensional algorithm. This has already been developed with help from Clark Mobarry. In the coming year the two-dimensional trajectory code of Kevin Olson will be incorporated into a full Lagrangian filter, which will be subject to scientific tests using the same UARS data (methane and ozone retrieved products) that were used for the Kalman filter [Ménard et al. 1997]. Optimization and timing studies will be performed to achieve the HPCC milestone of a 200-times speedup over the corresponding Kalman filter on a single node of a Cray C90.

6. Point of contact

Peter Lyster
Email to
lys@dao.gsfc.nasa.gov
http://dao.gsfc.nasa.gov/DAO_people/lys

Mail: MCC1 Suite No. 200, 7501 Forbes Blvd, Seabrook, MD 20706

7. Caption of Graphic

Figure 1 is a schematic of the GEOS-3 end-to-end production system. With a multitasked Core system running on the Origin 2000 systems at Ames Research Center, it will be used for the DAO's regular mission support, including the EOS AM-1 satellite launch in June 1998.

8. List of publications

Peer Reviewed Papers

P.M. Lyster, S.E. Cohn, R. Ménard, L.-P. Chang, S.-J. Lin, and R. Olsen. Parallel Implementation of a Kalman Filter for Constituent Data Assimilation. Mon. Wea. Rev., 125, 1674-1686 (1997). DAO Office Note No. 97-02, NASA Goddard Space Flight Center, Greenbelt, Maryland. Available also at http://dao.gsfc.nasa.gov/subpages/office-notes.htm.

P.M. Lyster, C.H.Q. Ding, K. Ekers, R. Ferraro, J. Guo, M. Harber, D. Lamich, J.W. Larson, R. Lucchesi, R. Rood, S. Schubert, W. Sawyer, M. Sienkiewicz, A. da Silva, J. Stobie, L.L. Takacs, R. Todling, J. Zero. Parallel Computing at NASA Data Assimilation Office (DAO). Accepted for publication in Supercomputing97 (1997). Available also at http://dao.gsfc.nasa.gov/DAO_people/lys.

Non Peer Reviewed Reports

R. Ménard, L.-P. Chang, P.M. Lyster, and S.E. Cohn. Stratospheric assimilation of chemical tracer observations using a Kalman filter. Submitted to J. Geophys. Res. (1997). DAO Office Note No. 97-14, NASA Goddard Space Flight Center, Greenbelt, Maryland. Available also at http://dao.gsfc.nasa.gov/subpages/office-notes.htm.

P.M. Lyster, J.W. Larson, J. Guo, W. Sawyer, A. da Silva, and I. Stajner. Progress in the Parallel Implementation of the Physical-space Statistical Analysis System (PSAS). To be published in the proceedings of the European Center for Medium-Range Weather Forecasting workshop Making its Mark -- The Use of Parallel Processors in Meteorology (1997).

J.W. Larson, P.M. Lyster, W. Sawyer, C.H.Q. Ding, J. Guo, A.M. da Silva, L.L. Takacs (1997). Progress in the Design and Optimization of the Parallel Goddard Data Assimilation System (DAS). In High Performance Computing 1997 Grand Challenges in Computer Simulation, A. Tentner, Ed., p. 52. The Society for Computer Simulation International, San Diego, CA, USA.

P. Lyster (June 1996): Scientific and Engineering Problems in Atmospheric Four Dimensional Data Assimilation, Draft Proceedings of PetaFlops Systems Workshops.

DAO staff . GEOS-3 Data Assimilation System Architectural Design (1997). DAO Office Note No. 97-06, NASA Goddard Space Flight Center, Greenbelt, Maryland. Available at http://dao.gsfc.nasa.gov/subpages/office-notes.htm.

P.M. Lyster, W. Sawyer, and L. Takacs. Design of the Goddard Earth Observing System (GEOS) Parallel General Circulation Model (GCM) (1997). DAO Office Note No. 97-13, NASA Goddard Space Flight Center, Greenbelt, Maryland. Available at http://dao.gsfc.nasa.gov/subpages/office-notes.htm.

P.M. Lyster, J.W. Larson, C.H.Q. Ding, J.Guo, W. Sawyer, A. da Silva, and I. Stajner. Design of the Goddard Earth Observing System (GEOS) Parallel Physical-space Statistical Analysis System (PSAS) (1997). DAO Office Note No. 97-05, NASA Goddard Space Flight Center, Greenbelt, Maryland. Available at http://dao.gsfc.nasa.gov/subpages/office-notes.htm.

P.M. Lyster. Four Dimensional Data Assimilation: HPCC 10 Gigaflop/s Milestone Submission (1997). Available at http://dao.gsfc.nasa.gov/DAO_people/lys.

W. Sawyer, A. da Silva, R. Lucchesi, P. Lyster, and L.L. Takacs. GEOS-3 Data Assimilation System Core Interface and Data Design (1997). DAO Intranet.

W. Sawyer, L.L. Takacs, A. da Silva, and P.M. Lyster. Parallel Grid Manipulations in Earth Science Calculations (1997). Submitted to Computational Aerosciences in the 21st Century (CAS21), January 1998.

W. Sawyer, and A. da Silva. ProTeX: A Sample Fortran 90 Source Code Documentation System. DAO Office Note No. 97-11, NASA Goddard Space Flight Center, Greenbelt, Maryland. Available at http://dao.gsfc.nasa.gov/subpages/office-notes.htm.

Document in Preparation

P.M. Lyster, S.E. Cohn, R. Ménard, and L.-P. Chang. A Domain Decomposition for Error Covariance Matrices Based on a Latitude-Longitude Grid, to be submitted to Computer Physics Communications.

9. List of presentations

P.M. Lyster. Progress in the Development of the Parallel Goddard Laboratory for Atmospheres Data Assimilation System (GEOS DAS), NASA Ames Research Center, Moffett Field, CA, January 1997.

J. Larson. Progress in the Design and Optimization of the Parallel Goddard Data Assimilation System (DAS). Proceedings of High Performance Computing 97, Atlanta, GA, March 1997.

W. Sawyer. Parallel Grid Manipulations in Earth Science Calculations. Centre for Advanced and Parallel Applications. Swiss Federal Institute of Technology, Lausanne, Switzerland, Aug. 20, 1997.

W. Sawyer. Parallel Grid Manipulations in Earth Science Calculations. Centre for Advanced and Parallel Applications. Swiss Center for Scientific Computing, Manno, Switzerland, Sep. 3, 1997.

P.M. Lyster. Development of a Lagrangian Filter for Constituent Assimilation. Third Workshop on Adjoint Applications in Dynamic Meteorology, Bishop's University, Lennoxville, Canada, June 18, 1997.

10. List of other media

Peter Lyster, Richard Ménard, and L.-P. Chang have worked with Jarrett Cohen of the High Performance Computing Branch and Stephen Maher of the Scientific Visualization Studio at GSFC to develop a 3D video of the Kalman filter assimilation of retrieved methane profiles from the Upper Atmosphere Research Satellite. This is the first visualization of 3D stratospheric dynamics, and in the context of 4DDA it provides an added dimension of accuracy. This should be a valuable tool for the research community. It is being incorporated in a number of video demonstration tapes at GSFC and JPL.

11. Training

Gregor von Laszewski, who was a student on the first-round team, graduated from Syracuse University. His PhD thesis is titled A Parallel Data Assimilation System and its Implications on a Metacomputing Environment (October 1996).

Heather Kilcoyne graduated from the University of Maryland. Her MS thesis is titled "Comparison of the Data Assimilation Methodologies of Data Assimilation Office, the European Center for Medium Range Weather Forecasting, and the National Centers for Environmental Prediction" (September 1997). She is currently working at the DAO on the GEOS-3 end-to-end system.


Appendix A: Summary of GEOS-3 Core System

The following is taken from the publication by Lyster et al. (1997e) to appear in Supercomputing97.

GEOS DAS uses a grid-point-based atmospheric general circulation model (GCM) [Takacs et al., 1994]. Other physical processes are computed using parameterized models for turbulence, short- and long-wave radiation, moisture, and land surface processes. Analysis algorithms are used to combine the inherently unstructured data, with (hopefully) known errors, into the structured models whose errors are also quantified. This can be regarded as a problem in the field of stochastic modeling and estimation. One of the difficulties of atmospheric data assimilation arises from the large size of the state space. Current GCMs have more than 10^7 gridpoints, and there are approximately 10^5 meteorological observations (including satellite-retrieved temperature profiles) reported worldwide in a typical six-hourly (or synoptic) assimilation period. The current analysis at the DAO consists of an on-line Quality Control (QC) algorithm and the Physical-space Statistical Analysis System (PSAS) [Cohn et al., 1997]. A key part of PSAS performs an iterative conjugate-gradient solve of a moderately dense (26% full) 10^5 x 10^5 matrix. The heritage code for the Core system is mostly FORTRAN 77.
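In standard notation (cf. Cohn et al. [1997]), PSAS computes the analysis in two steps: a conjugate-gradient solve of the innovation equation in the approximately 10^5-dimensional observation space, followed by a mapping back to the model grid:

    ( H P^f H^T + R ) y = w^o - H x^f,
    x^a = x^f + P^f H^T y,

where w^o is the observation vector, H the observation operator, P^f and R the forecast and observation error covariances, and x^f, x^a the forecast and analysis states. The moderately dense matrix quoted above is H P^f H^T + R; combining the two steps recovers the analysis equation shown in Section 1.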

Appendix B: Web References

Go to the GEOS-3 Software Plan

The Algorithm Theoretical Basis Document (ATBD) describes the basic algorithms of the Goddard Earth Observing System (GEOS) Data Assimilation System (DAS)

Download a paper [DAO, 1997] that describes the design of the GEOS-3 Data Assimilation System Architectural Design

Download a paper [Lyster et al., 1997a] that describes the design of the Parallel GEOS General Circulation Model (GCM)

Download a paper [Lyster et al., 1997b] that describes the design of the Parallel GEOS Physical-space Statistical Analysis System (PSAS)

Download a paper [Sawyer et al. 1997a] on the GEOS-3 Data Assimilation System Core Interface and Data Design

Visit the DAO Benchmarking Page

Further information about the work on parallel I/O can be found at Rob Lucchesi's GPIOS site

Further information about the HPCC ESS Grand Challenge PI work at the DAO can be found at Peter Lyster's Home Page

Download a paper [Lyster et al., 1997d] on the Parallel Kalman Filter, or go to L.P. Chang's Home Page to view some animations of Kalman filter assimilation runs.



General References

Cohn et al.,1997
Cohn, S. E., A. da Silva, J. Guo, M. Sienkiewicz, and D. Lamich (1997). Assessing the Effects of Data Selection with the DAO Physical-space Statistical Analysis System. Technical Report DAO Office Note No. 96-08, NASA Goddard Space Flight Center, Greenbelt, Maryland. Submitted to Mon. Wea. Rev.

DAO, 1996
DAO staff (1996). Algorithm Theoretical Basis Document, version 1.01. Technical report, NASA Goddard Space Flight Center, Greenbelt, Maryland.

DAO, 1997
DAO staff (1997). GEOS-3 Data Assimilation System Architectural Design. DAO Office Note, NASA Goddard Space Flight Center, Greenbelt, Maryland.

Ding and Ferraro, 1995
Ding, C. H. Q.  and R. D. Ferraro (1995). An 18 Gflops parallel data assimilation PSAS package. In Proceedings of the Intel Supercomputer Users Group Conference, page 70.

Ding and Ferraro, 1996
Ding, C. H. Q. and R. D. Ferraro (1996). Climate data assimilation on a massively parallel computer. In Proceedings of Supercomputing '96.

Farrell et al., 1996, 1997
Farrell, W. E., A. J. Busalacchi, A. Davis, W. P. Dannevik, G.-R. Hoffmann, M. Kafatos (1996, 1997). Reports of the Data Assimilation Office Computer Advisory Panel to the Laboratory for Atmospheres. Data Assimilation Office, NASA Goddard Space Flight Center, Greenbelt, Maryland.

von Laszewski, 1996
von Laszewski, G. (1996). The Parallel Data Assimilation System and its Implications on a Metacomputing Environment. PhD thesis, Syracuse University, Syracuse, New York.

Lyster et al., 1996
Lyster, P. M., J. W. Larson, J. Guo, W. Sawyer, A. da Silva, and I. Stajner (1996). Progress in the Parallel Implementation of the Physical-space Statistical Analysis System (PSAS). To be published in the proceedings of the European Center for Medium-Range Weather Forecasting workshop Making its Mark -- The Use of Parallel Processors in Meteorology.

Lyster et al., 1997a
Lyster, P. M., W. Sawyer, and L. Takacs (1997). Design of the Goddard Earth Observing System (GEOS) Parallel General Circulation Model (GCM). Technical Report DAO Office Note No. 97-13, NASA Goddard Space Flight Center, Greenbelt, Maryland.

Lyster et al., 1997b
Lyster, P. M., J. W. Larson, C. H. Q. Ding, J. Guo, W. Sawyer, A. da Silva, I. Stajner (1997). Design of the Goddard Earth Observing System (GEOS) Parallel Physical-space Statistical Analysis System (PSAS). Technical Report DAO Office Note No. 97-05, NASA Goddard Space Flight Center, Greenbelt, Maryland.

Lyster et al., 1997c
Lyster, P. M. (1997). Four Dimensional Data Assimilation: HPCC 10 Gigaflop/s Milestone Submission. Available at Peter Lyster's home page.

Lyster et al., 1997d
Lyster, P. M., S. E. Cohn, R. Ménard, L.-P. Chang, S.-J. Lin, and R. Olsen (1997). Parallel Implementation of a Kalman Filter for Constituent Data Assimilation. Mon. Wea. Rev., 125, 1674-1686. Available also as Technical Report DAO Office Note No. 97-02, NASA Goddard Space Flight Center, Greenbelt, Maryland.

Lyster et al., 1997e
Lyster, P. M., C. H. Q. Ding, K. Ekers, R. Ferraro, J. Guo, M. Harber, D. Lamich, J. W. Larson, R. Lucchesi, R. Rood, S. Schubert, W. Sawyer, M. Sienkiewicz, A. da Silva, J. Stobie, L. L. Takacs, R. Todling, J. Zero (1997). Parallel Computing at NASA Data Assimilation Office (DAO). Accepted for publication in Supercomputing97. Also available at http://dao.gsfc.nasa.gov/DAO_people/lys/sc97/sc97/INDEX.htm.

Lyster 1997f
Lyster, P. M. (1997) Development of a Lagrangian Filter for Constituent Assimilation. Third Workshop on Adjoint Applications in Dynamic Meteorology, Bishop's University, Lennoxville, Canada, June 18, 1997.

Ménard et al., 1997
Ménard, R., L.-P. Chang, P. M. Lyster, and S. E. Cohn (1997). Stratospheric assimilation of chemical tracer observations using a Kalman filter. Submitted to J. Geophys. Res. Available also as Technical Report DAO Office Note No. 97-14, NASA Goddard Space Flight Center, Greenbelt, Maryland.

Pressman, 1993
Pressman, R. S. (1993). A Manager's Guide to Software Engineering. McGraw-Hill.

Sawyer et al. 1997a
Sawyer, W., A. da Silva, R. Lucchesi, P. M. Lyster, and L. L. Takacs (1997a). GEOS-3 Data Assimilation System Core Interface and Data Design. DAO Intranet. The latest version can be downloaded as compressed PostScript.

Sawyer et al. 1997b
Sawyer, W., L. L. Takacs, A. da Silva, and P. M. Lyster (1997b). Parallel Grid Manipulations in Earth Science Calculations. Submitted to Computational Aerosciences in the 21st Century (CAS21), January 1998.

Takacs et al., 1994
Takacs, L. L., A. Molod, and T. Wang (1994). Documentation of the Goddard Earth Observing System (GEOS) General Circulation Model, Version 1. Technical Report NASA Technical Memorandum 104606, Vol. 1, NASA Goddard Space Flight Center, Greenbelt, Maryland.



Peter Lyster
Sunday September 14 21:24:51 EDT 1997