Lee Holcomb Outlines HPCC's Directions
by Jarrett Cohen
Issue 3, September 1997
Insights is published by the High Performance Computing and Communications (HPCC) Program Office. Send address changes to Judy Conlon, or write to: NASA HPCC Insights, Mail Stop 269-3, Moffett Field, California 94035-1000, USA.
A first-century sage once wrote that he became all things to all people to serve a nobler purpose. Lee Holcomb, NASA's new Director of Information Technology Strategy in the Office of Aeronautics and Space Transportation Technology, recognizes a similar dilemma in today's HPCC Program.
There "is a tension between those who see HPCC and the Internet as tools for the general public, the low end of the pyramid, versus those who see the long-term benefit to the public in investing in high-end users at the top of the pyramid," he said. "We need a balanced program that does both.
"We are investing substantially in trying to remove barriers for growth to the industrial sector in high-end computing, where there is a technical barrier to scaling to very high speeds, and in networking, where there is a need to scale up the global Internet and assure performance," Holcomb said.
Clearing scalability hurdles
Holcomb began managing NASA high-end computing in 1977 and has directed the HPCC effort from its 1991 inception. Over these 20 years, he has overseen a migration from building custom supercomputers, to acting as a "friendly buyer" with industry, to employing Cooperative Agreements to attain needed performance.
"It is effective if we can collaborate with industry and bring our scientists more intimately into the product evolution process," Holcomb said. The HPCC Program pioneered Cooperative Agreements at NASA with placement of the IBM SP2 scalable parallel testbed at Ames Research Center in 1994. "The agency has taken the Cooperative Agreement to new levels, including the billion-dollar X-33 agreement with Lockheed Martin," Holcomb said.
A Goddard Space Flight Center Cooperative Agreement with Silicon Graphics/Cray Research is pushing back the barriers to scalable parallel performance. Included in the HPCC-funded effort is access to a 1,328-processor CRAY T3E-900 and programming assistance to reach 100 billion floating-point operations per second (100 gigaflops) sustained on Earth and Space Sciences (ESS) project applications. The ESS Grand Challenge teams are also providing their software, often compatible with several supercomputers, to the HPCC community.
"The first two or three versions of parallel machines suffered because of the proliferation of vendors and the different flavors of operating systems and message passing protocols, differing performance of compilers, and the inability to move code from one machine to another, even on one vendor's series," Holcomb said. "That problem has lessened somewhat. Machines have systems software with the ability to migrate code among processors on the machine or among several machines." He cited the MetaCenter operation of the IBM SP2s at Ames and Langley Research Centers as another success in this area.
"There is a recognition that scalable computing is here today, and it works. U.S. companies are dominating a growing and expanding mid-range supercomputing market. That is good news," Holcomb said. "The bad news is when you extrapolate the scalable technology, performance runs into a wall at a thousand to a few thousands of processors. Serious research is needed to give us a foundation so that there are no barriers."
Increasing scalability is the core strategy of petaflops (one million billion floating-point operations per second) system studies co-funded by NASA, the Defense Advanced Research Projects Agency and the National Science Foundation. "They include new, revolutionary architectures such as processor-in-memory and the use of threads," Holcomb said. "We will be looking for any architectural concepts that will provide hooks for software engineers to improve scalability and performance."
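Some back-of-envelope arithmetic (not from the article) shows why scalability is the core issue: at plausible sustained per-processor speeds, a petaflops machine needs hundreds of thousands to millions of processors, far beyond the wall Holcomb describes.

```python
# Back-of-envelope arithmetic: processors needed to sustain one petaflops
# (10^15 floating-point operations per second) at assumed per-processor speeds.
PETAFLOPS = 1e15

for gflops_per_proc in (0.1, 1.0, 10.0):   # hypothetical sustained speeds
    processors = PETAFLOPS / (gflops_per_proc * 1e9)
    print(f"{gflops_per_proc:>4.1f} Gflops/processor -> {processors:>12,.0f} processors")
```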
Juggling pinnacle and base
Pushing this "pinnacle computing," as Holcomb calls it, to ever-greater heights has been an HPCC focus throughout its six-year history. The program has substantially changed the basic aerospace design and manufacturing process and has brought ocean modeling to 10-mile-by-10-mile resolution. Across almost all industry sectors, supercomputers have improved productivity and shortened cycle times, he said.
With an eye toward spreading such results, Holcomb said "there is an emerging pressure in HPCC to serve and expand the base of the pyramid, which is the broader, societal user."
A new player in this juggling act is the Presidential Advisory Committee on Information Technology. (See "President Clinton names advisory committee" in Issue 2 of INSIGHTS.) A major emphasis is the Next Generation Internet (NGI), which aims to increase data transfer speeds 1,000-fold over those of today. (See "The Internet of the not-so-distant future.")
In June, Congressional briefings and hearings began addressing the roles of government and industry, Holcomb explained. "I know agencies are pushing very hard for the NGI program. In part, the NGI is built on the premise that it is critical to agency missions."
Holcomb pointed to five NASA mission-driven applications for the NGI:
1. The Astrobiology Institute -- a collaborative science institute with no walls.
2. Aerospace vehicles -- intelligent, integrated design and synthesis tools that support the entire life cycle of development and use.
3. Mission to Planet Earth -- the ability of scientists and other interested parties to access and analyze Earth systems data for understanding global change.
4. Virtual access to facilities -- computational laboratories, telescopes, flight simulators.
5. Telemedicine -- interactive consultation, remote protocols and procedures.

[Sidebar: The Next Generation Internet will further government agency missions in serving both researchers and the general public. From aerospace design tools to telemedicine, NASA is investigating how to use high-bandwidth networks productively.]
The infrastructure for testing these applications is the NASA Research and Education Network (NREN), which is moving to 622 million bits per second this year. "We want to understand how to use high-bandwidth networks productively," Holcomb said. "How do you speed up end-to-end performance? How do you move many concurrent low-bandwidth applications and not bring the network down? How do you simultaneously serve the community that needs high bandwidth and those that don't need as much?
"We see that the quality of service is falling off as well as the reliability and integrity of a global Internet with hundreds of millions of access points," Holcomb continued. "NREN plays a role in providing the fabric to bring up the performance and to do the research on new network protocols."
An information revolution
NASA HPCC has enabled public Internet usage with its Information Infrastructure Technology and Applications (IITA) project, with some supported World Wide Web sites getting 400,000 hits per day. A significant component of IITA is Learning Technologies.
"Early on, the agenda was to promote the Internet in schools through videos, training, etc.," Holcomb said. "We did that rather effectively. The Internet has grown in schools beyond our wildest hopes.
"A second goal was to provide low-cost access and advice on how to connect schools," Holcomb related. "We provided 800 dial-up service to thousands of schools. Low-cost access concepts included Langley's hub access point, using a server to aggregate Internet requests. This program improved service by an order of magnitude, and for far less money. Now, there is an industrial sector providing low-cost Internet access."
With widespread access on its way to being achieved, Holcomb said the emphasis must move from hardware to content and pedagogy, especially professional development of educators. "Right now, the expert is usually the third grader!" he said.
"This will become more of our core business -- educating educators in how to use the Internet for science and mathematics and providing modules that can be used effectively and easily," Holcomb explained. "To go with that is a rigorous program in evaluation to determine the value and effectiveness of those modules."
Holcomb stressed that kids must learn to use information technology. "We are in a transformation where it will be essential in everyone's lives. Just as the industrial revolution was a change, the information revolution will be a change. We have talked within NASA about moving to a high-end computing power grid. You have a problem to solve and you don't care where you solve it -- the grid routes your program where it needs to go. That vision might transfer to an information power grid, where the resources you need come off the net.
"We are trying to change the paradigm and make the computing resources on the Internet something everyone uses. We'll be spending quite a bit of time and effort in the marketplace trying to facilitate this cultural revolution."