Distributed Visualization Task

Objective: The computational power of a centralized high-performance computing facility cannot be fully exploited by remote users without adequate data transmission capability to deliver the processed data in a timely manner. Network congestion is expected to persist because the user community and processor technology have consistently grown faster than network technology. This transmission bottleneck currently prohibits distributed visualization applications that involve interactive display of a large volume of data. The objective of this task is to provide near real-time (> 1 frame per second) visualization capability to remote users by employing on-line data compression/decompression technology.

Approach: Distributed visualization involves data flow between a host computer and a client computer over a global network. When the host and the client are separated by a long distance, the data transmission time becomes too long (over a minute for a 512 by 512 image frame) to support an interactive data processing environment. Compression technology is pursued on the assumption that the reduction in transmission time from the smaller data volume far exceeds the time spent on compression/decompression. The task was approached in three stages: evaluation of candidate compression algorithms, trade-off analysis of the computational load between compression and decompression, and development of a prototype compressor/decompressor system.
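The back-of-the-envelope Python sketch below illustrates why that assumption holds on a slow wide-area link; the 56 kbit/s link speed and the 0.5 s per codec pass are illustrative assumptions, not measurements from the prototype.

    def frame_transfer_time(frame_bytes, link_bps, ratio=1.0,
                            t_compress=0.0, t_decompress=0.0):
        # Total delivery time for one frame: codec overhead plus wire time.
        wire_time = (frame_bytes / ratio) * 8 / link_bps
        return t_compress + wire_time + t_decompress

    frame = 512 * 512 * 3        # one 512x512 RGB frame, ~786 KB
    link = 56_000                # assumed slow wide-area link, bits per second

    raw = frame_transfer_time(frame, link)
    lossy = frame_transfer_time(frame, link, ratio=30.0,
                                t_compress=0.5, t_decompress=0.5)
    print(f"uncompressed: {raw:5.1f} s/frame")    # ~112 s: over a minute
    print(f"30:1 + codec: {lossy:5.1f} s/frame")  # ~4.7 s: codec time is minor

Even a generous full second of codec overhead per frame is dwarfed by the roughly 108 seconds of wire time saved at 30:1 compression.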

Accomplishments: Several compression algorithms (JPEG, MPEG, EPIC) were examined with respect to computational complexity and compression performance. A trade-off analysis of the computational load between compression and decompression was performed by examining various non-orthogonal transforms (whose forward and inverse transforms use different filters). A prototype compressor/decompressor pair was developed for interactive animation-sequence display via on-line compression and decompression combined with a TCP/IP socket-based communication mechanism. The prototype system provides three transmission choices: no compression, lossless compression (JPL's Rice algorithm), and lossy compression (the Fast Wavelet Transform developed by Eric Majani at JPL). With the lossy method, over 30:1 compression can be achieved on a typical visualization image frame with minimal information loss.
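The report does not document the prototype's wire format, so the Python sketch below shows one plausible per-frame framing for the three transmission choices. The mode tags and header layout are assumptions, zlib stands in for the Rice coder, and the lossy wavelet coder is left as a plug-in callable, since neither JPL codec is reproduced here.

    import socket
    import struct
    import zlib

    # Hypothetical mode tags for the three transmission choices.
    MODE_RAW, MODE_LOSSLESS, MODE_LOSSY = 0, 1, 2

    def send_frame(sock, frame_bytes, mode=MODE_LOSSLESS, lossy_codec=None):
        # Header: mode, original size, payload size; then the payload itself.
        if mode == MODE_LOSSLESS:
            payload = zlib.compress(frame_bytes)   # stand-in for Rice coding
        elif mode == MODE_LOSSY:
            payload = lossy_codec(frame_bytes)     # e.g. a wavelet coder
        else:
            payload = frame_bytes
        header = struct.pack("!BII", mode, len(frame_bytes), len(payload))
        sock.sendall(header + payload)

    def recv_exact(sock, n):
        # TCP delivers a byte stream, so loop until the full record arrives.
        buf = bytearray()
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("peer closed mid-frame")
            buf += chunk
        return bytes(buf)

    def recv_frame(sock, lossy_decoder=None):
        mode, orig_len, payload_len = struct.unpack("!BII", recv_exact(sock, 9))
        payload = recv_exact(sock, payload_len)
        if mode == MODE_LOSSLESS:
            return zlib.decompress(payload)
        if mode == MODE_LOSSY:
            return lossy_decoder(payload)
        return payload

Carrying the mode tag per frame rather than per session lets the client switch among raw, lossless, and lossy transmission interactively, as the prototype's three-choice design suggests.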

Significance: Lossy data compression provides a wide range of compressibility that a user can adjust to achieve the desired data fidelity for browsing, animation, and various visualization tasks. The prototype system allows a user to interactively visualize a large volume of data in near real time over a slow network, thus providing a means to achieve a cost-effective distributed computing environment.
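To make this fidelity dial concrete, here is a toy one-level Haar transform in Python in which a single threshold parameter controls how many detail coefficients survive. Majani's Fast Wavelet Transform is more elaborate; this only illustrates the ratio-versus-fidelity trade-off, not the prototype's actual algorithm.

    def haar_forward(x):
        # One decomposition level: pairwise averages and differences.
        avg = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]
        det = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])]
        return avg, det

    def haar_inverse(avg, det):
        out = []
        for a, d in zip(avg, det):
            out += [a + d, a - d]
        return out

    def lossy_roundtrip(x, threshold):
        # Zero out small detail coefficients; only the survivors need sending.
        avg, det = haar_forward(x)
        det_q = [d if abs(d) >= threshold else 0.0 for d in det]
        kept = sum(1 for d in det_q if d != 0.0)
        return haar_inverse(avg, det_q), kept

    signal = [10, 10, 10, 11, 50, 52, 10, 10]
    for t in (0.0, 1.0, 5.0):
        rec, kept = lossy_roundtrip(signal, t)
        err = max(abs(a - b) for a, b in zip(signal, rec))
        print(f"threshold={t}: kept {kept}/4 details, max error {err:.1f}")

Raising the threshold discards more coefficients (higher compression) at the cost of a larger reconstruction error, which is the knob the significance statement refers to.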

Status/Plans: The current implementation applies the compression algorithm to each frame independently. Evaluation of the motion-compensation approach employed by the MPEG algorithm indicated that the inter-frame correlation of an animated sequence cannot be removed using motion-tracking methods developed for video. We plan to develop an inter-frame correlation removal algorithm to improve compression performance, and to provide an interactive visualization environment for remote HPCC users by integrating the compression/decompression system with AVS (the Application Visualization System).
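The planned inter-frame algorithm is not specified in this report. As one simple illustration of removing temporal redundancy without MPEG-style motion tracking, the Python sketch below delta-codes each frame against its predecessor and losslessly compresses the (mostly zero) differences; all names are hypothetical.

    import zlib

    def encode_sequence(frames):
        # Yield a compressed key frame, then compressed byte-wise deltas.
        # Consecutive frames of a slowly evolving visualization differ little,
        # so the deltas compress far better than independent frames.
        prev = None
        for frame in frames:
            if prev is None:
                yield b"K", zlib.compress(frame)
            else:
                delta = bytes((a - b) & 0xFF for a, b in zip(frame, prev))
                yield b"D", zlib.compress(delta)
            prev = frame

    def decode_sequence(records):
        prev = None
        for kind, payload in records:
            data = zlib.decompress(payload)
            if kind == b"K":
                frame = data
            else:
                # Add the delta back onto the previous frame, modulo 256.
                frame = bytes((d + p) & 0xFF for d, p in zip(data, prev))
            prev = frame
            yield frame

In practice a key frame would be reinserted every N frames so that a lost or corrupted delta cannot propagate errors indefinitely.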


Points of Contact:



