Publication
AIAA Aerospace Sciences Meeting and Exhibit 2001
Conference paper

A tractable approach to understanding the results from large-scale 3D transient simulations

Abstract

The data generated by large-scale scientific simulations, such as the Department of Energy's ASCI (Accelerated Strategic Computing Initiative) problems or NASA's HPCC (High Performance Computing & Communications) grand challenges, can easily and quickly overwhelm any available storage resources. This data-size issue places a huge burden on those tasked with assisting scientists, engineers, and designers in understanding the results of these simulations. Following the traditional post-processing visualization approach, the compute, graphics, and disk requirements quickly point toward acquiring a large visualization workstation with terabytes of striped disk arrays. Additional effort is then required to compress the data so that a single calculation can reside on this visualization machine. The end result can limit the interactivity of the visualization, since so much data must be pulled off disk and then expanded. Often, with a transient problem, sub-sampling (in time) is performed. If the algorithm used for post-processing requires small time steps between snapshots (e.g., particle traces), the results can be misleading due to inaccuracy or can simply fail due to numerical divergence. Also, a transition from one spatial topology to another (often an event of great importance) can easily be missed. Another significant problem with traditional post-processing is that the results do not answer the investigator's questions directly but require inference. Another option, the one suggested in this paper, is to perform not traditional scientific visualization on the data but feature extraction. If designed properly, the results can directly answer the scientist's or engineer's questions and reduce the data to be stored by as much as five orders of magnitude. When this is coupled directly with the simulation (co-processing), the output data set, with temporal sub-sampling, becomes small enough to reside on a normal workstation. Moreover, the entire feature-extracted data set will probably fit on the large visualization machine for playback at high temporal fidelity. © 2001 by Robert Haimes & Kirk Jordan.
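
The abstract's co-processing idea can be illustrated with a minimal Python sketch. This is not the authors' implementation; the "solver" update, the feature extractor, and all names below are hypothetical stand-ins. The point it shows is the data-reduction pattern: at every simulation time step a feature-extraction routine collapses the full 3D field to a few summary values, and only those small records are retained, so the full field never has to be written to disk.

    import numpy as np

    def extract_features(field, threshold=0.95):
        # Hypothetical feature extractor: reduce the full 3D field to a few
        # scalars (size, peak, and centroid of the high-valued region).
        # A real extractor (vortex cores, shock surfaces, ...) would be more
        # elaborate, but the reduction pattern is the same.
        mask = field > threshold * field.max()
        centroid = np.argwhere(mask).mean(axis=0)
        return {"count": int(mask.sum()),
                "peak": float(field.max()),
                "centroid": centroid.tolist()}

    def run_simulation(n_steps, shape=(64, 64, 64)):
        # Toy transient "solver" loop with in-situ (co-processing) feature
        # extraction: only the per-step feature records are kept, at full
        # temporal fidelity, instead of the full field at every step.
        rng = np.random.default_rng(0)
        field = rng.random(shape)
        features = []
        for step in range(n_steps):
            field = 0.99 * field + 0.01 * rng.random(shape)  # stand-in solver update
            features.append(extract_features(field))         # co-processing step
        return features

    if __name__ == "__main__":
        feats = run_simulation(n_steps=10)
        print(f"kept {len(feats)} small feature records instead of {len(feats)} full 3D fields")

Each record here is a few dozen bytes, whereas a single 64x64x64 double-precision field is roughly 2 MB; on production-scale grids this gap is what allows the several-orders-of-magnitude reduction the abstract describes.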
