Managing Large Scale Data for Earthquake Simulations
Marcio Faerman, Reagan W. Moore, Jean B. Minster, & Philip J. Maechling
Published February 1, 2007, SCEC Contribution #969
The Southern California Earthquake Center digital library publishes scientific data generated by seismic wave propagation simulations. The output from a single simulation may be as large as 50 terabytes of data and 400,000 files. The total size of the digital library is over 130 terabytes with nearly three million files. We examine the data Grid capabilities required to manage this scale of simulation output, and the implementation of these capabilities in a digital library built on top of the Storage Resource Broker data Grid. We also examine the capabilities that will be needed in the future as the scale of simulation output increases to petabytes of data.
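For a rough sense of what these figures mean per file, the back-of-the-envelope arithmetic below is a minimal sketch, assuming decimal units (1 TB = 10^12 bytes) and rounding "nearly three million files" to 3,000,000; both are illustrative assumptions, not values from the paper.

```python
# Average file sizes implied by the figures quoted in the abstract.
# Assumes decimal units (1 TB = 10**12 bytes) and treats "nearly three
# million files" as 3,000,000 -- both illustrative assumptions.

TB = 10**12
MB = 10**6

single_run_bytes = 50 * TB      # largest single simulation output
single_run_files = 400_000      # files produced by that run

library_bytes = 130 * TB        # total size of the digital library
library_files = 3_000_000       # approximate total file count

print(f"Single run:    {single_run_bytes / single_run_files / MB:.0f} MB per file on average")
print(f"Whole library: {library_bytes / library_files / MB:.0f} MB per file on average")
```

Under these assumptions the sketch prints roughly 125 MB per file for the largest single run and about 43 MB per file across the library as a whole.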
Citation
Faerman, M., Moore, R. W., Minster, J. B., & Maechling, P. J. (2007, February). Managing Large Scale Data for Earthquake Simulations. Presentation at HPDGrid 2006. doi: 10.1007/s10723-007-9072-x.