Managing Large Scale Data for Earthquake Simulations
Marcio Faerman, Reagan W. Moore, Yifeng Cui, Yuanfang Hu, Jing Zhu, Jean B. Minster, & Philip J. Maechling
Published September 2007, SCEC Contribution #1116
The Southern California Earthquake Center digital library publishes scientific data generated by seismic wave propagation simulations. The output from a single simulation may be as large as 50 terabytes of data spread across 400,000 files, and the digital library as a whole holds over 130 terabytes in nearly three million files. We examine the data Grid capabilities required to manage simulation output at this scale, and the implementation of these capabilities in a digital library built on top of the Storage Resource Broker data Grid. We also examine the capabilities that will be needed in the future as the scale of simulation output grows to petabytes of data.
Citation
Faerman, M., Moore, R. W., Cui, Y., Hu, Y., Zhu, J., Minster, J. B., & Maechling, P. J. (2007). Managing Large Scale Data for Earthquake Simulations. Journal of Grid Computing, 5(3), 295-302. doi: 10.1007/s10723-007-9072-x.