Implementing High Performance Computing Simulation Software Acceptance Tests as Scientific Workflows
Philip J. Maechling, Ewa Deelman, & Yifeng Cui
Published 2008, SCEC Contribution #1253
In this paper, we describe a workflow-based method for performing acceptance testing of high performance computing (HPC) simulation software in a heterogeneous computing environment. The method supports verification of computational modeling software, and applying this technique may improve HPC programmer productivity by automating necessary but routine tasks. HPC codes are frequently modified as new capabilities are added, efficiency is improved, and the software is ported to new platforms. Acceptance testing of the software after each modification is desirable. Acceptance tests for HPC simulation codes commonly involve a multi-step process of configuring a reference problem, running a simulation, and comparing the simulation results against a reference solution. To increase the use of acceptance testing during simulation software development, we designed a way to express acceptance tests of simulation software as scientific workflows. This technique uses existing grid-based workflow tools to automate the tasks of configuring, running, and analyzing simulation-based reference problems in a grid-based HPC computing environment. Our technique introduces an acceptance test oracle that is used during the workflow construction process to define specific test cases and to maintain information about the required inputs, the reference solution, and the method for comparing the simulation results against the reference solution. As a case study, we apply our method to acceptance testing of a geophysical HPC computational modeling code and show how the method supports acceptance testing of this code on several heterogeneous HPC computer resources.
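The abstract's core idea, an acceptance test expressed as a configure-run-compare workflow driven by a test oracle, can be illustrated with a minimal Python sketch. This is not the authors' implementation (the paper relies on grid-based workflow tools); the class and function names, the tolerance-based comparison, and the output file name "output.dat" are all hypothetical, chosen only to make the oracle concept concrete.

```python
import subprocess
from pathlib import Path

import numpy as np


class AcceptanceTestOracle:
    """Hypothetical oracle: holds a reference problem's inputs, its
    reference solution, and the rule for judging a simulation result."""

    def __init__(self, input_files, reference_solution, tolerance=1e-6):
        self.input_files = [Path(p) for p in input_files]
        self.reference = np.loadtxt(reference_solution)
        self.tolerance = tolerance

    def compare(self, result_file):
        """Return True if the simulation output matches the reference
        solution within the configured tolerance."""
        result = np.loadtxt(result_file)
        return np.allclose(result, self.reference, atol=self.tolerance)


def run_acceptance_test(simulator_cmd, oracle, workdir):
    """One acceptance-test workflow: stage inputs, run the simulation,
    and let the oracle judge the output."""
    workdir = Path(workdir)
    workdir.mkdir(parents=True, exist_ok=True)

    # Stage the reference-problem inputs into the working directory.
    for f in oracle.input_files:
        (workdir / f.name).write_bytes(f.read_bytes())

    # Run the simulation code; in the paper's setting this step would be
    # a workflow job submitted to a remote grid-based HPC resource.
    subprocess.run(simulator_cmd, cwd=workdir, check=True)

    # Compare the simulation result against the reference solution.
    return oracle.compare(workdir / "output.dat")
```

In this sketch the oracle, not the workflow, owns the test definition: the inputs, the reference solution, and the comparison criterion travel together, so the same configure-run-compare workflow can be reused unchanged across heterogeneous HPC resources.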
Citation
Maechling, P. J., Deelman, E., & Cui, Y. (2008). Implementing High Performance Computing Simulation Software Acceptance Tests as Scientific Workflows. Presentation at International Symposium on Software Testing and Analysis 2008.