Request for Benchmark Applications
Dear NERSC PI,
The procurement of the next generation computing system was recently
initiated at the National Energy Research Scientific Computing Center
(NERSC). The first phase of deployment is scheduled for as early as
calendar year 1999. An essential evaluation criterion in this
procurement will be the performance of a wide range of applications
from the NERSC client community. We are now soliciting key
representative applications from various disciplines in Energy
Research to augment our existing benchmark suite. An application
submitted for consideration may be included if it broadens the
benchmark suite to represent a wider variety of ER codes and satisfies
the benchmark submission guidelines below. The NERSC procurement team
reserves the right to make the final decision on inclusion.
If you have a suitable application and would like to contribute it,
please read the submission guidelines below. Please bear in mind that
transforming an application into a suitable benchmark is a significant
amount of work for both you and the NERSC procurement team.
Please respond with expressions of interest to Adrian Wong
(atwong@nersc.gov) by March 6, 1998.
We recognize that the benchmark results on new architectures are
especially interesting to you, the code owner. However, please note
that vendors generally consider benchmark results to be proprietary,
and we may not be able to pass on the results to you.
Guidelines for Submitting Benchmark Applications
------------------------------------------------
These are general guidelines and not absolutely mandatory. The NERSC
benchmarking team will consider all submissions and make an assessment
on suitability. If an application is important and suitable, the NERSC
benchmarking team will make every effort to include it in the suite.
1. Benchmark Documentation
Documentation that describes the benchmark application, its
unique computational characteristics (e.g., integer intensive) and
how it is representative of the computational requirements for
the scientific discipline.
2. Installation Notes
   Notes on compilation and linking should be included where
   necessary, along with makefiles to build executables automatically.
3. Execution Notes
Notes on the optimal execution environment, number of processors
and overall parallel scalability.
4. Portability
   Application source code should comply as closely as possible
   with the Fortran 90 and ANSI C/C++ language standards.
5. Libraries
   Only standard libraries such as MPI-1, BLAS, and LAPACK should be
   assumed to be externally available. The source code for any other
   required libraries should be submitted and considered integral
   to the submission. Platform-specific library routines should be
avoided.
6. Input Data Sets
Only input data sets of a manageable size (<100 MB) should be
   submitted. Ideally, at least two input data sets should be
   included: a small test run to verify correctness and a more
substantial example that will execute for a reasonable
period. For example, a substantial run may take 30 to 60
minutes on the Cray T3E.
7. Output Files
Corresponding output files that are known to have correct
   results should also be submitted for quality assurance. A script
   that scans the output for key results and indicates correctness
   would be useful; otherwise, please include a note identifying the
   key output fields to check.
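To illustrate the kind of correctness-checking script described in item 7, the following is a minimal sketch; the file names and the "Total energy" output field are purely hypothetical and would be replaced by whatever key results your application actually reports:

```shell
#!/bin/sh
# Hypothetical correctness check for a benchmark run: compare a key
# result field in the output against a known-good reference file.
# The field name "Total energy" and both file names are illustrative.

# Create small sample files so the sketch is self-contained.
cat > run.ref <<'EOF'
iterations: 100
Total energy: -17.52
EOF
cat > run.out <<'EOF'
iterations: 100
Total energy: -17.52
EOF

# Extract the final occurrence of the key field from each file.
key_out=$(grep 'Total energy' run.out | tail -1)
key_ref=$(grep 'Total energy' run.ref | tail -1)

# Report PASS or FAIL depending on whether the key results match.
if [ "$key_out" = "$key_ref" ]; then
    echo "PASS: key result matches reference"
else
    echo "FAIL: key result differs from reference"
fi
```

A real check would typically allow a small numerical tolerance rather than an exact string match, but even a simple script of this shape lets the benchmarking team confirm a run quickly.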
The benchmark submission should consist of a single tar file that
includes all necessary source code for the application and libraries,
makefiles, shell scripts for execution, input data with corresponding
output files, and the benchmark documentation, installation notes,
and execution notes described above. The benchmarking team will
assess the submission against these guidelines.
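As a sketch of how such a submission might be assembled, the commands below lay out a hypothetical directory tree and bundle it into a single tar file; all directory and file names are illustrative only:

```shell
#!/bin/sh
# Hypothetical layout for a benchmark submission; every name here is
# illustrative and would follow your own application's conventions.
mkdir -p mycode/src mycode/input mycode/output mycode/doc
echo "placeholder" > mycode/doc/README    # benchmark documentation
echo "placeholder" > mycode/src/Makefile  # build instructions
echo "placeholder" > mycode/input/small   # small verification data set
echo "placeholder" > mycode/output/small.ref  # known-correct output

# Bundle everything into a single tar file for submission.
tar cf mycode-benchmark.tar mycode

# List the archive contents as a sanity check before sending.
tar tf mycode-benchmark.tar
```

Keeping documentation, sources, inputs, and reference outputs in clearly named subdirectories makes the archive self-explanatory when the benchmarking team unpacks it.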
Please contact Adrian Wong (atwong@nersc.gov) if you have any
further questions.
Sincerely,
Adrian Wong
NERSC Procurement Team