SPEChpc™ 2021 Large Results -- Help
Help for SPEChpc™ 2021 Large Results
This is a powerful engine for fetching results from SPEC.
There are two interfaces to this engine:
- Simple Interface
- The simple interface offers very limited functionality
and relies heavily on default settings.
It can handle many basic inquiries with minimal fuss.
- Configurable Interface
- The configurable interface offers a lot more functionality.
With this interface, it is possible to:
- select which columns you wish to see,
- limit the records returned by regular expressions and/or
numeric criteria over multiple columns,
- specify a multiple-key sort ordering,
- choose which quarterly publications to pull results from,
- and pick one of three formats for the results display.
Most of this help information is for the configurable interface.
These are the fields available
in the current configuration (hpc2021_lrg).
Each configuration is likely to have
a different set of fields available.
[Note: there may be multiple configurations,
each with different fields, for any set of results.
Typically, the more specific a configuration is
to a particular benchmark,
the more fields are available.]
- Benchmark
- The benchmark this result was produced for.
- Hardware Vendor
- The hardware vendor for the system under test.
- System
- The name of the system tested.
- Result (Peak)
- The single-figure-of-merit summary metric.
- Result (Base)
- The baseline (less aggressive) summary metric.
- Compute Cores Enabled
- The number of compute cores used.
- # Chips
- The total number of chips in the system.
- Compute Threads Used
- The total number of compute threads used.
- Compute Nodes Used
- The total number of compute nodes used.
- Node Hardware Vendor(s)
- The hardware vendor(s) for the compute nodes used in the system under test.
- Node Model(s)
- The hardware model(s) for the compute nodes used in the system under test.
- Node CPU Name(s)
- The CPU(s) used in the compute nodes.
- Node Accelerator Model(s)
- The hardware accelerator(s) used in the compute nodes.
- Node Accelerator Vendor(s)
- The vendor(s) of the hardware accelerator(s) used in the compute nodes.
- Interconnect Vendor(s)
- The vendor(s) of the interconnect(s) used in the compute nodes.
- Interconnect Model(s)
- The interconnect model(s) used in the compute nodes.
- Interconnect Topologies
- The interconnect topologies used in the compute nodes.
- Node OS
- The operating system used in the compute nodes.
- Memory
- The total amount of main memory in the system under test.
- Compiler(s)
- The name and version of the compiler(s) (and associated software) used.
- MPI Library
- The name and version of the MPI library used.
- HW Avail
- The date that the hardware for this system is/will be generally available.
- SW Avail
- The date that the software used for this result is/will be generally available.
- System Class
- The system class: homogeneous or heterogeneous.
- Base Ranks
- The number of base MPI ranks used.
- Max Base Ranks
- The maximum number of base MPI ranks used.
- Min Peak Ranks
- The minimum number of peak MPI ranks used.
- Max Peak Ranks
- The maximum number of peak MPI ranks used.
- Base Parallel Model
- The parallel model (e.g., OMP) used in the base results.
- Peak Parallel Model
- The parallel model (e.g., OMP) used in the peak results.
- License
- The number of the license used to generate this result.
- 805 Base
- The required base ratio for the 805.lbm_l benchmark.
- 805 Peak
- The optional peak ratio for the 805.lbm_l benchmark.
- 818 Base
- The required base ratio for the 818.tealeaf_l benchmark.
- 818 Peak
- The optional peak ratio for the 818.tealeaf_l benchmark.
- 819 Base
- The required base ratio for the 819.clvleaf_l benchmark.
- 819 Peak
- The optional peak ratio for the 819.clvleaf_l benchmark.
- 828 Base
- The required base ratio for the 828.pot3d_l benchmark.
- 828 Peak
- The optional peak ratio for the 828.pot3d_l benchmark.
- 834 Base
- The required base ratio for the 834.hpgmgfv_l benchmark.
- 834 Peak
- The optional peak ratio for the 834.hpgmgfv_l benchmark.
- 835 Base
- The required base ratio for the 835.weather_l benchmark.
- 835 Peak
- The optional peak ratio for the 835.weather_l benchmark.
- Tested By
- The people who have produced this result.
- Test Sponsor
- The name of the organization or individual that sponsored the test.
- Test Date
- When this result was measured.
- Published
- The date this result was first published by SPEC.
- Updated
- The date this result was last updated by SPEC, though most updates are clerical rather than significant.
- Disclosure
- Full disclosure report including all the gory details.
[Note: there are two kinds of query forms:
Simple and
Configurable.
Most of the features described here are available
only in the configurable query.]
- Content: Display
- What fields to display.
Each field can be set to
Display,
which will display the entire field,
or to SKIP,
which will cause
the field not to be displayed.
For fields of the string type,
it is also possible to limit the width of the field's display
by choosing one of the X Chars
options.
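To make the effect of these settings concrete, here is a minimal Python sketch
(not part of the engine itself) of how a Display/SKIP choice plus a width limit
might act on a single record; the field names, values, and widths are purely hypothetical.

    # Hypothetical sketch: project a record down to the selected fields,
    # truncating string fields when a character limit is chosen.
    record = {
        "System": "Example Cluster XL",     # hypothetical values
        "Result (Base)": 12.3,
        "Node OS": "Example Linux 9.2",
    }

    # Display settings: an integer means "limit to N chars",
    # "display" means show the whole field; omitted fields are skipped.
    display = {"System": 10, "Result (Base)": "display"}

    projected = {
        field: (str(value)[:limit] if isinstance(limit, int) else value)
        for field, value in record.items()
        if (limit := display.get(field)) is not None
    }
    print(projected)   # {'System': 'Example Cl', 'Result (Base)': 12.3}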
- Content: Criteria
- Limit results to only those that satisfy some criteria.
For each field it is possible to specify some criteria
that will be used to select only certain records
out of the entire dataset.
String criteria can be regular expressions,
numeric fields are compared against the
floating-point values you provide,
and date fields are compared against
the specified month and year.
You may specify criteria for any and all fields,
whether or not those fields will be displayed.
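As an illustration only, the following Python sketch mimics how the three kinds of
criteria (regular expression, numeric, and month/year) might be combined to filter
records; the field names and cut-off values are hypothetical.

    import re
    from datetime import date

    # Hypothetical records using field names from the configuration above.
    records = [
        {"System": "Example Cluster XL", "Result (Base)": 12.3, "HW Avail": date(2022, 6, 1)},
        {"System": "Sample Node 100",    "Result (Base)":  4.7, "HW Avail": date(2021, 1, 1)},
    ]

    name_re     = re.compile(r"Cluster")   # string criterion: regular expression
    min_base    = 10.0                     # numeric criterion: floating-point value
    avail_after = date(2022, 1, 1)         # date criterion: month and year

    matches = [
        r for r in records
        if name_re.search(r["System"])
        and r["Result (Base)"] >= min_base
        and r["HW Avail"] >= avail_after
    ]
    print(len(matches))   # 1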
- Content: Duplicates
- Allows the removal of duplicates, such as
where there are multiple results for the same configuration.
Duplicates are defined as records that have
matching values across a specified set of fields.
Duplicates are then ranked according to
their values in a specified key field.
There are three possible actions for duplicates:
return all records (the default),
return only the result with the latest (or greatest) value,
or return only the result with the earliest (or smallest) value.
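A rough Python sketch of this duplicate handling, with hypothetical match fields and
key field, might look as follows (keeping the record with the latest key value).

    # Records that agree on every "match" field are duplicates;
    # only the one with the latest value in the key field is kept.
    records = [
        {"System": "Example Cluster XL", "Test Date": "2023-04", "Result (Base)": 11.0},
        {"System": "Example Cluster XL", "Test Date": "2023-09", "Result (Base)": 12.3},
        {"System": "Sample Node 100",    "Test Date": "2022-11", "Result (Base)":  4.7},
    ]

    match_fields = ("System",)    # hypothetical: fields that define a duplicate
    key_field    = "Test Date"    # hypothetical: field used to rank duplicates

    kept = {}
    for r in records:
        group = tuple(r[f] for f in match_fields)
        if group not in kept or r[key_field] > kept[group][key_field]:
            kept[group] = r       # keep the record with the latest key value

    print([r["Test Date"] for r in kept.values()])   # ['2023-09', '2022-11']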
- Content: Publication
- Specify in which datasets to look for results.
All SPEC results are published on a quarterly basis;
this allows you to specify the range of quarters
that you are interested in.
Note: there are some quarters where
no results were published for certain benchmarks;
datasets which would have no available results
will not be present in the selection list.
The default settings are for all quarters to be searched.
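As a purely illustrative sketch, selecting a range of quarterly publications amounts
to filtering the list of quarterly datasets; the dataset names below are invented.

    # Hypothetical quarterly dataset names, filtered to a requested range.
    datasets = ["2023q3", "2023q4", "2024q1", "2024q2", "2024q3", "2024q4"]

    start, end = "2024q1", "2024q4"
    selected = [d for d in datasets if start <= d <= end]
    print(selected)   # ['2024q1', '2024q2', '2024q3', '2024q4']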
- Sorting: Column
- This search engine returns its findings in sorted order.
The ordering can be based on up to three keys:
a primary key, a secondary key,
and, if records are still tied, a tertiary key
that settles the remaining ties.
- Sorting: Direction
- For each sort column, you must specify a direction.
Ascending means that the list starts at
the lowest value ("AA", "0", or "Jan-80"),
and Descending starts the list
from the highest value.
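To illustrate the combination of sort columns and directions described above, here is
a small Python sketch (not the engine's own code) that applies a primary, secondary,
and tertiary key, each with its own direction, to some hypothetical records.

    # Three-key sort: a stable sort applied from the least significant key
    # to the most significant one reproduces the primary/secondary/tertiary order.
    records = [
        {"Hardware Vendor": "VendorA", "System": "Node 2", "Result (Base)": 12.3},
        {"Hardware Vendor": "VendorA", "System": "Node 1", "Result (Base)":  9.8},
        {"Hardware Vendor": "VendorB", "System": "Node 1", "Result (Base)": 15.1},
    ]

    keys = [("Hardware Vendor", "asc"),    # primary
            ("System", "asc"),             # secondary
            ("Result (Base)", "desc")]     # tertiary

    for field, direction in reversed(keys):
        records.sort(key=lambda r: r[field], reverse=(direction == "desc"))

    print([r["System"] for r in records])  # ['Node 1', 'Node 2', 'Node 1']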
- Format: Output Format
- Results may be returned in one of three formats.
HTML3.2 Table
- Uses HTML table markup,
which allows your browser to arrange the display.
Preformatted Text
- Has the server format
the display of the data returned.
This is most useful when a large number of fields
are to be returned, because most browsers do not perform
well when there are a large number of columns in a table.
Comma Separated Values (CSV)
- May not look pretty, but if saved to a local file,
the results can easily be loaded into any spreadsheet application,
where you can arrange, format, and calculate to your heart's
delight.
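If you choose the CSV format, the saved file can also be processed programmatically
rather than in a spreadsheet; a minimal Python sketch, with invented column values,
might look like this.

    import csv
    import io

    # Hypothetical CSV output saved from the engine (column names are examples).
    csv_text = "\n".join([
        "System,Result (Base),Result (Peak)",
        "Example Cluster XL,12.3,13.1",
        "Sample Node 100,4.7,5.0",
    ])

    rows = list(csv.DictReader(io.StringIO(csv_text)))
    best = max(rows, key=lambda r: float(r["Result (Base)"]))
    print(best["System"])   # Example Cluster XL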
This search engine is designed to be controlled by two basic parts:
the configuration used, and the datasets searched.
The configuration controls many of the aspects of this engine.
It specifies which datasets are appropriate,
and which views of those datasets are supported.
The datasets contain the available data for published results.
SPEC breaks its publications into quarterly 'buckets';
thus there is a different dataset for each calendar quarter.
This allows you to select how far back into history you want to go.
If you want only the last year,
specify a range covering the last four quarters;
if you want to know about results performed
during the last half of 1995,
you may specify the range covering
the September and December issues in '95; etc.
The default settings cover all available quarters.
There may be multiple configurations for the same datasets.
Typically, the more focused a configuration is
towards a particular benchmark,
the more information about each result is available.
In other words, the summary configuration views
commonly support only the highest level information about a system
and its result;
the more specific configuration would support columns including
system configuration details,
the specific software versions,
and/or individual component benchmark results.
Finally, most configurations support links to the reporting page
for each result as the last column of the data returned.
These reporting pages (available in a variety of formats),
contain the full disclosure for each particular result.
Consult these pages to learn all the details about a result.
This engine supports five different modes of operation:
Help
- The current mode, what you are looking at right now.
Displays the available help information about the engine
and descriptions of the fields in the current configuration.
Simple
- The starting interface.
Offers a simple form for obtaining results
using mostly default settings.
Form
- The configurable interface.
Offers a very configurable interface to the available results.
Fetch
- The main workhorse.
Takes the configuration and settings from the
simple and form interfaces,
and performs the desired lookups and displays the results.
Dump
- Brute force.
Changes settings to return all available data
and then calls fetch.
Returns all data in the current configuration;
more data may be available in other configurations,
and all the details are in each result's disclosure page.
Because this is usually more data than browsers can handle
as tables, these dumps are available in only two forms:
preformatted text,
which can be easily scrolled,
and comma separated values,
which can be saved locally
and loaded into a spreadsheet application.
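For example, a dump saved locally in CSV form can be explored in a spreadsheet or,
as in this hypothetical Python/pandas sketch (the file name and column names are
assumptions), analysed directly.

    import pandas as pd

    # Hypothetical: a CSV dump saved locally, then summarised by vendor.
    df = pd.read_csv("hpc2021_lrg_dump.csv")
    print(df.groupby("Hardware Vendor")["Result (Base)"].max())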
Further Assistance
If you have comments or questions not addressed here,
please
contact the SPEC office.