The Graphics Performance Characterization Project Group Rules
Version 2.11
Last Updated: 04/20/2010
- Overview
- Rules Inheritance
- All rules declared in "The
Graphics and Workstation Performance Group (SPEC/GWPG): Rules
For Project Groups" document (known hereinafter as the GWPG Project Groups Ruleset)
shall apply, unless specifically overruled by a rule in this
document.
- Rules declared in this
document shall apply in addition to the rules declared in the
GWPG Project Groups Ruleset.
- General Philosophy
- The Graphics Performance
Characterization Project of SPEC/GWPG (henceforth abbreviated
as SPECgpcSM) believes the user community will
benefit from an objective series of tests, which can serve as a
common reference and be considered as part of an evaluation
process.
- The SPECgpc seeks to
develop benchmarks for generating accurate graphics performance
measures in an open, accessible and well-publicized manner.
- The SPECgpc wishes to
contribute to the coherence of the field of graphics
performance measurement and evaluation so that vendors will be
better able to present well-defined performance measures; and
customers will be better able to compare and evaluate vendors'
products and environments.
- The SPECgpc will provide
formal beta software to members and final software releases to
the public in a timely fashion.
- Hardware and software
used to run the SPECgpc benchmarks must provide a suitable
environment for running typical graphics programs.
- SPECgpc reserves the
right to adapt its benchmarks as it deems necessary to preserve
its goal of fair and useful benchmarking (e.g. removing a
benchmark, or modifying benchmark code or data). If a change is
made to the suite, SPECgpc will notify the appropriate parties
(i.e. SPECgpc members and users of the benchmark) and SPECgpc
will re-designate the metrics (e.g. changing the metric from
UGNX-01 composite to UGNX-02 composite). In the case that a
benchmark is removed in whole or in part, SPECgpc reserves the
right to republish in summary form "adapted" results
for previously published systems, converted to the new metric.
In the case of other changes, such a republication may
necessitate re-testing and may require support from the
original test sponsor.
- Overview of Optimizations
- SPECgpc is aware of the
importance of optimizations in producing the best system
performance. SPECgpc is also aware that it is sometimes hard to
draw an exact line between legitimate optimizations that happen
to benefit SPECgpc benchmarks and optimizations that
specifically target SPECgpc benchmarks. However, with the list
below, SPECgpc wants to raise the awareness of implementers and
end-users regarding unwanted benchmark-specific optimizations
that would be incompatible with SPECgpc's goal of fair
benchmarking.
- To ensure that results
are relevant to end-users, SPECgpc expects that the hardware
and software implementations used for running SPECgpc
benchmarks adhere to a set of general rules for optimizations.
- General Rules for Optimization
- Optimizations must
generate correct images for a class of programs, where the
class of programs must be larger than a single SPECgpc
benchmark or SPECgpc benchmark suite. Correct images are those
deemed by the majority of the SPECgpc electorate to be
sufficiently adherent to the respective graphics API
specification for the targeted end-user community (e.g. users
of OpenGL on PDAs would have lower quality expectations than
those using high-end workstations).
- Optimizations must
improve performance for a class of programs where the class of
programs must be larger than a single SPECgpc benchmark or
SPECgpc benchmark suite and applicable to at least one end user
application. For any given optimization a system must generate
correct images with and without said optimization. An
optimization must not reduce system stability.
- The vendor must encourage use of the
implementation for general purposes (not just for running a
single SPECgpc benchmark or SPECgpc benchmark suite). As an
indicator
that the implementation is suitable for general use, graphics
configurations submitted for the SPECgpc benchmark suite must
be able to run the corresponding SPECapc application benchmarks
if applicable.
- The implementation is
generally available, documented and supported by the providing
vendor.
- It is expected that
vendors would endorse the general use of these optimizations by
customers who seek to achieve good application performance.
- No pre-computed (e.g.
driver cached) images, geometric data, or graphics state may be
substituted within a SPECgpc benchmark on the basis of
detecting that said benchmark is running (e.g. pattern matching
of command stream or recognition of benchmark's name).
- Every OpenGL
implementation in both immediate and display list mode must
fully process every GL element presented to it that will impact
the frame buffer and GL state.
- Differences to the frame
buffer between immediate and display list modes must not exceed
0.01% of the number of pixels in the window.
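The 0.01% tolerance above can be checked mechanically. The sketch below is a hypothetical helper, not part of any SPEC tool; it assumes two same-size frame grabs of the window, one from immediate mode and one from display-list mode, represented as flat lists of (r, g, b) tuples:

```python
def within_tolerance(immediate_pixels, display_list_pixels, max_fraction=0.0001):
    """Return True if the two frame grabs differ in at most
    max_fraction (0.01% by default) of their pixels."""
    if len(immediate_pixels) != len(display_list_pixels):
        # The rule presumes both grabs cover the same full window.
        raise ValueError("frame grabs must cover the same window size")
    # Count pixels whose color values differ between the two modes.
    differing = sum(1 for a, b in zip(immediate_pixels, display_list_pixels)
                    if a != b)
    return differing <= max_fraction * len(immediate_pixels)
```

For a 1280x1024 window (1,310,720 pixels) the rule thus permits at most 131 differing pixels between the two rendering modes.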
- In the case where it
appears the guidelines in this document have not been followed,
SPECgpc may investigate such a claim and request that the
optimization in question (e.g. one using SPECgpc
benchmark-specific pattern matching) be removed and the results
resubmitted. Or, SPECgpc may request that the vendor correct
the deficiency (e.g. make the optimization more general purpose
or correct problems with image generation) before submitting
results based on the optimization.
- Benchmarks
- Benchmark Acceptance
- Benchmark components are
defined as
- code sets (e.g.
SPECviewperf®),
- run rules, scripts and
associated data sets (e.g. viewsets).
- New or modified benchmark
components require a 2/3-majority vote of the SPECgpc
electorate to be accepted for publication.
- A minimum 3-week review
period is required for new or significantly modified benchmark
components.
- At the end of the review
period a vote will be called to approve the proposed changes.
- An amendment to a
benchmark component during the review period must be
unanimously accepted. If not, the review period shall be
restarted.
- Any future SPECviewperf
viewset author(s) may require that selected conformance tests
be passed prior to submission of results for that viewset.
- Benchmark Code Versioning
- Benchmark code is defined
as the set of source code required to build and run a benchmark
executable (e.g. SPECviewperf).
- SPECviewperf benchmark
code uses the following version coding: M.m.p (e.g. 8.0.1),
where M is the major release number, m is the minor release
number and p is the patch level.
- The major release number
is only incremented when large amounts of code are changed and
the scripting language is dramatically changed as a result --
backward compatibility is highly unlikely when moving scripts
or data sets between major releases (e.g. running v2 scripts
on a v3 executable would almost certainly fail).
- The minor release number
is incremented if a small set of code is replaced or removed,
but the standard, unchanged scripts and data sets, as a whole,
must run on the new version (though perhaps with different
performance).
- Patch releases can
contain additions of new properties and additions of new
attributes to existing properties, but cannot change or remove
any existing properties, attributes or functionality. These
are typically used for bug fixes, small enhancements and so
forth.
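The M.m.p scheme and its compatibility expectations can be illustrated with a short sketch. The helper names below are hypothetical, not part of SPECviewperf:

```python
def parse_version(text):
    """Split an 'M.m.p' string such as '8.0.1' into integers."""
    major, minor, patch = (int(part) for part in text.split("."))
    return major, minor, patch

def scripts_expected_to_run(old, new):
    """Per the rules above: standard scripts and data sets are expected
    to keep running across minor and patch releases, but backward
    compatibility is unlikely across a major release."""
    return parse_version(old)[0] == parse_version(new)[0]
```

For example, scripts written for 8.0.1 are expected to run on 8.1.0, but v2 scripts would almost certainly fail on a v3 executable.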
- SPECviewperf Viewset Versioning
- The version of a
SPECviewperf viewset should be incremented if:
- changes to SPECviewperf
affect the performance of the viewset,
- or changes to the
viewset script affect performance,
- or if the viewset data
changes,
- or if rule changes
affect the acceptance criteria.
- New results for the
previous version of a viewset will no longer be published.
- SPECviewperf Benchmark Release
- On the release date of a new benchmark,
it replaces the previous benchmark on the public website. Submissions
for the previous benchmark will no longer be accepted.
- Benchmark Run Rules
- Benchmark Run Rules
- The system under test
must perform all of the respective graphics API's functionality
requested by the benchmark with the exception that the system
does not have to support dithering.
- The system under test
must be OpenGL Conformant for the pixel format or visual used
by the benchmark.
- Settings for environment
variables, registry variables and hints must not disable
compliant behavior.
- No interaction is allowed
with the system under test during the benchmark, unless
required by the benchmark.
- The system under test
cannot skip frames during the benchmark run.
- It is not permissible to
change the system configuration during the running of a given
benchmark. For example, one cannot power off the system, make
some changes, then power back on and run the rest of the
benchmark.
- Screen grabs for
SPECviewperf will be full window size.
- The color depth used must
be at least 24 bits (true color), with at least 8 bits of red,
8 bits of green and 8 bits of blue.
- If a depth buffer is
requested, it must have at least 16 bits of resolution.
- The display raster
resolution must be at least 1280 pixels by 1024 pixels.
- The monitor refresh rate
must be at least 75Hz. This requirement does not apply to
digital flat panel displays.
- The monitor must support
the stated resolution and refresh rate and must fully display
all of the benchmark tests being submitted.
- Results to be made public
must be run by official scripts that may not be changed, with
the following exceptions (which must be documented if not the
default):
- In SPECviewperf:
- specific selection of
visual/pixel format on a per-test basis
- the multithreading flag
(-th) on approved multi-threading viewsets
- Visual/pixel format
required:
- May be selected on a
per-test basis by submission of the viewset script.
- If RGB visual/pixel
format is requested, it must have at least eight bits of red,
eight bits of green and eight bits of blue.
- If destination alpha is
requested, it must have at least 1 bit.
- If depth buffer is
requested, it must have at least 16 bits of resolution.
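The minimum bit depths above (for both the general color-depth rule and the per-test visual/pixel format rules) can be summarized in one check. The function name and parameters are illustrative; real submissions are validated during review, not by this code:

```python
def pixel_format_ok(red_bits, green_bits, blue_bits,
                    alpha_bits=None, depth_bits=None):
    """True if a requested visual/pixel format meets the stated minimums:
    at least 8 bits each of red, green and blue; at least 1 bit of
    destination alpha if alpha is requested; at least 16 bits of depth
    if a depth buffer is requested (None means 'not requested')."""
    if min(red_bits, green_bits, blue_bits) < 8:
        return False
    if alpha_bits is not None and alpha_bits < 1:
        return False
    if depth_bits is not None and depth_bits < 16:
        return False
    return True
```

So an RGB565 format fails the floor, while an 8-8-8 format with a 24-bit depth buffer passes.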
- Screen resolution must be
large enough to run the individual tests at their requested
window size, with no reduction or clipping of test window.
- Tests may be run with or
without a desktop/window manager, but must be run on some
native windowing system.
- Submission and Review Rules
- Submission Content Rules
- These rules are specific
to SPECgpc and shall apply in addition to the Submission
Content Rules in the GWPG Project Groups Ruleset.
- A SPECviewperf submission
can be for one or more viewsets per configuration.
- The SPECviewperf binary
must be submitted if it is not one of the standard binaries.
- The
SPECviewperf submission upload file must have the structure
defined in Figure 1:
Figure 1
- Submitters are not
required to submit depth images with a submission. Submitters
must retain depth images to be available upon request by any
committee member during the review period. After the review
period, submitters are not required to retain depth images.
- Submission Process Rules
- These rules are specific
to SPECgpc and shall apply in addition to the Submission
Process Rules in the GWPG Project Groups Ruleset.
- The submission file names
must contain gpc_v for SPECviewperf, consist of all lower-case
letters, and contain no '.' except prior to the zip or tar file
extension (e.g. intel_gpc_v_jun99_v0.zip). The file version is
denoted prior to the file extension. The initial file version
is v0. Resubmitted files must increment the version number.
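The naming rule above lends itself to a regular-expression check. This is a hedged sketch (the pattern and function are assumptions, not an official SPEC validator), using the example name intel_gpc_v_jun99_v0.zip from the rule:

```python
import re

# Lower-case name containing 'gpc_v', a '_vN' version tag, and a single
# '.' immediately before the zip or tar extension.
SUBMISSION_NAME = re.compile(r"^[a-z0-9_]*gpc_v[a-z0-9_]*_v(\d+)\.(zip|tar)$")

def valid_submission_name(name):
    """True if a SPECviewperf submission file name follows the rules above."""
    return SUBMISSION_NAME.match(name) is not None
```

Under this sketch, a resubmission would simply bump the version tag, e.g. intel_gpc_v_jun99_v1.zip.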
- Review Period Rules
- These rules are specific
to SPECgpc and shall apply in addition to the Review Period
Rules in the GWPG Project Groups Ruleset.
- Reviewers will decide if
the image quality of the submission is sufficiently adherent to
the respective graphics API's specification to satisfy the
intended end user's expectations. If a reviewer rejects the
quality of an image for a stated reason, the submitter can ask
for a vote of the full SPECgpc electorate. In case of a tie the
submission is rejected.
- System configurations
submitted for the SPECgpc benchmark suite must be able to run
the corresponding SPECapc application benchmarks if applicable.
If this criterion is not met the submission will be rejected.
Adoption
Changes for version 1.1 adopted June 10, 1999
Changes for version 1.2 adopted January 12, 2000
V1.4 changes -- 5.02 (d) added
V1.5 changes -- 4.01.i.2(2), 4.01.i.2(4), 5.02.w
V1.6 changes -- 1.03.c, 5.04.o - Adopted by the SPECopc on January 23, 2003
V1.17 Adopted by the SPECopc on August 13, 2004
V1.18 Adopted by the SPECopc on October 21, 2004
V1.19 Adopted by the SPECopc on April 19, 2005
V1.20 Adopted by the SPECopc on October 20, 2005
V2.00 Adopted by the SPECopc on January 26, 2006
V2.10: Adopted on 13 September 2007 (reflects SPECopc->SPECgpc name change and wider API charter scope)
V2.11: Adopted on 20 April 2010 - Added section II.4 and rule II.4.a.