The Application Performance Characterization Project
Committee Rules
Version 2.05
Last Updated: 04/18/2007
- Overview
  - General Philosophy
    - All rules declared in "The Graphics Performance Characterization Group (SPEC/GPC): Rules For Project Groups" document (known hereafter as the GPC Project Groups Ruleset) apply, unless specifically overruled by a rule in this document.
    - Within SPEC's Graphics Performance Characterization (GPC) Group there was a strong belief that it is important to benchmark graphics performance based on actual applications. Application-level benchmarks exist, but they are not standardized and they do not cover a wide range of application areas. Thus, the Application Performance Characterization (SPECapcSM) Project was created within the GPC to create a broad-ranging set of standardized benchmarks for graphics-intensive applications.
    - SPECapc seeks to develop benchmarks for generating accurate application-level graphics performance measures in an open, accessible and well-publicized manner.
    - SPECapc wishes to contribute to the coherence of the field of application performance measurement and evaluation, so that vendors will be better able to present well-defined performance measures and customers will be better able to compare and evaluate vendors' products and environments.
    - SPECapc will provide formal beta benchmarks to members and final benchmark releases to the public in a timely fashion.
    - Hardware and software used to run the SPECapc benchmarks must provide a suitable environment for running typical (not just benchmark) workloads for the applications in question.
    - SPECapc reserves the right to adapt its benchmarks as it deems necessary to preserve its goal of fair and useful benchmarking (e.g., removing a benchmark or modifying benchmark code or data). If a change is made to the suite, SPECapc will notify the appropriate parties (i.e., SPECapc members and users of the benchmark) and will re-designate the benchmark by changing its name and/or version. If a benchmark is removed in whole or in part, SPECapc reserves the right to republish in summary form "adapted" results for previously published systems, converted to the new metric. In the case of other changes, such a republication may necessitate re-testing and may require support from the original test sponsor.
  - Overview of Optimizations
    - SPECapc is aware of the importance of optimizations in producing the best system performance. SPECapc is also aware that it is sometimes hard to draw an exact line between legitimate optimizations that happen to benefit SPECapc benchmarks and optimizations that specifically target SPECapc benchmarks. With the list below, however, SPECapc wants to make implementers and end-users more aware of unwanted benchmark-specific optimizations that would be incompatible with SPECapc's goal of fair benchmarking.
    - To ensure that results are relevant to end-users, SPECapc expects that the hardware and software implementations used for running SPECapc benchmarks adhere to a set of general rules for optimizations.
  - General Rules for Optimization
    - Optimizations must generate correct images and results for the application under test, for both the benchmark case and similar cases. Correct images and results are those deemed by the majority of the SPECapc electorate, potentially with input from the associated independent software vendor (ISV) and/or end-users, to be sufficiently adherent to the intent behind the application.
    - Optimizations must improve performance for a class of workloads, where the class of workloads must be larger than a single SPECapc benchmark or SPECapc benchmark suite.
    - For any given optimization, a system should generate correct images with and without said optimization. An optimization should not reduce system stability.
    - The vendor encourages the implementation for general use (not just for running a single SPECapc benchmark or SPECapc benchmark suite).
    - The implementation is generally available, documented and supported by the providing vendor.
    - Where it appears that the above guidelines have not been followed, SPECapc may investigate such a claim and request that the optimization in question (e.g., one using SPECapc benchmark-specific pattern matching) be removed and the results resubmitted. Alternatively, SPECapc may request that the vendor correct the deficiency (e.g., make the optimization more general purpose or correct problems with image generation) before submitting results based on the optimization.
    - It is expected that system vendors would endorse the general use of these optimizations by customers who seek to achieve good application performance.
    - No pre-computed (e.g., driver-cached) images, geometric data, or state may be substituted within a SPECapc benchmark on the basis of detecting that said benchmark is running (e.g., pattern matching of the command stream or recognition of the benchmark's name).
- Benchmarks
  - Benchmark Acceptance
    - Benchmark components are defined as:
      - a specific revision of an application,
      - run rules, scripts and associated data sets.
    - New or modified benchmark components require a 2/3-majority vote to be accepted for publication. Selection of datecode versions of a specific revision of an application is by majority vote.
    - A minimum 3-week review period is required for new or significantly modified benchmark components.
    - At the end of the review period a vote will be called to approve the proposed changes.
    - An amendment to a benchmark component during the review period must be unanimously accepted. If not, the review period shall be restarted.
  - Benchmark Code Versioning
    - Benchmarks use the version coding M.m (e.g., SPECapcSM for Pro/ENGINEER 20.0 v1.1), where M is the major release number and m is the minor release number. A version-compatibility sketch appears at the end of this subsection.
    - The major release number is only incremented when large amounts of code are changed and the scripting language is dramatically changed as a result -- backward compatibility is highly unlikely when moving scripts or data sets between major releases (e.g., running v2 scripts on a v3 executable would almost certainly fail).
    - The minor release number is incremented if some small set of code is replaced or removed, but the standard, unchanged scripts and data sets, as a whole, must run on the new version (though perhaps with different performance).
    - When there is a new major release of a benchmark, submissions using the previous release will be accepted for at least one submission cycle. When a superseded benchmark is retired, the associated run rules will be archived in an "archived benchmark run-rules document" posted on the Previous SPECapc Benchmarks web page.
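    As an illustration only (not part of any benchmark; the function names and the compatibility rule encoded here are assumptions drawn from the rules above), a minimal Python sketch that parses an M.m version tag and applies the stated compatibility expectation:

      import re

      def parse_version(tag):
          # "M.m" -> (major, minor); raises ValueError on malformed tags.
          match = re.fullmatch(r"v?(\d+)\.(\d+)", tag.strip())
          if match is None:
              raise ValueError("not an M.m version tag: %r" % tag)
          return int(match.group(1)), int(match.group(2))

      def scripts_expected_to_run(script_tag, executable_tag):
          # Per the versioning rules, only a major-number change breaks
          # script/data-set compatibility; minor bumps must still run.
          return parse_version(script_tag)[0] == parse_version(executable_tag)[0]

      assert scripts_expected_to_run("1.1", "1.2")      # minor bump: still runs
      assert not scripts_expected_to_run("2.0", "3.0")  # major bump: likely fails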
- Submission, Review and Publication
  - General Benchmark Run Rules
    - The system under test must correctly perform all of the operations being requested by the application during the benchmark.
    - No changes to any files associated with the benchmark are permitted except as noted in the benchmark-specific rules (section 5 of this document).
    - The entire display raster must be available for use by the application being benchmarked.
    - It is not permissible to override the intended behavior of the application through any means including, but not limited to, registry settings or environment variables.
    - No interaction is allowed with the system under test during the benchmark, unless required by the benchmark.
    - The system under test cannot skip frames during the benchmark run.
    - It is not permissible to change the system configuration during the running of a given benchmark. That is, one cannot power off the system, make some changes, then power back on and run the rest of the benchmark.
    - Results submitted must be obtained using the scripts, models, and application revisions which are specified for that submission cycle by the SPECapc.
    - The color depth used must be at least 24 bits (true color), with at least 8 bits of red, 8 bits of green and 8 bits of blue, unless otherwise specified.
    - The display raster resolution must be at least 1280 pixels by 1024 pixels, unless otherwise specified.
    - The monitor refresh rate must be at least 75 Hz. This requirement does not apply to digital flat panel displays, unless otherwise specified. (A sketch for checking these display settings appears after this list.)
    - The border width of the windows created during the benchmark shall not exceed 10 pixels, unless otherwise specified.
    - The monitor used in the benchmark must support the stated resolution and refresh rate.
    - The benchmark must successfully obtain all requested window sizes, with no reduction or clipping of any benchmark-related windows. Windows created by the benchmark must not be obscured on the screen by anything other than other elements created by the benchmark.
    - Tests may be run with or without a desktop/window manager if the application allows this, but must be run on some native windowing system.
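    For pre-run sanity checking only, the following minimal Python sketch (an assumption: it uses the pywin32 package on Windows and is not an official SPECapc tool) reads the current display mode and compares it against the floors above:

      import win32api
      import win32con

      def current_display_mode():
          # Query the current mode of the primary display (Windows only).
          dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
          return dm.PelsWidth, dm.PelsHeight, dm.BitsPerPel, dm.DisplayFrequency

      def meets_run_rules(min_w=1280, min_h=1024, min_bpp=24, min_hz=75):
          w, h, bpp, hz = current_display_mode()
          # The 75 Hz floor does not apply to digital flat panels; that
          # exception cannot be detected here and must be judged manually.
          return w >= min_w and h >= min_h and bpp >= min_bpp and hz >= min_hz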
  - General Submission Content Rules
    - The submission upload file structures are defined in the benchmark-specific section below.
  - Submission Process Rules
    - The submission file names are detailed below under the benchmark-specific rules.
  - Review Period Rules
    - Reviewers will decide if the image quality and results of the submission are sufficiently correct with respect to the intent of the ISV to satisfy the intended end-users' expectations.
- SPECapc Benchmark Specific Rules and Procedures
  - Solid Edge V14
    - The benchmark must be run using Solid Edge V14.00.00.70.
    - The application can be run with Tools-Options-View Graphics Display set to Graphics Card Driven, Software Driven or Backing Store. The choice must be documented in the notes portion of the results.
    - Application settings must not be changed from the defaults set by the benchmark installation. Settings that must not be changed include, but are not limited to:
      - Culling
      - Wireframe display in Move Part command
      - Arc smoothness (3)
    - The submission must contain the file result.txt that is generated during the benchmark run.
    - The directory structure of the submission must be as follows:
      .../company-name/system_1/SEV14/result.txt
      .../company-name/system_2/SEV14/result.txt
      etc...
    - The submission file must be named company_apc_SolidEdgeV14_vN.zip, where company is the member company or organization name in lower case and vN is the file version (e.g., hp_apc_SolidEdgeV14_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented. (A sketch for validating this naming pattern appears below.)
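    As an informal aid (not part of the official submission process; the allowed company-name character set is an assumption), a minimal Python sketch that validates a submission file name against the pattern used throughout this section:

      import re

      def valid_submission_name(filename, benchmark_tag):
          # company_apc_<benchmark>_vN.zip, company in lower case, N = 0, 1, 2, ...
          # The lower-case-plus-digits-plus-hyphen company charset is assumed.
          pattern = r"[a-z0-9\-]+_apc_%s_v\d+\.zip" % re.escape(benchmark_tag)
          return re.fullmatch(pattern, filename) is not None

      assert valid_submission_name("hp_apc_SolidEdgeV14_v0.zip", "SolidEdgeV14")
      assert not valid_submission_name("HP_apc_SolidEdgeV14_v0.zip", "SolidEdgeV14")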
  - SolidWorks 2005
    - The benchmark must be run using SolidWorks 2005 service pack 0.
    - The application window size must not be changed from its initial size.
    - The submission must contain a results.txt file generated by running the benchmark. Its contents are extracted from the file apcresultsN.txt, generated by the benchmark GUI, where N is the number of the benchmark test run.
    - The submission will be derived from the best composite generated by the default 5 benchmark runs, as controlled and reported by the benchmark GUI.
    - The appearance of the application's Quick Tips / Dynamic Help box, or other pop-ups that do not dismiss themselves, will cause the benchmark run to be invalid. The benchmark FAQ has information on how to prevent this.
    - The directory structure of the submission must be as follows:
      .../company-name/system_1/sw2005/results.txt
      .../company-name/system_2/sw2005/results.txt
      etc...
    - The submission file must be named company_apc_sw2005_vN.zip, where company is the member company or organization name in lower case and vN is the file version (e.g., ibm_apc_sw2005_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
  - 3ds max 7
    - The benchmark must be run using 3ds max version 7, service pack 0 (i.e., unpatched).
    - The application windows will be visible as specified:
      - The "3ds max 7" window is maximized and fills the screen (the window is not occluded by any other windows). Note: Task-bar auto-hide should be enabled.
    - Drivers and custom drivers must be configured (explicitly or implicitly) to use the following settings (see the sketch after this list):
      - GL_TEXTURE_MIN_FILTER = GL_LINEAR_MIPMAP_LINEAR
      - GL_TEXTURE_MAG_FILTER = GL_LINEAR
      - Textures cannot be modified, e.g., reduced in size or depth.
      - Wireframe objects must NOT be drawn using triangle strips.
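    For illustration, a minimal PyOpenGL sketch (assumed tooling, not part of the benchmark; it requires a current OpenGL context and a bound texture) that sets the required filter state:

      from OpenGL.GL import (
          glTexParameteri, GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
          GL_TEXTURE_MAG_FILTER, GL_LINEAR_MIPMAP_LINEAR, GL_LINEAR)

      def apply_required_filters():
          # Trilinear minification and bilinear magnification, exactly as
          # the run rules require, on the currently bound 2D texture.
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR)
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)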
    - The following Viewport parameters must be checked (i.e., enabled) in the 3ds max 7 Customize->Preferences menu (Viewports tab):
      - Backface cull on object creation
      - Mask viewport to safe region
      - Update background while playing
      - Display world axis
    - The following Viewport parameters must NOT be checked in the 3ds max 7 Customize->Preferences menu (Viewports tab):
      - Attenuate lights
      - Filter environment backgrounds
    - The benchmark can be run using either OpenGL or Direct3D.
      - For OpenGL submissions, an application graphics driver or plug-in may be used, but the images captured from the benchmark must not differ from the supplied 3ds max OpenGL driver's images by more than 5% (a comparison sketch follows this item). If the application driver or plug-in's images do not match, the submitter may apply for a waiver based on documented errors in the 3ds max OpenGL driver. The availability and support of an application driver or plug-in shall not affect the submission's "single supplier" or "parts built" status.
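      The rules do not define how the 5% difference is measured; as one plausible reading (an assumption, not the committee's definition), the following Python sketch uses the Python Imaging Library to report the percentage of pixels that differ at all between a captured image and a reference image:

        from PIL import Image, ImageChops

        def percent_pixels_differing(captured_path, reference_path):
            captured = Image.open(captured_path).convert("RGB")
            reference = Image.open(reference_path).convert("RGB")
            diff = ImageChops.difference(captured, reference)
            width, height = diff.size
            changed = sum(1 for pixel in diff.getdata() if pixel != (0, 0, 0))
            return 100.0 * changed / (width * height)

        # Under this reading, a submission passes when
        # percent_pixels_differing(...) <= 5.0 for every captured frame.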
    - The submission must contain the files result.txt and result.xls that are generated during the benchmark run.
    - The directory structure of the submission must be as follows:
      .../company-name/system_1/3dsmax7/result.txt
      .../company-name/system_1/3dsmax7/result.xls
      .../company-name/system_2/3dsmax7/result.txt
      etc...
    - The submission file must be named company_apc_3dsmax7_vN.zip, where company is the member company or organization name in lower case and vN is the file version (e.g., hp_apc_3dsmax7_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
    - Submitters are not required to include pixel-comparison images.
    - A committee member raising a challenge to a submission can employ pixel-comparison images to support the challenge. However, reference and grabbed images must be generated on a system whose GPU, graphics and custom drivers, and settings are identical to the submitted configuration being challenged.
  - Maya 6.5
    - The benchmark must be run using Maya 6.5.
    - The benchmark script must be run with the command 'mayaTest(3)', where '3' is the number of runs.
    - The submission must contain the results.txt submission file as well as the scoring spreadsheet used to calculate the result.
    - The directory structure of the submission must be as follows:
      .../company-name/system_1/maya65/results.txt
      .../company-name/system_1/maya65/MayaResults.xls
      etc...
    - The submission file must be named company_apc_maya65_vN.zip, where company is the member company or organization name in lower case and vN is the file version (e.g., ibm_apc_maya65_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
  - Pro/ENGINEER Wildfire 2.0
    - The benchmark must be run using build "M160" of Pro/ENGINEER Wildfire 2.0. Submissions may be made with 32-bit or 64-bit versions of M160. However, submissions made with 32-bit Pro/ENGINEER Wildfire 2.0 on Windows XP 64-bit Edition are prohibited.
    - The config.pro file must be used as-is and may not be modified or overridden.
    - The script files "utilities\runbench.bat" or "utilities/runbench.csh" may be modified as necessary to enable execution of the benchmark on the system being tested. If modified, the modified version must be included in the benchmark submission.
    - The submission must contain the "proe_result.txt" file (as generated by the "proescore" program) as well as the corresponding "trail.txt" file generated from running the benchmark. Both of these files may be found in the "results" directory after a successful run of the benchmark. The first section of the "proe_result.txt" file must be edited to reflect the system configuration.
    - There must be no license-related warnings in the "trail.txt" file (a scanning sketch follows this item).
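    As an informal pre-submission aid (an assumption: the exact wording of Pro/ENGINEER license warnings is not specified here, so matches should be reviewed by hand), a minimal Python sketch that flags suspect lines in trail.txt:

      def license_warning_lines(trail_path):
          # Flag any line that mentions a license together with warning-like
          # wording; the keyword list is a guess, not the ISV's message text.
          flagged = []
          with open(trail_path, "r", errors="replace") as trail:
              for number, line in enumerate(trail, start=1):
                  lowered = line.lower()
                  if "licens" in lowered and ("warn" in lowered or "error" in lowered):
                      flagged.append((number, line.rstrip()))
          return flagged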
    - The directory structure of the submission must be as follows:
      .../company-name/system_1/proewildfire2/proe_result.txt
      .../company-name/system_1/proewildfire2/trail.txt (may be compressed)
      .../company-name/system_1/proewildfire2/runbench.bat (or runbench.csh, if modified, as required by rule c above)
      .../company-name/system_2/proewildfire2/proe_result.txt
      .../company-name/system_2/proewildfire2/trail.txt (may be compressed)
      .../company-name/system_2/proewildfire2/runbench.bat (or runbench.csh, if modified, as required by rule c above)
      etc...
    - The submission file must be named company_apc_proewildfire2_vN.zip or company_apc_proewildfire2_vN.tar.z, where company is the member company or organization name in lower case and vN is the file version (e.g., "sgi_apc_proewildfire2_v0.tar.z" and "intel_apc_proewildfire2_v0.zip"). The initial submission is v0. Resubmitted files must have the version number incremented.
  - UGS NX3
    - The benchmark must be run using build "3.0.3.2" of Unigraphics NX3. Submissions are only accepted on Windows XP Professional SP2 and later.
    - The script file "%ProgramFiles%\SPECapc\nx3bench\nx3runbench.bat" may be modified to change the directory where the benchmark files are located on the system being tested. If modified, the modified version must be included in the benchmark submission.
    - The application window must be maximized, and all toolbars and the resource bar must be turned off.
    - NX3's Fixed Frame Rate feature must be disabled.
    - The system's booted configuration as specified in the "boot.ini" file must contain the "/3GB" flag (an illustrative boot.ini line follows this item).
    - The chosen setting for View Frustum Culling must be documented in the Comments field of the "results.txt" file.
    - The "Disable Line Smooth" option under Preferences>Visualization Performance>General Settings must be unchecked.
    - All other application settings should be default.
    - The submission must contain the "nx3result.txt" file as well as the corresponding "nx3result.xls" file generated from running the benchmark. Both of these files may be found in the "results" directory after a successful run of the benchmark. The first section of the "nx3bench.xls" file must be edited to reflect the system configuration.
    - The directory structure of the submission must be as follows:
      .../company-name/system_1/nx3bench/nx3result.txt
      .../company-name/system_1/nx3bench/nx3bench.xls
      .../company-name/system_1/nx3bench/result.txt
      .../company-name/system_1/nx3bench/nx3bench.log
      .../company-name/system_1/nx3bench/nx3runbench.bat (if modified, as required by rule c above)
      .../company-name/system_2/nx3bench/nx3result.txt
      .../company-name/system_2/nx3bench/nx3bench.xls
      .../company-name/system_2/nx3bench/result.txt
      .../company-name/system_2/nx3bench/nx3bench.log
      .../company-name/system_2/nx3bench/nx3runbench.bat (if modified, as required by rule c above)
      etc...
    - The submission file must be named company_apc_nx3bench_vN.zip or company_apc_nx3bench_vN.tar.z, where company is the member company or organization name in lower case and vN is the file version (e.g., "sgi_apc_nx3bench_v0.tar.z" and "intel_apc_nx3bench_v0.zip"). The initial submission is v0. Resubmitted files must have the version number incremented.
  - 3ds Max 8
    - The benchmark must be run using 3ds Max version 8, service pack 3.
    - To ensure stability, the system must be booted with "/3GB" in the relevant line of "boot.ini" (see the illustrative boot.ini line in the UGS NX3 section above).
    - The 3ds Max 8 window must be maximized, filling the screen. The window must not be occluded by any other windows. Note: Task-bar auto-hide must be enabled.
    - Drivers and custom drivers must be configured (explicitly or implicitly) to use:
      - GL_TEXTURE_MIN_FILTER = GL_LINEAR_MIPMAP_LINEAR
      - GL_TEXTURE_MAG_FILTER = GL_LINEAR
      - Textures cannot be modified, e.g., reduced in size or depth.
      - Wireframe objects MUST be drawn using triangle strips. (Note: this differs from the previous version of the benchmark.)
    - The following Viewport parameters must be checked (i.e., enabled) in the 3ds Max 8 Customize->Preferences menu (Viewports tab):
      - Backface cull on object creation
      - Mask viewport to safe region
      - Update background while playing
      - Display world axis
    - The following Viewport parameters must NOT be checked in the 3ds Max 8 Customize->Preferences menu (Viewports tab):
      - Attenuate lights
      - Filter environment backgrounds
    - The benchmark can be run using either OpenGL or Direct3D. 3ds Max 8 supports shaders only for Direct3D; therefore, OpenGL submissions should contain a Shaders score of "N/A".
      - For OpenGL submissions, an application graphics driver or plug-in may be used, but the images captured from the benchmark must not differ from the supplied 3ds Max OpenGL driver's images by more than 5% (see the comparison sketch in the 3ds max 7 section above). If the application driver or plug-in's images do not match, the submitter may apply for a waiver based on documented errors in the 3ds Max OpenGL driver. The availability and support of an application driver or plug-in shall not affect the submission's "single supplier" or "parts built" status.
    - The submission must contain the files "result.txt" and "result.xls".
    - The directory structure of the submission must be as follows:
      .../company-name/system_1/3dsmax8/result.txt
      .../company-name/system_1/3dsmax8/result.xls
      .../company-name/system_2/3dsmax8/result.txt
      .../company-name/system_2/3dsmax8/result.xls
      etc...
    - The submission file must be named company_apc_3dsmax8_vN.zip, where company is the member company or organization name in lower case and vN is the file version (e.g., hp_apc_3dsmax8_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
  - Solid Edge V19
    - The benchmark must be run using Solid Edge V19.00.00.66 (SP0).
    - The application can be run with Tools->Options->View Application Display set to Graphics Card Driven (Basic or Advanced), Software Driven or Backing Store. The choice must be documented in the comments portion of the results.
    - Application settings must not be changed from the defaults set by the benchmark installation. Settings that must not be changed include, but are not limited to:
      - Culling
      - Wireframe display in Move Part command
      - Arc smoothness (3)
    - The submission must contain the file result.txt that is generated during the benchmark run.
    - The directory structure of the submission must be as follows:
      .../company-name/system_1/SEV19/result.txt
      .../company-name/system_2/SEV19/result.txt
    - The submission file must be named company_apc_SolidEdgeV19_vN.zip, where company is the member company or organization name in lower case and vN is the file version (e.g., hp_apc_SolidEdgeV19_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
  - SolidWorks 2007
    - The benchmark must be run using SolidWorks 2007 service pack 0.
    - The application window size must not be changed from its initial size.
    - The submission must contain a result.txt file generated by running the benchmark. Its contents are extracted from the file apcresultsN.txt, generated by the benchmark, where N is the number of the benchmark test run.
    - The submission will be derived from the best composite generated by the default 5 benchmark runs, as controlled and reported by the benchmark GUI.
    - The appearance of the application's Quick Tips / Dynamic Help box, or other pop-ups that do not dismiss themselves, will cause the benchmark run to be invalid. The benchmark FAQ has information on how to prevent this.
    - The directory structure of the submission must be as follows:
      .../company-name/system_1/sw2007/result.txt
      .../company-name/system_2/sw2007/result.txt
      etc...
    - The submission file must be named company_apc_sw2007_vN.zip, where company is the member company or organization name in lower case and vN is the file version (e.g., ibm_apc_sw2007_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.