The Application Performance Characterization Project Committee Rules
Version 2.13a
Last Updated:
07/01/2011
- Overview
- General Philosophy
- All
rules declared in "The Graphics
and Workstation Performance Group (SPEC/GWPG): Rules For Project
Groups" document (known hereafter as the GWPG Project
Groups Ruleset) apply, unless specifically overruled by a rule in
this document.
- General Philosophy
- Within
SPEC's Graphics and Workstation Performance Group (SPEC/GWPG)
there was a strong belief that it is important to benchmark
graphics performance based on actual applications.
Application-level benchmarks exist, but they are not standardized
and they do not cover a wide range of application areas. Thus, the
Application Performance Characterization (SPECapcSM)
Project was created within the GWPG to create a broad-ranging set
of standardized benchmarks for graphics-intensive applications.
- The SPECapc project seeks to develop benchmarks for generating accurate
application-level graphics performance measures in an open,
accessible and well-publicized manner.
- The SPECapc project wishes to contribute to the coherence of the field of
application performance measurement and evaluation so that vendors
will be better able to present well-defined performance measures
and customers will be better able to compare and evaluate vendors'
products and environments.
- The SPECapc project will provide formal beta benchmarks to members and final
benchmark releases to the public in a timely fashion.
- Hardware
and software used to run the SPECapc benchmarks must provide a
suitable environment for running typical (not just benchmark)
workloads for the applications in question.
- SPECapc
reserves the right to adapt its benchmarks as it deems necessary
to preserve its goal of fair and useful benchmarking (e.g. remove
benchmark, modify benchmark code or data, etc). If a change is
made to the suite, SPECapc will notify the appropriate parties
(i.e. SPECapc members and users of the benchmark) and SPECapc will
re-designate the benchmark by changing its name and/or version. In
the case that a benchmark is removed in whole or in part, SPECapc
reserves the right to republish in summary form "adapted"
results for previously published systems, converted to the new
metric. In the case of other changes, such a republication may
necessitate re-testing and may require support from the original
test sponsor.
- Overview of Optimizations
- SPECapc
is aware of the importance of optimizations in producing the best
system performance. SPECapc is also aware that it is sometimes
hard to draw an exact line between legitimate optimizations that
happen to benefit SPECapc benchmarks and optimizations that
specifically target SPECapc benchmarks. However, with the list
below, SPECapc wants to increase awareness of implementers and
end-users to issues of unwanted benchmark-specific optimizations
that would be incompatible with SPECapc's goal of fair
benchmarking.
- To
ensure that results are relevant to end-users, SPECapc expects
that the hardware and software implementations used for running
SPECapc benchmarks adhere to a set of general rules for
optimizations.
- General Rules for Optimization
- Optimizations
must generate correct images and results for the application under
test, for both the benchmark case and similar cases. Correct
images and results are those deemed by the majority of the SPECapc
electorate, potentially with input from the associated independent
software vendor (ISV) and/or end-users, to be sufficiently
adherent to the intent behind the application.
- Optimizations
must improve performance for a class of workloads where the class
of workloads must be larger than a single SPECapc benchmark or
SPECapc benchmark suite.
- For any
given optimization a system should generate correct images with
and without said optimization. An optimization should not reduce
system stability.
- The
vendor encourages the implementation for general use (not just for
running a single SPECapc benchmark or SPECapc benchmark suite).
- The
implementation is generally available, documented and supported by
the providing vendor.
- In the
case where it appears that the above guidelines have not been
followed, SPECapc may investigate such a claim and request that
the optimization in question (e.g. one using SPECapc
benchmark-specific pattern matching) be removed and the results
resubmitted. Or, SPECapc may request that the vendor correct the
deficiency (e.g. make the optimization more general purpose or
correct problems with image generation) before submitting results
based on the optimization.
- It is
expected that system vendors would endorse the general use of
these optimizations by customers who seek to achieve good
application performance.
- No
pre-computed (e.g. driver-cached) images, geometric data, or state
may be substituted within a SPECapc benchmark on the basis of
detecting that said benchmark is running (e.g. pattern matching of
command stream or recognition of benchmark's name).
- Benchmarks
- Benchmark Acceptance
- Benchmark components are defined as:
- a specific revision of an application,
- run rules, scripts and associated data sets.
- New or
modified benchmark components require a 2/3-majority vote to be
accepted for publication. Selection of datecode versions of a
specific revision of an application is by majority vote.
- A
minimum 3-week review period is required for new or significantly
modified benchmark components.
- At the
end of the review period a vote will be called to approve the
proposed changes.
- An
amendment to a benchmark component during the review period must
be unanimously accepted. If not, the review period shall be
restarted.
- Benchmark Code Versioning
- Benchmarks use the following version coding: M.m (e.g. SPECapcSM for
Pro/ENGINEER 20.0 v1.1), where M is the major release number and m is
the minor release number.
- The
major release number is only incremented when large amounts of
code are changed and the scripting language is dramatically
changed as a result -- backward compatibility is highly unlikely
when moving scripts or data sets between major releases (e.g.
running v2 scripts on a v3 executable would almost certainly
fail).
- The
minor release number is bumped if some small set of code is
replaced or removed - but the standard, unchanged scripts and data
sets, as a whole, must run on the new version (but perhaps with
different performance).
- When
there is a new major release of a benchmark, submissions using the
previous release will be accepted for at least one submission
cycle. When a superseded benchmark is retired, the associated
run-rules will be archived in an "archived benchmark
run-rules document" posted on the Previous SPECapc Benchmarks
web-page.
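The compatibility implication of the M.m scheme above can be sketched in a few lines; this function is purely illustrative and is not part of any SPECapc tooling:

```python
def scripts_compatible(script_version: str, executable_version: str) -> bool:
    """Per the M.m scheme, standard scripts and data sets are only
    expected to run on an executable with the same major release
    number; minor-release changes preserve compatibility."""
    return script_version.split(".")[0] == executable_version.split(".")[0]

# e.g. v2 scripts on a v3 executable would almost certainly fail:
print(scripts_compatible("2.0", "3.1"))  # False
print(scripts_compatible("1.1", "1.2"))  # True
```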
- Submission, Review and Publication
- General Benchmark Run Rules
- The
system under test must correctly perform all of the operations
being requested by the application during the benchmark.
- No
changes to any files associated with the benchmark are permitted
except as noted in the benchmark-specific rules (section 5 of
this document).
- The
entire display raster must be available for use by the application
being benchmarked.
- It is
not permissible to override the intended behavior of the
application through any means including, but not limited to,
registry settings or environment variables.
- No
interaction is allowed with the system under test during the
benchmark, unless required by the benchmark.
- The
system under test cannot skip frames during the benchmark run.
- It is
not permissible to change the system configuration during the
running of a given benchmark. That is, one can't power off the
system, make some changes, then power back on and run the rest of
the benchmark.
- Results
submitted must be obtained using the scripts, models, and
application revisions which are specified for that submission
cycle by the SPECapc.
- The color depth used must be at least 24 bits (true color), with
at least 8 bits of red, 8 bits of green and 8 bits of blue, unless
otherwise specified.
- The display raster resolution must be at least 1280 pixels by 1024
pixels, unless otherwise specified.
- The monitor refresh rate must be at least 75Hz. This requirement
does not apply to digital flat panel displays, unless otherwise
specified.
- The border width of the windows created during the benchmark shall
not exceed 10 pixels, unless otherwise specified.
- The
monitor used in the benchmark must support the stated resolution
and refresh rate.
- The
benchmark must successfully obtain all requested window sizes,
with no reduction or clipping of any benchmark-related windows.
Windows created by the benchmark must not be obscured on the
screen by anything other than other elements created by the
benchmark.
- Tests
may be run with or without a desktop/window manager if the
application allows this, but must be run on some native windowing
system.
- It is
not permissible to interact with the system under test to influence
the progress of the benchmark during execution.
- General Submission Content Rules
- The
submission upload file structures are defined in the
benchmark-specific section below.
- Submission Process Rules
- The
submission file names are detailed below under the
benchmark-specific rules.
- Review Period Rules
- Reviewers
will decide if the image quality and results of the submission are
sufficiently correct with respect to the intent of the ISV to
satisfy the intended end-users' expectations.
- SPECapc Benchmark Specific Rules and Procedures
- Pro/ENGINEER Wildfire 2.0
- The
benchmark must be run using build “M160” of
Pro/ENGINEER Wildfire 2.0. Submissions may be made with 32-bit or
64-bit versions of M160. However, submissions made with 32-bit
Pro/ENGINEER Wildfire 2.0 on Windows XP 64-bit Edition are
prohibited.
- The
config.pro file must be used as-is and may not be modified or
overridden.
- The
script files “utilities\runbench.bat” or
“utilities/runbench.csh” may be modified as necessary
to enable execution of the benchmark on the system being tested.
If modified, the modified version must be included in the
benchmark submission.
- The
submission must contain the “proe_result.txt” file (as
generated by the “proescore” program) as well as the
corresponding “trail.txt” file generated from running
the benchmark. Both of these files may be found in the "results"
directory after a successful run of the benchmark. The first
section of the “proe_result.txt” file must be edited
to reflect the system configuration.
- There
must be no license-related warnings in the “trail.txt”
file.
- The
directory structure of the submission must be as follows:
.../company-name/system_1/proewildfire2/proe_result.txt
.../company-name/system_1/proewildfire2/trail.txt (may be compressed)
.../company-name/system_1/proewildfire2/runbench.bat (or runbench.csh, if modified, as required by rule c above)
.../company-name/system_2/proewildfire2/proe_result.txt
.../company-name/system_2/proewildfire2/trail.txt (may be compressed)
.../company-name/system_2/proewildfire2/runbench.bat (or runbench.csh, if modified, as required by rule c above)
etc...
- The
submission file must be named company_apc_proewildfire2_vN.zip
or company_apc_proewildfire2_vN.tar.z, where company is the member
company or organization name in lower case and vN is the file version
(e.g. “sgi_apc_proewildfire2_v0.tar.z” and
“intel_apc_proewildfire2_v0.zip”). The initial
submission is v0. Resubmitted files must have the version number
incremented.
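The naming convention above can be checked mechanically. The following regex sketch is illustrative only (it assumes the company token contains lower-case letters, digits, or hyphens, which the rule does not spell out):

```python
import re

# Matches e.g. sgi_apc_proewildfire2_v0.tar.z or intel_apc_proewildfire2_v3.zip
PATTERN = re.compile(r"^[a-z][a-z0-9-]*_apc_proewildfire2_v(\d+)\.(zip|tar\.z)$")

def submission_version(filename: str):
    """Return the vN version number if the filename follows the
    submission naming rule, else None."""
    m = PATTERN.match(filename)
    return int(m.group(1)) if m else None

print(submission_version("sgi_apc_proewildfire2_v0.tar.z"))  # 0
print(submission_version("Intel_apc_proewildfire2_v0.zip"))  # None (not lower case)
```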
- Solid Edge V19
- The benchmark must be run using Solid Edge
V19.00.00.66 (SP0).
- The
application can be run with Tools->Options->View Application
Display set to Graphics Card Driven (Basic or Advanced), Software
Driven or Backing Store. The choice must be documented in the
comments portion of the results.
- Application
settings must not be changed from the defaults set by the
benchmark installation. Settings that must not be changed include,
but are not limited to:
- culling
- wireframe display in Move Part command
- arc smoothness (3)
- The
submission must contain the file result.txt that is generated
during the benchmark run.
- The
directory structure of the submission must be as
follows:
.../company-name/system_1/SEV19/result.txt
.../company-name/system_2/SEV19/result.txt
- The
submission file must be named company_apc_SolidEdgeV19_vN.zip
where company is the member company or organization name in
lower case and vN is the file version (e.g.
hp_apc_SolidEdgeV19_v0.zip.) The initial submission is v0.
Resubmitted files must have the version number incremented.
- SolidWorks 2007
- The
benchmark must be run using SolidWorks 2007 Service Pack 0.
- The
application window size must not be changed from its initial size.
- The
submission must contain a result.txt file generated by running the
benchmark. Its contents are extracted from the file
apcresultsN.txt, generated by the benchmark, where N is the number of the benchmark test run.
- The
submission will be derived from the best composite generated by
the default 5 benchmark runs, as controlled and reported by the
benchmark GUI.
- The
appearance of the application's Quick Tips / Dynamic Help box,
or other pop-ups that do not dismiss themselves, will cause the
benchmark run to be invalid. The benchmark FAQ has information on
how to prevent this.
- The
directory structure of the submission must be as
follows:
.../company-name/system_1/sw2007/result.txt
.../company-name/system_2/sw2007/result.txt
etc...
- The
submission file must be named company_apc_sw2007_vN.zip
where company is the member company or organization
name in lower case and vN is the file version (e.g.
ibm_apc_sw2007_v0.zip.) The initial submission is v0. Resubmitted
files must have the version number incremented.
- 3ds Max 9
- The benchmark must be run using Autodesk 3ds Max version 9 with
service pack SP2 applied.
- To
ensure stability, the system must be booted with “/3GB”
in the relevant line of “boot.ini”.
- The 3ds
Max 9 window must be maximized, filling the screen. The window
must not be occluded by any other windows. Note: Task-bar
auto-hide must be enabled.
- Drivers
and custom drivers must be configured (explicitly or implicitly)
to use:
- GL_TEXTURE_MIN_FILTER = GL_LINEAR_MIPMAP_LINEAR
- GL_TEXTURE_MAG_FILTER = GL_LINEAR
- Textures
cannot be modified, e.g. reduced in size or depth.
- Wireframe
objects must be drawn using triangle strips.
- The
following Viewport parameters must be checked (i.e. enabled) in
the Viewports Tab found in the 3ds Max 9 Customize->Preferences
menu (Viewports tab):
- Backface cull on object creation
- Mask viewport to safe region
- Update background while playing
- Display world axis
- The
following Viewport parameters must NOT be checked in the 3ds max 9
Customize->Preferences menu (Viewports tab):
- Attenuate lights
- Filter environment backgrounds
- The
benchmark can be run using either OpenGL or DirectX. 3ds Max 9
supports shaders only for DirectX; therefore, OpenGL submissions
should contain a Shaders score of "N/A".
- For
either OpenGL or DirectX submissions an application graphics
driver or plug-in may be used, but the images captured from the
benchmark must not differ from 3ds max's respective native OpenGL
or DirectX driver's images by more than 5%. If the application
driver's or plug-in’s images do not match, the submitter may
apply for a waiver based on documented errors in the respective
3ds max native driver. The availability and support of an
application driver or plug-in shall not affect the submission's
“single supplier” or “parts built” status.
- The
submission must contain the files “result.txt” and
“result.xls”.
- The
directory structure of the submission must be as
follows:
.../company-name/system_1/3dsmax9/result.txt
.../company-name/system_1/3dsmax9/result.xls
.../company-name/system_2/3dsmax9/result.txt
.../company-name/system_2/3dsmax9/result.xls
etc...
- The
submission file must be named company_apc_3dsmax9_vN.zip
where company is the member company or organization name in
lower case and vN is the file version (e.g.
hp_apc_3dsmax9_v0.zip.) The initial submission is v0. Resubmitted
files must have the version number incremented.
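The 5% image-difference limit in the rules above can be estimated with a per-pixel comparison. This sketch assumes "differ by more than 5%" means the fraction of pixel positions that are not identical; the precise metric, and whether per-channel tolerances apply, is decided by the committee:

```python
def percent_differing_pixels(img_a, img_b):
    """Percentage of pixel positions at which two equally-sized
    captured images (given as flat lists of RGB tuples) differ."""
    if len(img_a) != len(img_b):
        raise ValueError("images must have identical dimensions")
    differing = sum(1 for a, b in zip(img_a, img_b) if a != b)
    return 100.0 * differing / len(img_a)

# Hypothetical 2x2 captures: one pixel of four differs, i.e. 25%,
# which would exceed the 5% limit and require a waiver request.
native = [(0, 0, 0), (255, 255, 255), (10, 10, 10), (20, 20, 20)]
plugin = [(0, 0, 0), (255, 255, 255), (10, 10, 10), (20, 20, 30)]
print(percent_differing_pixels(native, plugin))  # 25.0
```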
- UGS NX4
- The benchmark must be run using build “Maintenance
Release 3” of Unigraphics NX4. Submissions are accepted on
Windows XP Professional SP2 and later, and on Windows XP 64-bit
Edition.
- The script file
“%ProgramFiles%\SPECapc\nx4mark\nx4mark.bat” may be
modified to change the directory where the benchmark files are
located on the system being tested. If modified, the modified
version must be included in the benchmark submission.
- The application window must be maximized and all toolbars
and the resource bar must be turned off.
- NX4's Fixed Frame Rate feature must be disabled.
- For 32-bit Windows, the system's booted configuration as
specified in the "boot.ini" file must contain the "/3GB"
flag.
- The chosen settings for "View Frustum Culling" and "Disable Translucency" must be
documented in the Comments field of the "result.txt"
file.
- The "Disable Line Smooth" option under
Preferences>Visualization Performance>General Settings must
be unchecked.
- All other application settings should be default.
- The submission must contain the “nx4result.txt”
file as well as the corresponding “nx4result.xls” file
generated from running the benchmark. Both of these files may be
found in the "results" directory after a successful run
of the benchmark. The first section of the “nx4result.txt”
file must be edited to reflect the system configuration.
- The directory structure of the submission must be as
follows:
.../company-name/system_1/nx4mark/nx4mark_results.xls
.../company-name/system_1/nx4mark/result.txt
.../company-name/system_1/nx4mark/nx4mark.bat
(if modified, as required by rule c above)
.../company-name/system_2/nx4mark/nx4mark_results.xls
.../company-name/system_2/nx4mark/result.txt
.../company-name/system_2/nx4mark/nx4mark.bat
(if modified, as required by rule c above)
etc...
- The submission file must be named
company_apc_nx4mark_vN.zip or company_apc_nx4mark_vN.tar.z where
company is the member company or organization name in lower case
and vN is the file version (e.g. “sun_apc_nx4mark_v0.tar.z”
and “intel_apc_nx4mark_v0.zip”.) The initial
submission is v0. Resubmitted files must have the version number
incremented.
- NewTek Lightwave 3D 9.6
- The benchmark must be run using NewTek LightWave 3D version 9.6.
- The display raster resolution must be at least 1920 pixels by
1200 pixels, unless the system has an integrated display which cannot
achieve this resolution (example: a notebook), in which case the system's
maximum possible resolution must be used.
- Submissions run on Microsoft Windows will be accepted for review. Submissions
run on Apple Mac OS X will not be accepted for review at this time.
- The submission must contain the files “result.txt” and “result.xls”.
- The directory structure of the submission must be as follows:
.../company-name/system_1/lightwave96/result.txt
.../company-name/system_1/lightwave96/result.xls
.../company-name/system_2/lightwave96/result.txt
.../company-name/system_2/lightwave96/result.xls
etc...
- The submission file must be named company_apc_lightwave96_vN.zip where company is the member company or organization name in lower case and vN is the
file version (e.g. dell_apc_lightwave96_v0.zip.) The initial submission is v0.
Resubmitted files must have the version number incremented.
- Autodesk Maya 2009
- The benchmark must be run using Autodesk Maya 2009 Service Pack 1a.
- Systems running 32-bit Microsoft Windows operating systems must be booted with the /3GB flag set.
- The display raster resolution must be at least 1920 pixels by 1200 pixels, unless the system has an integrated display which cannot achieve this resolution (example: a notebook), in which case the system's maximum possible resolution must be used.
- At this time, submissions run on Microsoft Windows will be accepted for review.
- The submission must contain the files “result.txt” and “result.xls”.
- The directory structure of the submission must be as follows:
.../company-name/system_1/maya2009/result.txt
.../company-name/system_1/maya2009/result.xls
.../company-name/system_2/maya2009/result.txt
.../company-name/system_2/maya2009/result.xls
etc...
- The submission file must be named company_apc_maya2009_vN.zip where company is the member company or organization name in lower case and vN is the file version (e.g. dell_apc_maya2009_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
- Autodesk 3ds max 2011
- The benchmark must be run using Autodesk 3ds Max 2011 with Service Pack 1 (SP1) applied. Do not install the 3ds Max Subscription Advantage Pack, as it will interfere with the operation and results of the benchmark.
- The default 3ds Max 2011 application settings must be used.
- The benchmark may only be run using DirectX.
- For DirectX submissions, an application graphics driver or plug-in may be used, but the images captured from the benchmark must not differ from 3ds Max's respective native DirectX driver's images by more than 5%. If the application driver's or plug-in's images do not match, the submitter may apply for a waiver based on documented errors in the respective 3ds Max native driver. The availability and support of an application driver or plug-in shall not affect the submission's “single supplier” or “parts built” status.
- The display resolution must be at least 1920 pixels by 1080 pixels. If the system has an integrated display which cannot achieve 1920x1080 resolution, e.g., a notebook, the system's maximum possible resolution must be used.
- It is suggested that the 3ds Max 2011 Professional benchmark be run with at least 16GB of RAM installed. It is recommended that the Personal version of the benchmark be run with 8GB of RAM installed.
- The 3ds Max Professional benchmark is only supported on systems running Microsoft Windows Win7 64-bit operating system.
- The 3ds Max Personal version of the benchmark is unsupported, but it is recommended for Win7 32-bit and 64-bit.
- The 3ds Max Personal version requires the 3GB flag for Win7 32-bit.
- Only submissions with the 3ds Max Professional version of the benchmark run on Microsoft Windows Win7-64 operating system will be accepted for review. For submissions, the official run of the benchmark must be executed, which includes 4 iterations of all tests in the benchmark.
- The 3ds Max 2011 window must be maximized, filling the screen. The window must not be occluded by any other windows. Task-bar auto-hide must be enabled. Any windows on top of the application may interfere with the performance.
- The submission must contain the files "results1.xml" through "results4.xml",
"specAPC.xlsm" and a completed "result.txt" file.
- The directory structure of the submission must be as follows:
.../company-name/system_1/3dsmax2011/result.txt
.../company-name/system_1/3dsmax2011/results1.xml
.../company-name/system_1/3dsmax2011/results2.xml
.../company-name/system_1/3dsmax2011/results3.xml
.../company-name/system_1/3dsmax2011/results4.xml
.../company-name/system_1/3dsmax2011/specAPC.xlsm
.../company-name/system_2/3dsmax2011/result.txt
.../company-name/system_2/3dsmax2011/results1.xml
.../company-name/system_2/3dsmax2011/results2.xml
.../company-name/system_2/3dsmax2011/results3.xml
.../company-name/system_2/3dsmax2011/results4.xml
.../company-name/system_2/3dsmax2011/specAPC.xlsm
etc...
- The submission file must be named company_apc_3dsmax2011_vN.zip where company is the member company or organization name in lower case and vN is the file version (e.g. dell_apc_3dsmax2011_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
Adoption
v2.13a: Adopted 30 June 2011 (adds Autodesk 3ds max 2011 rules and deletes retired benchmarks)
v2.12: Adopted 24 June 2009 (adds Autodesk Maya 2009 rules)
v2.11: Adopted 12 March 2009 (adds Lightwave 9.6 rules, updates NX4 rules, adds non-interaction rule)
v2.10: Adopted 13 September 2007 (reflects GPC-to-GWPG transition, and adds NX4 rules)
v2.06: Adopted 13 July 2007
v2.05: Adopted 18 April 2007
v2.04: Adopted 06 February 2007
v2.03: Adopted 20 July 2006
v2.02: Adopted 27 April 2006 for 01 May 2006 publication
v2.01: Adopted 26 January 2006
v2.00: Adopted 26 January 2006