The Application Performance
Characterization Project Committee Rules
Version 2.20
Last Updated: 12/11/2014
- Overview
- General
Philosophy
- All rules
declared in "The
Graphics and Workstation Performance Group (SPEC/GWPG): Rules For Project Groups" document (known
hereafter as the GWPG Project Groups Ruleset)
apply, unless specifically overruled by a rule in this document.
- Within
SPEC's Graphics and Workstation Performance Group (SPEC/GWPG) there was
a strong belief that it is important to benchmark graphics performance
based on actual applications. Application-level benchmarks exist, but
they are not standardized and they do not cover a wide range of
application areas. Thus, the Application Performance Characterization (SPECapcSM)
Project was created within the GWPG to create a broad-ranging set
of standardized benchmarks for graphics-intensive applications.
- SPECapc seeks to develop benchmarks that generate
accurate application-level graphics performance measures in an open,
accessible, and well-publicized manner.
- SPECapc aims to contribute to the coherence of the
field of application performance measurement and evaluation, so that
vendors will be better able to present well-defined performance measures
and customers will be better able to compare and evaluate vendors'
products and environments.
- The SPECapc will provide formal beta benchmarks to
members and final benchmark releases to the public in a timely fashion.
- Hardware
and software used to run the SPECapc
benchmarks must provide a suitable environment for running typical (not
just benchmark) workloads for the applications in question.
- SPECapc reserves the right to adapt its benchmarks
as it deems necessary to preserve its goal of fair and useful
benchmarking (e.g., removing a benchmark or modifying benchmark code or data). If a change is made to the suite, SPECapc will notify the appropriate parties (i.e., SPECapc members and users of the benchmark) and will re-designate the benchmark by changing
its name and/or version. If a benchmark is removed in
whole or in part, SPECapc reserves the right
to republish in summary form "adapted" results for previously
published systems, converted to the new metric. For other
changes, such a republication may necessitate re-testing and may require
support from the original test sponsor.
- Overview
of Optimizations
- SPECapc is aware of the importance of optimizations
in producing the best system performance. SPECapc
is also aware that it is sometimes hard to draw an exact line between
legitimate optimizations that happen to benefit SPECapc
benchmarks and optimizations that specifically target SPECapc benchmarks. However, with the list below, SPECapc wants to raise the awareness of implementers
and end-users regarding unwanted benchmark-specific optimizations
that would be incompatible with SPECapc's goal
of fair benchmarking.
- To ensure
that results are relevant to end-users, SPECapc
expects that the hardware and software implementations used for running SPECapc benchmarks adhere to a set of general rules
for optimizations.
- General
Rules for Optimization
- Optimizations
must generate correct images and results for the application under test,
for both the benchmark case and similar cases. Correct images and
results are those deemed by the majority of the SPECapc
electorate, potentially with input from the associated independent
software vendor (ISV) and/or end-users, to be sufficiently adherent to
the intent behind the application.
- Optimizations
must improve performance for a class of workloads where the class of
workloads must be larger than a single SPECapc
benchmark or SPECapc
benchmark suite.
- For
any given optimization a system should generate correct images with and
without said optimization. An optimization should not reduce system
stability.
- The
vendor encourages the implementation for general use (not just for
running a single SPECapc benchmark or SPECapc benchmark suite).
- The
implementation is generally available, documented and supported by the
providing vendor.
- In
the case where it appears that the above guidelines have not been
followed, SPECapc may investigate such a claim
and request that the optimization in question (e.g. one using SPECapc benchmark-specific pattern matching) be
removed and the results resubmitted. Or, SPECapc
may request that the vendor correct the deficiency (e.g. make the
optimization more general purpose or correct problems with image
generation) before submitting results based on the optimization.
- It
is expected that system vendors would endorse the general use of these
optimizations by customers who seek to achieve good application
performance.
- No
pre-computed (e.g. driver-cached) images, geometric data, or state may
be substituted within a SPECapc
benchmark on the basis of detecting that said benchmark is running (e.g.
pattern matching of command stream or recognition of benchmark's name).
- Benchmarks
- Benchmark
Acceptance
- Benchmark
components are defined as
- a specific
revision of an application, and
- run rules, scripts and associated data sets.
- New
or modified benchmark components require a 2/3-majority vote to be
accepted for publication. Selection of datecode
versions of a specific revision of an application is by majority vote.
- A
minimum 3-week review period is required for new or significantly
modified benchmark components.
- At
the end of the review period a vote will be called to approve the
proposed changes.
- An
amendment to a benchmark component during the review period must be
unanimously accepted. If not, the review period shall be restarted.
- Benchmark
Code Versioning
- Benchmarks
use the following version coding: M.m (e.g., SPECapcSM for Pro/ENGINEER 20.0 v1.1), where M
is the major release number and m is the minor release number.
- The
major release number is only incremented when large amounts of code are
changed and the scripting language is dramatically changed as a result
-- backward compatibility is highly unlikely when moving scripts or data
sets between major releases (e.g. running v2 scripts on a v3 executable
would almost certainly fail).
- The
minor release number is incremented when a small set of code is replaced or
removed -- but the standard, unchanged scripts and data sets, as a whole,
must still run on the new version (though perhaps with different performance).
- When
there is a new major release of a benchmark, submissions using the
previous release will be accepted for at least one submission cycle. When
a superseded benchmark is retired, the
associated run-rules will be archived in an "archived benchmark
run-rules document" posted on the Previous SPECapc
Benchmarks web-page.
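The M.m scheme above lends itself to a simple mechanical compatibility check. The sketch below is illustrative only -- the function names are ours, not part of any SPECapc tooling -- and encodes the rule that scripts and data sets are only expected to run within the same major release:

```python
import re

def parse_version(tag):
    """Parse an M.m benchmark version tag such as "v1.1" or "2.0"."""
    m = re.fullmatch(r"v?(\d+)\.(\d+)", tag)
    if not m:
        raise ValueError("not an M.m version tag: %r" % tag)
    return int(m.group(1)), int(m.group(2))

def scripts_compatible(script_ver, exe_ver):
    """Per the rules above, scripts and data sets are only expected to
    run on an executable from the same major release; a minor-release
    bump keeps the standard scripts runnable (performance may differ)."""
    return parse_version(script_ver)[0] == parse_version(exe_ver)[0]
```

For example, v2 scripts are expected to run on a v2.1 executable but almost certainly not on a v3 executable.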
- Submission,
Review and Publication
- General
Benchmark Run Rules
- The
system under test must correctly perform all of the operations being
requested by the application during the benchmark.
- No
changes to any files associated with the benchmark are permitted
except as noted in the benchmark-specific rules (section 5 of this
document).
- The
entire display raster must be available for use by the application being
benchmarked.
- It
is not permissible to override the intended behavior of the application
through any means including, but not limited to, registry settings or
environment variables.
- No
interaction is allowed with the system under test during the benchmark,
unless required by the benchmark.
- The
system under test cannot skip frames during
the benchmark run.
- It
is not permissible to change the system configuration during the running
of a given benchmark. That is, one cannot power off the system, make some
changes, then power back on and run the rest of the benchmark.
- Results
submitted must be obtained using the scripts, models, and application
revisions which are specified for that submission cycle by the SPECapc.
- The
color depth used must be at least 24 bits (true color), with at least 8
bits of red, 8 bits of green and 8 bits of blue, unless otherwise
specified.
- The
display raster resolution must be at least 1280 pixels by 1024 pixels,
unless otherwise specified.
- The
monitor refresh rate must be at least 75Hz. This requirement does not
apply to digital flat panel displays, unless otherwise specified.
- The
border width of the windows created during the benchmark shall not exceed
10 pixels, unless otherwise specified.
- The
monitor used in the benchmark must support the stated resolution and
refresh rate.
- The
benchmark must successfully obtain all requested window sizes, with no
reduction or clipping of any benchmark-related windows. Windows
created by the benchmark must not be obscured on the screen by anything
other than other elements created by the benchmark.
- Tests
may be run with or without a desktop/window manager if the application
allows this, but must be run on some native windowing system.
- It is not
permissible to interact with the system under test to influence the
progress of the benchmark during execution.
- General
Submission Content Rules
- The
submission upload file structures are defined in the benchmark-specific
section below.
- Submission
Process Rules
- The
submission file names are detailed below under the benchmark-specific
rules.
- Review
Period Rules
- Reviewers
will decide if the image quality and results of the submission are sufficiently
correct with respect to the intent of the ISV to satisfy the intended
end-users' expectations.
- SPECapc Benchmark-Specific Rules and
Procedures
- Autodesk
3ds Max 2015
- The
benchmark must be run using Autodesk 3ds Max 2015 with Service Pack 1
(SP1) applied. Do not install the 3ds Max Subscription Advantage Pack,
as it will interfere with the operation and results of the benchmark.
- The
default 3ds Max 2015 application settings must be used.
- The
benchmark may only be run using the Nitrous DX11 display driver.
- The
display resolution must be at least 1920 pixels by 1080 pixels. If the
system has an integrated display which cannot achieve 1920x1080
resolution, e.g., a notebook, the system’s maximum possible resolution
must be used.
- Submissions
can optionally be made at 4K resolution.
- Submissions
can optionally be made with AA modes greater than the default 0.
- This
benchmark is only supported on systems running the Microsoft Windows 7
64-bit operating system.
- The
3ds Max 2015 benchmark should be run with at least 16GB of system memory
installed.
- The
application window must be fully visible and not be occluded by any
other windows. Task-bar auto-hide must be enabled. Any window on top of
the application rendering window may interfere with performance. The
benchmark status window is an exception to this rule, provided it does not
intersect the graphics rendering window.
- The
results.txt file must be generated by GwpgSystemInfo.exe, found in the ./submissions directory.
- After
completion of the benchmark, the submitter should create populated
specAPC2015.xlsm and results.txt files by following these steps:
- Copy <3dsmax 2015
install directory>/maxtest/submissions/GwpgSystemInfo.exe
to the newly created TestResults folder.
- Copy <3dsmax 2015
install directory>/maxtest/submissions/specAPC2015.xlsm
to the newly created TestResults folder.
- Open specAPC2015.xlsm
with Microsoft Excel 2007 or later. In the spreadsheet, press the
Enable Content and Select Test Results Folder buttons.
- Create a results.txt by
running GwpgSystemInfo.exe
- Verify all prepopulated
values in results.txt.
- Copy the scores from
specAPC2015.xlsm Overview sheet into the results.txt file.
- Add appropriate values
to the Submitter fields.
- The
directory structure of the submission must be as follows:
.../company-name/system_1/3dsmax2015/results.txt
.../company-name/system_1/3dsmax2015/results1.xml
.../company-name/system_1/3dsmax2015/results2.xml
.../company-name/system_1/3dsmax2015/results3.xml
.../company-name/system_1/3dsmax2015/testrenders/*.jpeg
.../company-name/system_1/3dsmax2015/specAPC2015.xlsm
.../company-name/system_2/3dsmax2015/results.txt
.../company-name/system_2/3dsmax2015/results1.xml
.../company-name/system_2/3dsmax2015/results2.xml
.../company-name/system_2/3dsmax2015/results3.xml
.../company-name/system_2/3dsmax2015/testrenders/*.jpeg
.../company-name/system_2/3dsmax2015/specAPC2015.xlsm
etc....
- The
submission file must be named company_apc_3dsmax2015_vN.zip
where company is the member company or organization name in lower case
and vN is the file version (e.g.
dell_apc_3dsmax2015_v0.zip.) The initial submission is v0. Resubmitted
files must have the version number incremented.
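The naming rule can be checked mechanically. The regular expression below is our reading of the rule (a lower-case company token and a non-negative version number), offered as an illustration rather than an official validator:

```python
import re

# company_apc_3dsmax2015_vN.zip, e.g. dell_apc_3dsmax2015_v0.zip
SUBMISSION_RE = re.compile(r"^([a-z][a-z0-9-]*)_apc_3dsmax2015_v(\d+)\.zip$")

def check_submission_name(filename):
    """Return (company, version) for a valid name, or raise ValueError."""
    m = SUBMISSION_RE.fullmatch(filename)
    if not m:
        raise ValueError("invalid submission file name: %r" % filename)
    return m.group(1), int(m.group(2))
```

A resubmission simply increments the version number, so `dell_apc_3dsmax2015_v1.zip` would follow `dell_apc_3dsmax2015_v0.zip`.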
- Siemens
PLM NX 8.5
- The
benchmark must be run using Maintenance Release 8.5.1.3 of
Siemens PLM NX 8.5.
- The
NX 8.5 benchmark is only supported on systems running Microsoft Windows
7 64-bit operating system.
- The
display resolution must be at least 1920 pixels by 1080 pixels.
If the system has an integrated display which cannot achieve 1920x1080
resolution, then the system’s maximum possible resolution must be used.
Laptops, notebooks, and ultrabooks would be
examples of this type of system.
- The
batch file run_SPECapcNX85bmk.bat may only be modified to
change the location of the benchmark folder. If modified, the modified
version must be included in the benchmark submission.
- The
NX 8.5 application graphics window must be maximized, all toolbars
disabled, and history resource bar unpinned.
- The
NX 8.5 "Fixed Frame Rate" setting must be globally disabled.
- The
NX 8.5 "Translucency" setting is optional and should be
documented in the Comments field of the "Result.txt"
file.
- All
other NX 8.5 application settings should be the default values. See the
document SPECapc_NX8.5_BenchmarkSetup for the full details of
the benchmark setup.
- If
the NX 8.5 graphics visualization registry setting is altered, this must
be documented in the Comments field of the "Result.txt" file.
- Only
submissions run on Microsoft Windows 7 64-bit will be accepted for
review.
- The
submission must contain the "ResultsSummary.htm", "Result.txt",
"CompleteStats.csv", and "timer_data01-04.txt"
files generated from running the benchmark. These files may be found in
the "Results_DateTime" directory
after a successful run of the benchmark. The first section of the
"Result.txt" file should be edited to reflect the submitting
organization's name and URL and any relevant comments such as those
suggested above.
- The
directory structure of the zip file containing the SPECapc
NX 8.5 submission data must be as follows:
.../company-name/system_1/nx85/ResultsSummary.htm
.../company-name/system_1/nx85/gwpg_config.htm
.../company-name/system_1/nx85/Result.txt
.../company-name/system_1/nx85/CompleteStats.csv
.../company-name/system_1/nx85/timer_data01.txt
.../company-name/system_1/nx85/timer_data02.txt
.../company-name/system_1/nx85/timer_data03.txt
.../company-name/system_1/nx85/timer_data04.txt
.../company-name/system_1/nx85/run_SPECapcNX85bmk.bat (if
modified, as required by rule [d] above)
.../company-name/system_2/nx85/ResultsSummary.htm
.../company-name/system_2/nx85/gwpg_config.htm
.../company-name/system_2/nx85/Result.txt
.../company-name/system_2/nx85/CompleteStats.csv
.../company-name/system_2/nx85/timer_data01.txt
.../company-name/system_2/nx85/timer_data02.txt
.../company-name/system_2/nx85/timer_data03.txt
.../company-name/system_2/nx85/timer_data04.txt
.../company-name/system_2/nx85/run_SPECapcNX85bmk.bat (if
modified, as required by rule [d] above)
etc...
- The
submission file must be named company-name_apc_nx85_vN.zip
where company-name is the member company or organization name in lower
case and vN is the file version (e.g.
hp_apc_nx85_v0.zip.) The initial submission is v0. Resubmitted files
must have the version number incremented.
- Autodesk
Maya 2012
- The
benchmark must be run using Autodesk Maya 2012 Service Pack 2.
- The
Maya 2012 benchmark is only supported on systems running Microsoft
Windows 7 64-bit operating system.
- Only
submissions run on Microsoft Windows 7 64-bit will be accepted for
review.
- The
display resolution must be at least 1920 pixels by 1080 pixels. If the
system has an integrated display which cannot achieve 1920x1080
resolution, e.g., a notebook, the system's maximum possible resolution
must be used.
- The
submission must contain the files result.txt and maya2012APC.xls.
- The
directory structure of the submission must be as follows:
.../company-name/system_1/maya2012/result.txt
.../company-name/system_1/maya2012/maya2012APC.xls
.../company-name/system_2/maya2012/result.txt
.../company-name/system_2/maya2012/maya2012APC.xls
etc...
- The
submission file must be named company_apc_maya2012_vN.zip where company
is the member company or organization name in lower case and vN is the file version (e.g.
fjs_apc_maya2012_v0.zip.) The initial submission is v0. Resubmitted
files must have the version number incremented.
- Dassault SolidWorks
2013
- The
benchmark must be run using Dassault
SolidWorks 2013 Service Pack 1.
- The
SolidWorks 2013 benchmark is only supported on systems running Microsoft
Windows 7 64-bit operating system.
- Only
submissions run on Microsoft Windows 7 64-bit operating system and with
graphics cards fully supporting SolidWorks features RealView
and Ambient Occlusion will be accepted for review.
- The
graphics card used must fully support SolidWorks features RealView and Ambient Occlusion for comparable
results.
- The
display resolution must be at least 1920 pixels by 1080 pixels. If the
system has an integrated display which cannot achieve 1920x1080
resolution, e.g., a notebook, the system's maximum possible resolution
must be used.
- It
is recommended that the SolidWorks 2013 benchmark be run with at least
8GB of RAM installed.
- The RunSW2013Bench.bat,
SW2013Bench.dat and model files must be used as-is and may not be
modified or overridden.
- The config.txt
file must be edited to reflect the system configuration before running
the benchmark.
- The
benchmark must be run using the default application settings, including
those for Photoview 360.
- It
is recommended that the system be rebooted before running the benchmark,
and that nothing obscures the SolidWorks window, including taskbar,
application pop-ups, and other applications. Refrain from interacting
with the system while the benchmark is running.
- The
submission must contain the result.txt, SW2013BenchDetails.csv, and
SW2013BenchLog.txt files. These files may be found in the results
directory after a successful run of the benchmark.
- The
directory structure of the submission must be as follows:
.../company-name/system_1/solidworks2013/result.txt
.../company-name/system_1/solidworks2013/SW2013BenchDetails.csv
.../company-name/system_1/solidworks2013/SW2013BenchLog.txt
.../company-name/system_2/solidworks2013/result.txt
.../company-name/system_2/solidworks2013/SW2013BenchDetails.csv
.../company-name/system_2/solidworks2013/SW2013BenchLog.txt
etc...
- The
submission file must be named company_apc_solidworks2013_vN.zip where
company is the member company or organization name in lower case and vN is the file version (e.g. hp_apc_solidworks2013_v0.zip).
The initial submission is v0. Resubmitted files must have the version
number incremented.
- PTC
Creo 3.0
- The
benchmark must be run using Creo 3.0 build
M010.
- The Creo 3.0 benchmark is only supported on systems
running Microsoft Windows 7 64-bit operating system.
- Only
submissions run on Microsoft Windows 7 64-bit operating system and with
graphics cards supporting a minimum of 8X MSAA and Order Independent
Transparency will be accepted for review.
- The
display resolution must be at least 1920 pixels by 1080 pixels. If the
system has an integrated display which cannot achieve 1920x1080
resolution, e.g., a notebook, the system's maximum possible resolution
must be used.
- It
is recommended that the Creo 3.0 benchmark be
run with at least 8GB of RAM installed.
- The All_V25.txt,
config.pro and apc.dat files must be used as-is and may
not be modified or overridden.
- The
benchmark may not use any other config.pro, config.sup,
config.win, menu_def.pro,
or protk.dat files. These files must be removed from the application
install location and user folder before running the benchmark.
- The
graphics card used must support a minimum of 8X MSAA and Order
Independent Transparency for comparable results.
- The
script file utilities\runbench.bat may be modified to accommodate
an alternate install location for Creo 3.0. If
modified, the modified version of runbench.bat must be included
in the benchmark submission.
- The utilities\config.txt
file must be edited to reflect the system configuration before running
the benchmark.
- The
submission must contain the result.txt, trail.txt, score-steps.txt,
score.csv, apc.dat and *.tif
files. These files may be found in the results directory after a
successful run of the benchmark.
- There
must be no license-related warnings in the trail.txt file.
- The directory
structure of the submission must be as follows:
.../company-name/system_1/creo3/result.txt
.../company-name/system_1/creo3/trail.txt
.../company-name/system_1/creo3/score-steps.txt
.../company-name/system_1/creo3/score.csv
.../company-name/system_1/creo3/apc.dat
.../company-name/system_1/creo3/*.tif
.../company-name/system_1/creo3/runbench.bat (if modified, as
required by rule [i] above)
.../company-name/system_2/creo3/result.txt
.../company-name/system_2/creo3/trail.txt
.../company-name/system_2/creo3/score-steps.txt
.../company-name/system_2/creo3/score.csv
.../company-name/system_2/creo3/apc.dat
.../company-name/system_2/creo3/*.tif
.../company-name/system_2/creo3/runbench.bat (if modified, as
required by rule [i] above)
etc...
Adoption
v2.20: Adopted 11 Dec 2014 (adds Creo 3.0 and
deletes retired Creo 2.0 and Lightwave
9.6 benchmarks)
v2.19: Adopted 28 July 2014 (adds 3ds Max 2015 and deletes retired 3ds Max 2011
benchmark)
v2.17: Adopted 20 February 2013 (adds SolidWorks 2013 and deletes retired
SolidWorks 2007 benchmark)
v2.16: Adopted 6 August 2012 (PTC Creo 2 rule a.
change, adds Maya 2012 and deletes retired Maya 2009 benchmark)
v2.15: Adopted 22 May 2012 (adds PTC Creo 2 and
deletes retired WF2 benchmark)
v2.14: Adopted 30 March 2012 (adds Siemens NX 6 and deletes retired benchmark)
v2.13: Adopted 30 June 2011 (adds Autodesk 3ds max 2011 rules and deletes
retired benchmarks)
v2.12: Adopted 24 June 2009 (adds Autodesk Maya 2009 rules)
v2.11: Adopted 12 March 2009 (adds Lightwave 9.6
rules, updates NX4 rules, adds non-interaction rule)
v2.10: Adopted 13 September 2007 (reflects GPC-to-GWPG transition, and adds NX4
rules)
v2.06: Adopted 13 July 2007
v2.05: Adopted 18 April 2007
v2.04: Adopted 06 February 2007
v2.03: Adopted 20 July 2006
v2.02: Adopted 27 April 2006 for 01 May 2006 publication
v2.01: Adopted 26 January 2006
v2.00: Adopted 26 January 2006