The Application Performance Characterization Project
Committee Rules
Version 2.17
Last Updated: 2/20/2013
- Overview
- All rules declared in "The
Graphics and Workstation Performance Group (SPEC/GWPG): Rules For Project
Groups" document (known hereafter as the GWPG Project Groups Ruleset)
apply, unless specifically overruled by a rule in this document.
- General Philosophy
- Within SPEC's Graphics and Workstation
Performance Group (SPEC/GWPG) there was a strong belief that it is important to
benchmark graphics performance based on actual applications. Application-level
benchmarks exist, but they are not standardized and they do not cover a wide
range of application areas. Thus, the Application Performance Characterization
(SPECapc℠) Project was created within the GWPG to create a
broad-ranging set of standardized benchmarks for graphics-intensive
applications.
- The SPECapc seeks to develop benchmarks for
generating accurate application-level graphics performance measures in an open,
accessible and well-publicized manner.
- The SPECapc wishes to contribute to the
coherence of the field of application performance measurement and evaluation so
that vendors will be better able to present well-defined performance measures
and customers will be better able to compare and evaluate vendors' products and
environments.
- The SPECapc will provide formal beta benchmarks
to members and final benchmark releases to the public in a timely fashion.
- Hardware and software used to run the SPECapc
benchmarks must provide a suitable environment for running typical (not just
benchmark) workloads for the applications in question.
- SPECapc reserves the right to adapt its
benchmarks as it deems necessary to preserve its goal of fair and useful
benchmarking (e.g. removing a benchmark or modifying benchmark code or data). If a
change is made to the suite, SPECapc will notify the appropriate parties (i.e.
SPECapc members and users of the benchmark) and SPECapc will re-designate the
benchmark by changing its name and/or version. In the case that a benchmark is
removed in whole or in part, SPECapc reserves the right to republish in summary
form "adapted" results for previously published systems, converted to
the new metric. In the case of other changes, such a republication may
necessitate re-testing and may require support from the original test sponsor.
- Overview of Optimizations
- SPECapc is aware of the importance of
optimizations in producing the best system performance. SPECapc is also aware
that it is sometimes hard to draw an exact line between legitimate
optimizations that happen to benefit SPECapc benchmarks and optimizations that
specifically target SPECapc benchmarks. However, with the list below, SPECapc
wants to increase awareness of implementers and end-users to issues of unwanted
benchmark-specific optimizations that would be incompatible with SPECapc's goal
of fair benchmarking.
- To ensure that results are relevant to
end-users, SPECapc expects that the hardware and software implementations used
for running SPECapc benchmarks adhere to a set of general rules for
optimizations.
- General Rules for Optimization
- Optimizations must generate correct images and
results for the application under test, for both the benchmark case and similar
cases. Correct images and results are those deemed by the majority of the
SPECapc electorate, potentially with input from the associated independent
software vendor (ISV) and/or end-users, to be sufficiently adherent to the
intent behind the application.
- Optimizations must improve performance for a
class of workloads broader than a single SPECapc benchmark or SPECapc
benchmark suite.
- For any given optimization a system should
generate correct images with and without said optimization. An optimization
should not reduce system stability.
- The vendor must encourage general use of the
implementation (not just for running a single SPECapc benchmark or SPECapc
benchmark suite).
- The implementation is generally available,
documented and supported by the providing vendor.
- In the case where it appears that the above
guidelines have not been followed, SPECapc may investigate such a claim and
request that the optimization in question (e.g. one using SPECapc
benchmark-specific pattern matching) be removed and the results resubmitted.
Or, SPECapc may request that the vendor correct the deficiency (e.g. make the
optimization more general purpose or correct problems with image generation)
before submitting results based on the optimization.
- It is expected that system vendors would endorse
the general use of these optimizations by customers who seek to achieve good
application performance.
- No pre-computed (e.g. driver-cached) images, geometric
data, or state may be substituted within a SPECapc benchmark on the basis of
detecting that said benchmark is running (e.g. pattern matching of command
stream or recognition of benchmark's name).
- Benchmarks
- Benchmark Acceptance
- Benchmark components are defined as
- specific revision of an application,
- run rules, scripts and associated data
sets.
- New or modified benchmark components require a
2/3-majority vote to be accepted for publication. Selection of datecode
versions of a specific revision of an application is by majority vote.
- A minimum 3-week review period is required for
new or significantly modified benchmark components.
- At the end of the review period a vote will be
called to approve the proposed changes.
- An amendment to a benchmark component during the
review period must be unanimously accepted. If not, the review period shall be
restarted.
- Benchmark Code Versioning
- Benchmarks use the following version coding: M.m
(e.g. SPECapc℠ for Pro/ENGINEER 20.0 v1.1), where M is the major release
number and m is the minor release number.
- The major release number is only incremented
when large amounts of code are changed and the scripting language is
dramatically changed as a result -- backward compatibility is highly unlikely
when moving scripts or data sets between major releases (e.g. running v2
scripts on a v3 executable would almost certainly fail).
- The minor release number is incremented if some
small set of code is replaced or removed, but the standard, unchanged scripts
and data sets, as a whole, must run on the new version (perhaps with different
performance).
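The major/minor semantics above can be expressed as a small helper. This is an illustrative sketch only; the function names are hypothetical and not part of any SPECapc tooling.

```python
# Illustrative sketch of the M.m versioning semantics described above:
# a minor-version bump (same major release) keeps scripts and data sets
# compatible; a major-version bump does not.

def parse_version(tag: str) -> tuple[int, int]:
    """Parse a version tag such as 'v1.1' into (major, minor)."""
    major, minor = tag.lstrip("v").split(".")
    return int(major), int(minor)

def scripts_compatible(script_ver: str, executable_ver: str) -> bool:
    """Scripts are expected to run only on executables with the same
    major release number (e.g. v2 scripts on a v3 executable would fail)."""
    return parse_version(script_ver)[0] == parse_version(executable_ver)[0]
```

For example, `scripts_compatible("v2.0", "v2.3")` is true, while `scripts_compatible("v2.0", "v3.0")` is false, matching the rule that backward compatibility across major releases is highly unlikely.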
- When there is a new major release of a
benchmark, submissions using the previous release will be accepted for at least
one submission cycle. When a superseded benchmark is retired, the associated
run-rules will be archived in an "archived benchmark run-rules
document" posted on the Previous SPECapc Benchmarks web-page.
- Submission, Review and Publication
- General Benchmark Run Rules
- The system under test must correctly perform all
of the operations being requested by the application during the benchmark.
- No changes to any files associated with the
benchmark are permitted except as noted in the benchmark-specific rules
(section 5 of this document).
- The entire display raster must be available for
use by the application being benchmarked.
- It is not permissible to override the intended
behavior of the application through any means including, but not limited to,
registry settings or environment variables.
- No interaction is allowed with the system under
test during the benchmark, unless required by the benchmark.
- The system under test cannot skip frames during
the benchmark run.
- It is not permissible to change the system
configuration during the running of a given benchmark. That is, one can't power
off the system, make some changes, then power back on and run the rest of the
benchmark.
- Results submitted must be obtained using the
scripts, models, and application revisions which are specified for that
submission cycle by the SPECapc.
- The color depth used must be at least 24 bits
(true color), with at least 8 bits of red, 8 bits of green and 8 bits of blue,
unless otherwise specified.
- The display raster resolution must be at least
1280 pixels by 1024 pixels, unless otherwise specified.
- The monitor refresh rate must be at least 75Hz.
This requirement does not apply to digital flat panel displays, unless
otherwise specified.
- The border width of the windows created during
the benchmark shall not exceed 10 pixels, unless otherwise specified.
- The monitor used in the benchmark must support
the stated resolution and refresh rate.
- The benchmark must successfully obtain all
requested window sizes, with no reduction or clipping of any benchmark-related
windows. Windows created by the benchmark must not be obscured on the
screen by anything other than other elements created by the benchmark.
- Tests may be run with or without a
desktop/window manager if the application allows this, but must be run on some
native windowing system.
- It is not permissible to interact with the
system under test to influence the progress of the benchmark during execution.
- General Submission Content Rules
- The submission upload file structures are
defined in the benchmark-specific section below.
- Submission Process Rules
- The submission file names are detailed below
under the benchmark-specific rules.
- Review Period Rules
- Reviewers will decide if the image quality and
results of the submission are sufficiently correct with respect to the intent
of the ISV to satisfy the intended end-users' expectations.
- SPECapc Benchmark Specific Rules and
Procedures
- NewTek Lightwave
3D 9.6
- The benchmark must be run using NewTek LightWave
3D version 9.6.
- The display raster resolution must be at least
1920 pixels by 1200 pixels, unless the system has an integrated display which
cannot achieve this resolution (example: a notebook), in which case the
system's maximum possible resolution must be used.
- Submissions run on Microsoft Windows will be
accepted for review. Submissions run on Apple Mac OS X will not be accepted for
review at this time.
- The submission must contain the files
"result.txt" and "result.xls".
- The directory structure of the submission must
be as follows:
.../company-name/system_1/lightwave96/result.txt
.../company-name/system_1/lightwave96/result.xls
.../company-name/system_2/lightwave96/result.txt
.../company-name/system_2/lightwave96/result.xls
etc...
- The submission file must be named company_apc_lightwave96_vN.zip
where company is the member company or organization name in lower case
and vN is the file version (e.g. dell_apc_lightwave96_v0.zip.) The initial
submission is v0. Resubmitted files must have the version number incremented.
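The naming convention above (which recurs, with a different benchmark token, in each benchmark-specific section below) can be sketched programmatically. The helper name is hypothetical, assuming only the stated pattern.

```python
def submission_name(company: str, benchmark: str, version: int = 0) -> str:
    """Build a SPECapc submission file name following the stated pattern,
    e.g. dell_apc_lightwave96_v0.zip.

    The company/organization name must be lower case; the version starts
    at 0 and is incremented on each resubmission."""
    return f"{company.lower()}_apc_{benchmark}_v{version}.zip"
```

For example, `submission_name("Dell", "lightwave96")` yields `dell_apc_lightwave96_v0.zip`, and a first resubmission would use `version=1`.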
- Autodesk 3ds max
2011
- The benchmark must be run using Autodesk 3ds Max
2011 with Service Pack 1 (SP1) applied. Do not install the 3ds Max Subscription
Advantage Pack, as it will interfere with the operation and results of the
benchmark.
- The default 3ds Max 2011 application settings
must be used.
- The benchmark may only be run using DirectX.
- For DirectX submissions, an application graphics
driver or plug-in may be used, but the images captured from the benchmark must
not differ from 3ds Max's respective native DirectX driver's images by more
than 5%. If the application driver's or plug-in's images do not match, the
submitter may apply for a waiver based on documented errors in the respective
3ds Max native driver. The availability and support of an application driver or
plug-in shall not affect the submission's single
supplier or multiple supplier status.
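The rules do not specify how the 5% image difference is measured. Purely as an illustration, one plausible per-pixel reading could be sketched as follows; the function names and the choice of metric are assumptions, not the committee's definition.

```python
# Hedged sketch: one plausible interpretation of the "differ by more
# than 5%" rule, counting the fraction of pixels that differ at all.
# Images are modeled as flat sequences of (r, g, b) tuples.

def percent_differing_pixels(img_a, img_b) -> float:
    """Percentage of pixel positions at which the two images differ."""
    if len(img_a) != len(img_b):
        raise ValueError("images must have the same dimensions")
    differing = sum(1 for a, b in zip(img_a, img_b) if a != b)
    return 100.0 * differing / len(img_a)

def within_tolerance(img_a, img_b, limit: float = 5.0) -> bool:
    """True if the captured image is within the stated 5% tolerance."""
    return percent_differing_pixels(img_a, img_b) <= limit
```

Under this reading, an image where 4 of 100 pixels differ from the native driver's output would pass, while 6 differing pixels would not.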
- The display resolution must be at least 1920
pixels by 1080 pixels. If the system has an integrated display which cannot
achieve 1920x1080 resolution, e.g., a notebook, the system's maximum possible
resolution must be used.
- It is suggested that the 3ds Max 2011
Professional benchmark be run with at least 16GB of RAM installed. It is
recommended that the Personal version of the benchmark be run with 8GB of RAM
installed.
- The 3ds Max Professional benchmark is only
supported on systems running the Microsoft Windows 7 64-bit operating system.
- The 3ds Max Personal version of the benchmark is
unsupported; running it on Windows 7 32-bit or 64-bit is recommended.
- The 3ds Max Personal version requires the 3GB
flag on Windows 7 32-bit.
- Only submissions with the 3ds Max Professional version
of the benchmark run on the Microsoft Windows 7 64-bit operating system will be
accepted for review. For submissions, the official run of the benchmark must be
executed, which includes 4 iterations of all tests in the benchmark.
- The 3ds Max 2011 window must be maximized,
filling the screen. The window must not be occluded by any other windows.
Task-bar auto-hide must be enabled. Any windows on top of the application may
interfere with the performance.
- The submission must contain the files results1.xml through results4.xml and specAPC.xlsm.
- The directory structure of the submission must
be as follows:
.../company-name/system_1/3dsmax2011/results1.xml
.../company-name/system_1/3dsmax2011/results2.xml
.../company-name/system_1/3dsmax2011/results3.xml
.../company-name/system_1/3dsmax2011/results4.xml
.../company-name/system_1/3dsmax2011/specAPC.xlsm
.../company-name/system_2/3dsmax2011/results1.xml
.../company-name/system_2/3dsmax2011/results2.xml
.../company-name/system_2/3dsmax2011/results3.xml
.../company-name/system_2/3dsmax2011/results4.xml
.../company-name/system_2/3dsmax2011/specAPC.xlsm
etc...
- The submission file must be named
company_apc_3dsmax2011_vN.zip where company is the member company or
organization name in lower case and vN is the file version (e.g.
dell_apc_3dsmax2011_v0.zip.) The initial submission is v0. Resubmitted files
must have the version number incremented.
- Siemens PLM NX 6
- The benchmark must be run using build
Maintenance Release 6.0.5.3 of Siemens PLM NX 6.
- The NX 6 benchmark is only supported on systems
running Microsoft Windows 7 64-bit operating system.
- The display resolution must be at least 1920
pixels by 1080 pixels. If the system has an integrated display which cannot
achieve 1920x1080 resolution, e.g., a notebook, the system's maximum possible
resolution must be used.
- The batch file run_SPECapcNX6bmk.bat may only be
modified to change the location of the benchmark folder. If modified, the
modified version must be included in the benchmark submission.
- The NX 6 application window must be maximized
and all toolbars, the resource bar, and history resource bar must be turned
off.
- The NX 6 Fixed Frame Rate setting must be
globally disabled.
- The NX 6 Disable Translucency setting is
optional and must be documented in the Comments field of the result.txt file.
- The NX 6 Disable Plane Translucency, Backface
Culling, and Disable Line Antialiasing settings under
Preferences > Visualization Performance > General Graphics
must be unchecked.
- All other NX 6 application settings should be
default. See the document, SPECapc_NX6_BenchmarkSetup, for the full details of
the benchmark setup.
- If the NX 6 graphics visualization registry
setting is altered, this must be documented in the Comments field of the Result.txt file.
- Submissions run on Microsoft Windows 7 64-bit
will be accepted for review.
- The submission must contain the ResultsSummary.htm, Result.txt, CompleteStats.csv, FinalScores.csv and timer_data01-05.txt files generated from
running the benchmark. These files may be found in the Results directory after a successful run
of the benchmark. The first section of the Result.txt file
must be edited to reflect the system configuration, or the config.txt file
must be edited before the benchmark is run.
- The directory structure of the submission must
be as follows:
.../company-name/system_1/nx6/ResultsSummary.htm
.../company-name/system_1/nx6/Result.txt
.../company-name/system_1/nx6/CompleteStats.csv
.../company-name/system_1/nx6/FinalScores.csv
.../company-name/system_1/nx6/timer_data01.txt
.../company-name/system_1/nx6/timer_data02.txt
.../company-name/system_1/nx6/timer_data03.txt
.../company-name/system_1/nx6/timer_data04.txt
.../company-name/system_1/nx6/timer_data05.txt
.../company-name/system_1/nx6/run_SPECapcNX6bmk.bat (if modified,
as required by rule d above)
.../company-name/system_2/nx6/ResultsSummary.htm
.../company-name/system_2/nx6/Result.txt
.../company-name/system_2/nx6/CompleteStats.csv
.../company-name/system_2/nx6/FinalScores.csv
.../company-name/system_2/nx6/timer_data01.txt
.../company-name/system_2/nx6/timer_data02.txt
.../company-name/system_2/nx6/timer_data03.txt
.../company-name/system_2/nx6/timer_data04.txt
.../company-name/system_2/nx6/timer_data05.txt
.../company-name/system_2/nx6/run_SPECapcNX6bmk.bat (if modified,
as required by rule d above)
etc...
- The submission file must be named
company-name_apc_nx6_vN.zip where company-name is the member company or
organization name in lower case and vN is the file version (e.g.
hp_apc_nx6_v0.zip.) The initial submission is v0. Resubmitted files must have
the version number incremented.
- PTC Creo 2.0
- The benchmark must be
run using build F000 or F001 of Creo 2.0.
- The Creo 2.0 benchmark is only supported on
systems running Microsoft Windows 7 64-bit operating system.
- Only submissions run on Microsoft Windows 7
64-bit operating system and with graphics cards supporting a minimum of 8X MSAA
will be accepted for review.
- The display resolution must be at least 1920
pixels by 1080 pixels. If the system has an integrated display which cannot
achieve 1920x1080 resolution, e.g., a notebook, the system's maximum possible
resolution must be used.
- It is recommended that the Creo 2.0 benchmark be
run with at least 8GB of RAM installed.
- The All.txt, config.pro and apc.dat files must be used as-is and may
not be modified or overridden.
- The benchmark may not
use any other config.pro, config.sup, config.win, menu_def.pro,
or protk.dat files. These files must
be removed from the application install location and user folder before running
the benchmark.
- The graphics card used must support a minimum of
8X MSAA for comparable results.
- The script file utilities\runbench.bat may be modified to
accommodate an alternate install location for Creo 2.0. If modified, the
modified version of runbench.bat must
be included in the benchmark submission.
- The utilities\config.txt file must be edited
to reflect the system configuration before running the benchmark.
- The submission must
contain the result.txt, trail.txt, score-steps.txt, and apc.dat files. These files may be found in the results directory after a successful run of the benchmark.
- There must be no
license-related warnings in the trail.txt file.
- The directory
structure of the submission must be as follows:
.../company-name/system_1/creo2/result.txt
.../company-name/system_1/creo2/trail.txt
.../company-name/system_1/creo2/score-steps.txt
.../company-name/system_1/creo2/apc.dat
.../company-name/system_1/creo2/runbench.bat (if modified, as
required by rule i above)
.../company-name/system_2/creo2/result.txt
.../company-name/system_2/creo2/trail.txt
.../company-name/system_2/creo2/score-steps.txt
.../company-name/system_2/creo2/apc.dat
.../company-name/system_2/creo2/runbench.bat (if modified, as
required by rule i above)
etc...
- The submission file must be named
company-name_apc_creo2_vN.zip where company-name is the member company or
organization name in lower case and vN is the file version (e.g. dell_apc_creo2_v0.zip.)
The initial submission is v0. Resubmitted files must have the version number
incremented.
- Autodesk Maya
2012
- The benchmark must be run using Autodesk Maya
2012 Service Pack 2.
- The Maya 2012 benchmark is only supported on
systems running Microsoft Windows 7 64-bit operating system.
- Only submissions run on Microsoft Windows 7
64-bit will be accepted for review.
- The display resolution must be at least 1920
pixels by 1080 pixels. If the system has an integrated display which cannot
achieve 1920x1080 resolution, e.g., a notebook, the system's maximum possible
resolution must be used.
- The submission must contain the files result.txt and maya2012APC.xls.
- The directory structure of the submission must
be as follows:
.../company-name/system_1/maya2012/result.txt
.../company-name/system_1/maya2012/maya2012APC.xls
.../company-name/system_2/maya2012/result.txt
.../company-name/system_2/maya2012/maya2012APC.xls
etc...
- The submission file must be named
company_apc_maya2012_vN.zip where company is the member company or organization
name in lower case and vN is the file version (e.g. fjs_apc_maya2012_v0.zip.)
The initial submission is v0. Resubmitted files must have the version number
incremented.
- SolidWorks 2013
- The benchmark must be run using Dassault SolidWorks 2013 Service Pack 1.
- The SolidWorks 2013 benchmark is only supported on systems running Microsoft Windows 7 64-bit operating system.
- Only submissions run on Microsoft Windows 7 64-bit operating system and with graphics cards fully supporting SolidWorks features RealView and Ambient Occlusion will be accepted for review.
- The graphics card used must fully support SolidWorks features RealView and Ambient Occlusion for comparable results.
- The display resolution must be at least 1920 pixels by 1080 pixels. If the system has an integrated display which cannot achieve 1920x1080 resolution, e.g., a notebook, the system's maximum possible resolution must be used.
- It is recommended that the SolidWorks 2013 benchmark be run with at least 8GB of RAM installed.
- The RunSW2013Bench.bat, SW2013Bench.dat and model files must be used as-is and may not be modified or overridden.
- The config.txt file must be edited to reflect the system configuration before running the benchmark.
- The benchmark must be run using the default application settings, including those for Photoview 360.
- It is recommended that the system be rebooted before running the benchmark, and that nothing obscures the SolidWorks window, including taskbar, application pop-ups, and other applications. Refrain from interacting with the system while the benchmark is running.
- The submission must contain the result.txt, SW2013BenchDetails.csv, and SW2013BenchLog.txt files. These files may be found in the results directory after a successful run of the benchmark.
- The directory structure of the submission must
be as follows:
.../company-name/system_1/solidworks2013/result.txt
.../company-name/system_1/solidworks2013/SW2013BenchDetails.csv
.../company-name/system_1/solidworks2013/SW2013BenchLog.txt
.../company-name/system_2/solidworks2013/result.txt
.../company-name/system_2/solidworks2013/SW2013BenchDetails.csv
.../company-name/system_2/solidworks2013/SW2013BenchLog.txt
etc...
- The submission file must be named company_apc_solidworks2013_vN.zip where company is the member company or organization name in lower case and vN is the file version (e.g. hp_apc_solidworks2013_v0.zip). The initial submission is v0. Resubmitted files must have the version number incremented.
- Adoption
v2.17: Adopted 20 February 2013 (adds SolidWorks 2013 and
deletes retired SolidWorks 2007 benchmark)
v2.16: Adopted 6 August 2012 (PTC Creo 2 rule a. change, adds Maya 2012 and
deletes retired Maya 2009 benchmark)
v2.15: Adopted 22 May 2012 (adds PTC Creo 2 and deletes retired WF2
benchmark)
v2.14: Adopted 30 March 2012 (adds Siemens NX 6 and deletes retired benchmark)
v2.13: Adopted 30 June 2011 (adds Autodesk 3ds max 2011 rules and deletes
retired benchmarks)
v2.12: Adopted 24 June 2009 (adds Autodesk Maya 2009 rules)
v2.11: Adopted 12 March 2009 (adds Lightwave 9.6 rules, updates NX4 rules, adds
non-interaction rule)
v2.10: Adopted 13 September 2007 (reflects GPC-to-GWPG transition, and adds NX4
rules)
v2.06: Adopted 13 July 2007
v2.05: Adopted 18 April 2007
v2.04: Adopted 06 February 2007
v2.03: Adopted 20 July 2006
v2.02: Adopted 27 April 2006 for 01 May 2006 publication
v2.01: Adopted 26 January 2006
v2.00: Adopted 26 January 2006