SPEC Fair Use Rules
Updated 4 December 2024 [view the change history]
Introduction
Consistency and fairness are guiding principles for SPEC. To help ensure that these principles are upheld, any organization or individual who makes public use of SPEC benchmark results must meet the following requirements.
Section I lists general requirements that apply to public use of all SPEC benchmarks. Section II lists additional specific requirements for individual benchmarks.
This document is intended to provide the information needed for compliance with Fair Use; in the event of any inconsistency, this document takes precedence over the fair use requirements in individual benchmark run rules.
I. General Requirements For Public Use of All SPEC Benchmark Results
I.A. Requirements List
- Compliance. Claimed results must be compliant with that benchmark's rules. See definition: compliant result. (Certain Exceptions may apply.)
- Data Sources
- Source(s) must be stated for quoted SPEC results.
- Such sources must be publicly available, from SPEC or elsewhere.
- The licensee (the entity responsible for the result) must be clearly identifiable from the source.
- The date that the data was retrieved must be stated.
- The SPEC web site (http://www.spec.org) or a suitable sub page must be noted as a resource for additional information about the benchmark.
- Clear and correct, as of a specific date
Statements regarding SPEC, its benchmarks, and results published by SPEC must be clear and correct.
A claim must state a date as of which data was retrieved.
A claim may compare newly announced compliant results vs. data retrieved earlier.
There is no requirement to update a claim when later results are published.
For example, an Acme web page dated 28 January 2011 announces performance results for the Model A and claims "the best SPECweb® 2009 benchmark performance when compared vs. results published at www.spec.org as of 26 January 2011". If SPEC publishes better results on 1 February, there is no requirement to update the page.
- Trademarks
Reference must be made to the SPEC trademark. Such reference may be included in a notes section with other trademark references (SPEC trademarks are listed at http://www.spec.org/spec/trademarks.html).
SPEC's trademarks may not be used to mislabel something that is not a SPEC metric.
For example, suppose that a Gaming Society compares performance using a composite of a weighted subset of the SPEC CPU 2006 benchmark plus a weighted subset of the SPECviewperf 11 benchmark, and calls its composite "GamePerfMark". The composite, weighting, and subsetting are done by the Society, not by SPEC. The composite may be useful and interesting, but it may not be represented as a SPEC metric. It would be a Fair Use violation to reference it as "SPECgame".
Required Metrics. In the tables below, some benchmarks have Required Metrics. Public statements must include these.
Comparisons. It is fair to compare compliant results to other compliant results. Enabling such comparisons is a core reason why the SPEC benchmarks exist. Each benchmark product has workloads, software tools, run rules, and review processes that are intended to improve the technical credibility and relevance of such comparisons.
When comparisons are made,
- SPEC metrics may be compared only to SPEC metrics.
- The basis for comparison must be stated.
- Results of one benchmark are not allowed to be compared to a different benchmark (e.g. SPECjAppServer2004 to TPC-C; or SPECvirt_sc2010 to SPECweb 2005).
- Results of a benchmark may not be compared to a different major release of the same benchmark (e.g. SPECweb 2005 to SPECweb 2009). Exception: normalized historical comparisons may be made as described under Retired Benchmarks.
- Comparisons of non-compliant numbers. The comparison of non-compliant numbers to compliant results is restricted to certain exceptional cases described later in this Fair Use Rule (Academic/Research usage; Estimates, for those benchmarks that allow estimates; Normalized Historical Comparisons). Where allowed, comparisons that include non-compliant numbers must not be misleading or deceptive as to compliance. It must be clear from the context of the comparison which numbers are compliant and which are not.
I.B. Generic Example
This example for a generic SPEC benchmark illustrates the points above. See also the examples for specific benchmarks below, for additional requirements that may apply.
Example: New York, NY, January 28, 2011: Acme Corporation announces that the Model A achieves 100 for the SPECgeneric 2011 benchmark, a new record among systems running Linux [1].
[1] Comparison based on best performing systems using the Linux operating system published at www.spec.org as of 26 January 2011. SPEC® and the benchmark name SPECgeneric® are registered trademarks of the Standard Performance Evaluation Corporation. For more information about the SPECgeneric 2011 benchmark, see www.spec.org/generic2011/.
I.C. Compliance Exceptions
Exceptions regarding the compliance requirement are described in this section.
- Academic/research usage. SPEC encourages use of its benchmarks in research and academic contexts, on the grounds that SPEC benchmarks represent important characteristics of real world applications and therefore research innovations measured with SPEC benchmarks may benefit real users. SPEC understands that academic use of the SPEC benchmarks may be seen as enhancing the credibility of both the researcher and SPEC.
Research use of SPEC benchmarks may not be able to meet the compliance requirement.
Examples: (1) Testing is done with a simulator rather than real hardware. (2) The software innovation is not generally available or is not of product quality. (3) The SPEC test harness is modified without approval of SPEC.
SPEC has an interest in protecting the integrity of the SPEC metrics, including consistency of methods of measurement and the meaning of the units of measure that are defined by SPEC benchmarks. It would be unfair to those who do meet the compliance requirements if non-compliant numbers were misrepresented as compliant results. Therefore, SPEC recommends that researchers consider using the SPEC workload, but not call the measurements by the SPEC metric name.
The requirements for Fair Use in academic/research contexts are:
- It is a Fair Use violation to imply, to the reasonable reader, that a non-compliant number is a compliant result.
- Non-compliance must be clearly disclosed. If the SPEC metric name is used, it is recommended that (nc), for non-compliant, be added after each mention of the metric name. It is understood that there may be other ways to accomplish this in context, for example adding words such as "experimental" or "simulated" or "estimated" or "non-compliant".
- Diagrams, Tables, and Abstracts (which, often, are excerpted and used separately) must have sufficient context on their own so that they are not misleading as to compliance.
- If non-compliant numbers are compared to compliant results, it must be clear from the context which is which.
Example: The Acme Corporation Model A achieves SPECint®2006 100 in testing published at www.spec.org. Our Research Compiler improves the same hardware to SPECint®2006 125(nc). The notation (nc), for non-compliant, is used because our compiler does not meet SPEC's requirements for general availability.
Other Fair Use Requirements Still Apply. This section discusses an exception to only the compliance requirement from the Requirements List. Fair Use in academic/research context must still meet the other requirements, including but not limited to making correct use of SPEC results with dated citations of sources.
- Estimates. Some SPEC benchmarks allow estimates, as shown in the tables below. For those benchmarks only, it is acceptable to compare estimates to compliant results, provided that:
- Estimates must be clearly identified as such.
- Each use of a SPEC metric as an estimate must be clearly marked as an estimate.
- If estimates are used in graphs, the word "estimated" or "est." must be plainly visible within the graph, for example in the title, the scale, the legend, or next to each individual number that is estimated.
- Licensees are encouraged to give a rationale or methodology for any estimates, together with other information that may help the reader assess the accuracy of the estimate.
Example 1: The Acme Corporation Model A achieves SPECint®2006 100 in testing published at www.spec.org. The Bugle Corporation Model B will nearly double that performance to SPECint®2006 198(est). The notation (est), for estimated, is used because SPECint®2006 was run on pre-production hardware. Customer systems, planned for Q4, are expected to be similar.
Example 2: Performance estimates are modeled using the cycle simulator GrokSim Mark IV. It is likely that actual hardware, if built, would include significant differences.
I.D. Derived Values
It is sometimes useful to define a numeric unit that includes a SPEC metric plus other information, and then use the new number to compare systems. This is called a Derived Value.
Examples:
SPECint®_rate2006 per chip
SPECvirt_sc®2010 per gigabyte
Note: the examples above are not intended to imply that all derived values use ratios of the form above. The definition is intentionally broad and includes additional examples.
Derived values are acceptable, provided that they follow this Fair Use rule, including but not limited to using compliant results, listing sources for SPEC result data, and including any required metrics.
A derived value must not be represented as a SPEC metric. The context must not give the appearance that SPEC has created or endorsed the derived value. In particular, it is a Fair Use violation, and may be a Trademark violation, to form a new word that looks like a SPEC metric name when there is no such metric.
Not Acceptable:
SPECint®_chiprate2006
SPECvirt_sc®2010gigs
If a derived value is used as the basis of an estimate, the estimate must be correctly labeled. A derived value may introduce seeming opportunities to extrapolate beyond measured data. For example, if 4 different systems all have the same ratio of SPECwhatever per chip, it can be tempting to estimate that another, unmeasured, system will have the same ratio. This may be a very good estimate; but it is still an estimate, and must be correctly labeled. If used in public, it must be for a benchmark that allows estimates.
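The arithmetic behind a derived value can be sketched as follows. This is a hypothetical illustration, not actual SPEC result data; the system names, scores, and chip counts are invented for the example.

```python
# Hypothetical example: computing the derived value "SPECint_rate2006 per chip"
# from published compliant results. All numbers below are invented.
systems = [
    {"name": "Model A", "specint_rate2006": 400.0, "chips": 4},
    {"name": "Model B", "specint_rate2006": 180.0, "chips": 2},
]

for s in systems:
    per_chip = s["specint_rate2006"] / s["chips"]
    # Label the derived value descriptively; do not coin a SPEC-like
    # metric name (e.g. "SPECint_chiprate2006" would be a violation).
    print(f'{s["name"]}: SPECint_rate2006 per chip = {per_chip:.1f}')
```

Note that the output labels the number as "per chip" alongside the real metric name, rather than presenting it as a new SPEC metric.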
I.E. Non-SPEC Information
A basis for comparison or a derived value may use information from both SPEC and non-SPEC sources.
SPEC values truthfulness and clarity at all times:
When information from SPEC sources is used in public, SPEC requires that such information be reported correctly (per section I.A.3).
SPEC recommends that non-SPEC material should be accurate, relevant, and not misleading. Data and methods should be explained and substantiated.
Disclaimer. SPEC is not responsible for non-SPEC information. The SPEC Fair Use rule is limited to the information derived from SPEC sources. (Other rules may apply to the non-SPEC information, such as industry business standards, ethics, or Truth in Advertising law.)
SPEC may point out non-SPEC content. SPEC reserves the right to publicly comment to distinguish SPEC information from non-SPEC information.
Integrity of results and trademarks. The non-SPEC information must not be presented in a manner that may reasonably lead the reader to untrue conclusions about SPEC, its results, or its trademarks.
Examples
Example 1 (basis): ACME Corporation claims the best SPECjEnterprise 2010 benchmark performance for systems available as (example 1a) rack mount, or (1b) with more than 8 disk device slots, or (1c) with Art Deco paint. Bugle Corporation asserts that the basis of comparison is irrelevant or confusing or silly. Bugle may be correct. Nevertheless, such irrelevance, confusion, or silliness would not alone be enough to constitute a SPEC Fair Use violation.
Example 2 (derived value): ACME claims that its model A has better SPECint®_rate2006 per unit of cooling requirement than does the Bugle Model B. SPEC is not responsible for judging thermal characteristics.
Example 3: ACME claims the "best SPECmpi®M_2007 performance among industry-leading servers". This claim violates the requirement that the basis must be clear.
Example 4: ACME computes SPECint®_rate2006 per unit of cooling, but inexplicably selects SPECint®_rate_base2006 for some systems and SPECint®_rate2006 for others. The computation violates the requirement that the SPEC information must be accurate, and may also violate the requirement that a claim should not lead the reasonable reader to untrue conclusions about SPEC's results.
I.F. Retired Benchmarks
Disclosure. If public claims are made using a retired benchmark, with compliant results that have not been previously reviewed and accepted by SPEC, then the fact that the benchmark has been retired and new results are no longer being accepted for review and publication by SPEC must be plainly disclosed.
Example: The Acme Corporation Model A achieves a score of 527 SPECjvm 98. Note: SPECjvm 98 has been retired and SPEC is no longer reviewing or publishing results with that benchmark. We are providing this result as a comparison to older hardware that may still be in use at some customer sites.
Benchmarks that require review. Some benchmarks require that SPEC review and accept results prior to public use. For such benchmarks, the review process is not available after benchmark retirement, and therefore no new results may be published.
- Normalized historical comparisons. When SPEC releases a new major version of a benchmark, the SPEC metrics are generally not comparable to the previous version, and there is no formula for converting from one to the other. Nevertheless, SPEC recognizes that there is value in historical comparisons, which are typically done by normalizing performance across current and one or more generations of retired benchmarks, using systems that have been measured with both the older and newer benchmarks as the bridges for the normalization. Historical comparisons are inherently approximate because picking differing 'bridge' systems may yield differing ratios and because an older workload exercises different system capabilities than a more modern workload.
Normalized historical comparisons are acceptable only if their inherently approximate nature is not misrepresented. At minimum:
- It must not be claimed that SPEC metrics for one benchmark generation are precisely comparable to metrics from another generation.
- The approximate nature must be apparent from the context.
For example, a graph shown briefly in a presentation is labelled "Normalized Historic Trends for SPEC<benchmark>". As another example, in a white paper (where the expectation is for greater detail than presentations), the author explicitly calls out that workloads have differed over time, and explains how numbers are calculated.
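The bridge-system normalization described above can be sketched numerically. This is a hypothetical illustration under invented numbers: the bridge system's two scores, the legacy system names, and their scores are all assumptions for the example, not SPEC data.

```python
# Hypothetical sketch of a normalized historical comparison. The "bridge"
# system was measured on both the retired and the current benchmark
# generation; its ratio rescales old-generation scores. Numbers are invented.
bridge_old = 50.0   # bridge system's score on the retired benchmark
bridge_new = 20.0   # same system's score on the current benchmark
scale = bridge_new / bridge_old

old_scores = {"Legacy Model X": 30.0, "Legacy Model Y": 45.0}

for name, old in old_scores.items():
    normalized = old * scale
    # Any rescaled number is inherently approximate and must be labeled
    # as an estimate, never as a compliant current-generation result.
    print(f"{name}: ~{normalized:.1f} (est., normalized via bridge system)")
```

Picking a different bridge system would generally yield a different `scale`, which is precisely why such comparisons must be presented as approximate.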
II. Requirements for Public Use of Individual Benchmark Results
For further detail about the meaning of SPEC metrics, the individual benchmark run rules may be consulted. The benchmark names at the top of each table are links to that benchmark's run rules.
SPEC ACCEL® benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | |
Required Metrics | None |
Conditionally Required Metrics | For an individual benchmark, if a result other than the median is mentioned, then the median from the same set must also be mentioned. |
Use of Estimates | Estimates are allowed if clearly identified. Power measurement metrics are not allowed to be estimated. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for 3ds Max 9 benchmark
RETIRED | The SPECapc for 3ds Max 9 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Rendering Composite, Graphics Composite, Shaders Composite |
Required Metrics | Rendering Composite, Graphics Composite |
Conditionally Required Metrics | Shaders composite if using DirectX mode |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for 3dsmax 2015 benchmark
RETIRED | The SPECapc for 3dsmax 2015 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite, Graphics Composite, Large Model Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECapc® for 3dsmax 2020 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite, GPU Composite, Large Model Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECapc® for LightWave 3D v9.6 benchmark
RETIRED | The SPECapc for LightWave 3D V9.6 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Interactive Composite, Render Composite, Multitask Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECapc® for Maya 2009 benchmark
RETIRED | The SPECapc for Maya 2009 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics, CPU, I/O, Overall Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECapc® for Maya 2012 benchmark
RETIRED | The SPECapc for Maya 2012 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics Composite, CPU Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for Maya 2017 benchmark
RETIRED | The SPECapc for Maya 2017 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics Interactive Composite, Graphics Animation Composite, Graphics GPGPU Composite, CPU Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECapc® for Maya 2023 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite, GPU Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for Pro/ENGINEER Wildfire 2.0 benchmark
RETIRED | The SPECapc for Pro/ENGINEER Wildfire 2.0 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics Wireframe, Graphics Shaded, CPU, I/O, File Time, Overall Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for PTC Creo 2.0 benchmark
RETIRED | The SPECapc for PTC Creo 2.0 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics Composite, CPU Composite, I/O Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for PTC Creo 3.0 benchmark
RETIRED | The SPECapc for PTC Creo 3.0 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics Composite, CPU Composite, I/O Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECapc® for PTC Creo 9 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite, GPU Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECapc® for Siemens NX 9 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite, Graphics Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECapc® for Siemens NX 10 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite, Graphics Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for Solid Edge V19 benchmark
RETIRED | The SPECapc for Solid Edge V19 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite, Graphics Composite, File I/O Composite, Overall Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for Solidworks 2007 benchmark
RETIRED | The SPECapc for Solidworks 2007 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Intensive, File I/O Intensive, Graphics Composite, Overall Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for Solidworks 2013 benchmark
RETIRED | The SPECapc for Solidworks 2013 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics Composite, CPU Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for Solidworks 2015 benchmark
RETIRED | The SPECapc for Solidworks 2015 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics Composite, CPU Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for Solidworks 2017 benchmark
RETIRED | The SPECapc for Solidworks 2017 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics Composite – FSAA Disabled, Graphics Composite – FSAA Enabled, CPU Composite, Shaded Graphics Sub-Composite, Shaded with Edges Graphics Sub-Composite, Shaded using RealView Graphics Sub-Composite, Shaded with Edges using RealView Graphics Sub-Composite, Shaded using RealView and Shadows Graphics Sub-Composite, Shaded with Edges using RealView and Shadows Graphics Sub-Composite, Shaded using RealView and Shadows and Ambient Occlusion Graphics Sub-Composite, Shaded with Edges using RealView and Shadows and Ambient Occlusion Graphics Sub-Composite, Wireframe Graphics Sub-Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for Solidworks 2019 benchmark
RETIRED | The SPECapc for Solidworks 2019 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics Composite – FSAA Disabled, Graphics Composite – FSAA Enabled, CPU Composite, CPU Raytrace Composite, CPU Rebuild Composite, Shaded Graphics Sub-Composite, Shaded with Edges Graphics Sub-Composite, Shaded using RealView Graphics Sub-Composite, Shaded with Edges using RealView Graphics Sub-Composite, Shaded using RealView and Shadows Graphics Sub-Composite, Shaded with Edges using RealView and Shadows Graphics Sub-Composite, Shaded using RealView and Shadows and Ambient Occlusion Graphics Sub-Composite, Shaded with Edges using RealView and Shadows and Ambient Occlusion Graphics Sub-Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for Solidworks 2020 benchmark
RETIRED | The SPECapc for Solidworks 2020 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics Composite – FSAA Disabled, Graphics Composite – FSAA Enabled, CPU Composite, CPU Raytrace Composite, CPU Rebuild Composite, CPU Convert Composite, CPU Simulate Composite, Shaded Graphics Sub-Composite, Shaded with Edges Graphics Sub-Composite, Shaded using RealView and Shadows and Ambient Occlusion Graphics Sub-Composite, Shaded with Edges using RealView and Shadows and Ambient Occlusion Graphics Sub-Composite, Drawing Sub-Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for Solidworks 2021 benchmark
RETIRED | The SPECapc for Solidworks 2021 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | Graphics Composite – FSAA Disabled, Graphics Composite – FSAA Enabled, CPU Composite, CPU Raytrace Composite, CPU Rebuild Composite, CPU Convert Composite, CPU Simulate Composite, Shaded Graphics Sub-Composite, Shaded with Edges Graphics Sub-Composite, Shaded using RealView and Shadows and Ambient Occlusion Graphics Sub-Composite, Shaded with Edges using RealView and Shadows and Ambient Occlusion Graphics Sub-Composite, Drawing Sub-Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECapc® for Solidworks 2022 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite Score, GPU Composite Score, GPU Shaded SubTest, GPU Shaded with Edges SubTest, GPU Shaded RealView SubTest, GPU Shaded RealView With Edges SubTest, GPU Drawing SubTest, CPU Raytrace SubTest, CPU Rebuild SubTest, CPU Convert SubTest, CPU Simulate SubTest, CPU Mass Properties SubTest, CPU Boolean SubTest |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECapc® for Solidworks 2024 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite Score, GPU Composite Score, GPU Shaded SubTest, GPU Shaded with Edges SubTest, GPU Hidden Line Removal SubTest, GPU Shaded RealView SubTest, GPU Shaded RealView With Edges SubTest, GPU Drawing SubTest, CPU Raytrace SubTest, CPU Rebuild SubTest, CPU Convert SubTest, CPU Simulate SubTest, CPU Mass Properties SubTest, CPU Boolean SubTest |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for UGS NX4 benchmark
RETIRED | The SPECapc for UGS NX4 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite, File I/O Composite, Graphics Composite, Overall Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for UGS NX6 benchmark
RETIRED | The SPECapc for UGS NX6 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite, File I/O Composite, Graphics Composite, Overall Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECapc® for UGS NX8.5 benchmark
RETIRED | The SPECapc for UGS NX8.5 benchmark has been retired. |
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics | CPU Composite, File I/O Composite, Graphics Composite, Overall Composite |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPEC Cloud® IaaS 2016 benchmark
RETIRED | The SPEC Cloud IaaS 2016 benchmark was retired on March 6, 2019 in favor of its successor, the SPEC Cloud® IaaS 2018 benchmark.
|
SPEC.org Submission Requirements | Results must be reviewed and accepted by SPEC prior to public disclosure. |
SPEC Metrics |
|
Required Metrics |
The required metric must be listed in close proximity to any other measured data from the disclosure or any derived value. |
Conditionally Required Metrics | The Elasticity Start and End Times must be reported when any of the cloud's resources are not under complete control of the tester. The Test Region(s) must be listed in close proximity to the Elasticity Start and End times for the test. |
Use of Estimates | Not allowed. |
Disallowed Comparisons | In addition to the requirements that results not be compared to other benchmarks: 1. The SPEC Cloud IaaS 2016 benchmark uses specific versions of the Yahoo! Cloud Serving Benchmark (YCSB) and K-Means clustering workload from the HiBench Suite as its component workloads as these are established industry-standard workloads. These workloads are run with very specific parameterized constraints specific to this SPEC benchmark to focus on stressing particular aspects of the SUT's resources typical of Cloud IaaS environments. As such, the differences are significant enough that comparisons between the results generated by SPEC Cloud IaaS 2016 benchmark and the original component workloads are not allowed. |
SPEC Cloud® IaaS 2018 benchmark
SPEC.org Submission Requirements | Results must be reviewed and accepted by SPEC prior to public disclosure. |
SPEC Metrics |
|
Required Metrics |
The required metric must be listed in close proximity to any other measured data from the disclosure or any derived value. |
Conditionally Required Metrics | The Scale-out Start and End Times must be reported when any of the cloud's resources are not under complete control of the tester. The Test Region(s) must be listed in close proximity to the Scale-out Start and End times for the test. |
Use of Estimates | Not allowed. |
Disallowed Comparisons | In addition to the requirements that results not be compared to other benchmarks: 1. The SPEC Cloud IaaS 2018 benchmark uses specific versions of the Yahoo! Cloud Serving Benchmark (YCSB) and K-Means clustering workload from the HiBench Suite as its component workloads as these are established industry-standard workloads. These workloads are run with very specific parameterized constraints specific to this SPEC benchmark to focus on stressing particular aspects of the SUT's resources typical of Cloud IaaS environments. As such, the differences are significant enough that comparisons between the results generated by SPEC Cloud IaaS 2018 benchmark and the original component workloads are not allowed. 2. SPEC Cloud IaaS 2018 metrics are not comparable to SPEC Cloud IaaS 2016 due to changes to workload parameters and metric methodology. |
(Retired) SPEC CPU® 2000 benchmark
RETIRED | The SPEC CPU 2000 benchmark has been retired.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | If a run time other than the median is quoted, then the median must also be quoted. |
Use of Estimates |
|
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPEC CPU® 2006 benchmark
RETIRED | The SPEC CPU 2006 benchmark has been retired.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | If a run time other than the median is quoted, then the median must also be quoted. |
Use of Estimates |
|
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPEC CPU® 2017 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | The baseline performance metric for whichever suite is reported. |
Conditionally Required Metrics |
|
Use of Estimates |
|
Disallowed Comparisons | Energy metrics generated with releases prior to SPEC CPU 2017 v1.1 are not comparable. |
SPEC HPC™ 2021 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | For an individual benchmark, if a result other than the median is mentioned, then the median from the same set must also be mentioned. |
Other Required Information | CPU description (number of chips and cores), Accelerator description (if used), Parallel Model(s) used, and degree of parallelism (MPI ranks, host threads). |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECjAppserver® 2004 benchmark
RETIRED | The SPECjAppServer 2004 benchmark was retired on November 30, 2010.
|
SPEC.org Submission Requirements | Results must be reviewed and accepted by SPEC prior to public disclosure. |
SPEC Metrics | SPECjAppServer®2004 JOPS@Category |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | In addition to the requirements that results not be compared to other benchmarks: Results between different categories (see the run rules section on Standard Vs. Distributed) within SPECjAppServer 2004 may not be compared. |
(RETIRED) SPECjbb® 2005 benchmark
RETIRED | The SPECjbb 2005 benchmark was retired on October 1, 2013 in favor of its successor, the SPECjbb 2013 benchmark.
|
SPEC.org Submission Requirements | Vendors may publish compliant results independently, provided that the vendor's first use of the input.expected_peak_warehouse property is reviewed by the subcommittee to determine compliance with run rules section 2.3. Future publications by the vendor using input.expected_peak_warehouse do not require review unless the technical reason for setting the flag differs from what was previously accepted by the subcommittee. |
SPEC Metrics | SPECjbb®2005 bops, SPECjbb®2005 bops/JVM |
Required Metrics | The number of JVMs used in the benchmark must be stated. Both throughput metrics (SPECjbb®2005 bops and SPECjbb®2005 bops/JVM) must be stated. |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(RETIRED) SPECjbb® 2013 benchmark
RETIRED | The SPECjbb 2013 benchmark was retired on December 9, 2014.
|
SPEC.org Submission Requirements | Vendors may publish compliant results independently, provided that the run does not produce any warning or invalid messages and all run and reporting rules are followed. Any result that has warnings may be used only once it has been accepted by the OSGjava subcommittee. |
SPEC Metrics | SPECjbb®2013-<category> max-jOPS and SPECjbb®2013-<category> critical-jOPS, where <category>: [Composite / MultiJVM / Distributed] |
Required Metrics | Both metrics SPECjbb®2013-<category> max-jOPS and SPECjbb®2013-<category> critical-jOPS must be stated in proximity. |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | In addition to the requirements that results not be compared to other benchmarks:
|
SPECjbb® 2015 benchmark
SPEC.org Submission Requirements | Vendors may publish compliant results independently, provided that the run does not produce any warning or invalid messages and all run and reporting rules are followed. Any result that has warnings may be used only once it has been accepted by the OSGjava subcommittee. |
SPEC Metrics | SPECjbb®2015-<category> max-jOPS and SPECjbb®2015-<category> critical-jOPS where <category>: [Composite / MultiJVM / Distributed] |
Required Metrics | Both metrics SPECjbb®2015-<category> max-jOPS and SPECjbb®2015-<category> critical-jOPS must be stated in proximity. |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | In addition to the requirements that results not be compared to other benchmarks:
|
SPECjEnterprise® 2010 benchmark
SPEC.org Submission Requirements | Results must be reviewed and accepted by SPEC prior to public disclosure. |
SPEC Metrics | SPECjEnterprise®2010 EjOPS |
Required Metrics | SPECjEnterprise®2010 EjOPS |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECjEnterprise® 2018 Web Profile benchmark
SPEC.org Submission Requirements | Results must be reviewed and accepted by SPEC prior to public disclosure. |
SPEC Metrics | SPECjEnterprise®2018 WebjOps |
Required Metrics | SPECjEnterprise2018 WebjOps |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPEC JMS® 2007 benchmark
SPEC.org Submission Requirements | Results must be reviewed and accepted by SPEC prior to public disclosure. |
SPEC Metrics | SPECjms®2007@Category |
Required Metrics | SPECjms®2007@Category |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECjvm® 2008 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently. |
SPEC Metrics | SPECjvm®2008 Base ops/m and SPECjvm®2008 Peak ops/m |
Required Metrics | SPECjvm®2008 Base ops/m and SPECjvm®2008 Peak ops/m |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECmail® 2001 benchmark
RETIRED | The SPECmail 2001 benchmark was retired on October 31, 2011.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently. |
SPEC Metrics | SPECmail®2001 and SPECmail®2001_users |
Required Metrics | |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECmail® 2009 benchmark
RETIRED | The SPECmail 2009 benchmark was retired on October 31, 2011.
|
SPEC.org Submission Requirements | By location, as defined below |
SPEC Metrics | SPECmail®_Ent2009, SPECmail®_Ent2009Secure |
Required Metrics | |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPEC MPI® 2007 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | For an individual benchmark, if a result other than the median is mentioned, then the median from the same set must also be mentioned. |
Other Required Information | CPU description (number of chips and cores), and degree of parallelism (MPI ranks). |
Use of Estimates |
|
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPEC OMP® 2001 benchmark
RETIRED | The SPEC OMP 2001 benchmark was retired on January 16, 2013.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | None. |
Conditionally Required Metrics | For an individual benchmark, if a result other than the median is mentioned, then the median from the same set must also be mentioned. |
Other Required Information | CPU description (number of chips and cores), and degree of parallelism (OpenMP threads). |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPEC OMP® 2012 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | For an individual benchmark, if a result other than the median is mentioned, then the median from the same set must also be mentioned. |
Other Required Information | CPU description (number of chips and cores), and degree of parallelism (OpenMP threads). |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECpower_ssj® 2008 benchmark
SPEC.org Submission Requirements | By location, as defined below |
SPEC Metrics |
|
Required Metrics | SPECpower_ssj®2008 overall ssj_ops/watt. The required metric must be listed in close proximity to any other measured data from the disclosure or any derived value. |
Conditionally Required Metrics |
|
Examples |
|
Use of Estimates | Not allowed. |
Disallowed Comparisons | In addition to the requirements that results not be compared to other benchmarks:
|
(Retired) SPEC SFS® 2008 benchmark
RETIRED | The SPEC SFS 2008 benchmark was retired on May 12, 2015.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | Peak SPEC SFS®2008_nfs or SFS®2008_cifs Ops/sec and the ORT (overall response time) |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPEC SFS® 2014 benchmark
RETIRED | The SPEC SFS 2014 benchmark was retired on December 31, 2021.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | Peak SPECsfs®2014_database #Databases, SPECsfs®2014_vdi #Desktops, SPECsfs®2014_vda #Streams, SPECsfs®2014_swbuild #Builds, or SPECsfs®2014_eda #JobSets, together with the ORT (overall response time) |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. SPEC SFS2014 results for different workloads may not be compared. |
SPECstorage® Solution 2020 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | Peak and ORT for: - SPECstorage® Solution 2020_ai_image JOBS & ORT or - SPECstorage® Solution 2020_genomics JOBS & ORT or - SPECstorage® Solution 2020_vda STREAMS & ORT or - SPECstorage® Solution 2020_swbuild BUILDS & ORT or - SPECstorage® Solution 2020_eda_blended JOBS & ORT |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. SPECstorage Solution 2020 results for different workloads may not be compared. |
(Retired) SPEC Sip_Infrastructure® 2011 benchmark
RETIRED | The SPEC Sip_Infrastructure 2011 benchmark was retired on May 31, 2015.
|
SPEC.org Submission Requirements | By location, as defined below |
SPEC Metrics | SPECsip_Infrastructure®2011 Supported Subscribers |
Required Metrics | SPECsip_Infrastructure®2011 Supported Subscribers |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECviewperf® 11 benchmark
RETIRED | The SPECviewperf 11 benchmark has been retired.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | |
Conditionally Required Metrics | |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECviewperf® 12 (12.0, 12.01, 12.02) benchmark
RETIRED | The SPECviewperf 12 benchmark (versions 12.0, 12.01, 12.02) has been retired.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | |
Conditionally Required Metrics | |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECviewperf® 12.1 benchmark
RETIRED | The SPECviewperf 12.1 benchmark has been retired.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. |
SPEC Metrics |
|
Required Metrics | |
Conditionally Required Metrics | |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECviewperf® 13 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECviewperf® 13 Linux Edition benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECviewperf® 2020 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | In addition to the requirements that results not be compared to other benchmarks:
|
(Retired) SPEC VIRT_SC® 2010 benchmark
RETIRED | The SPEC VIRT_SC® 2010 benchmark was retired on February 26, 2014 in favor of its successor, the SPEC VIRT_SC® 2013 benchmark.
|
SPEC.org Submission Requirements | Results must be reviewed and accepted by SPEC prior to public disclosure. |
SPEC Metrics |
|
Required Metrics |
The required metric must be listed in close proximity to any other measured data from the disclosure or any derived value. |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | In addition to the requirements that results not be compared to other benchmarks:
|
SPEC VIRT_SC® 2013 benchmark
SPEC.org Submission Requirements | Results must be reviewed and accepted by SPEC prior to public disclosure. |
SPEC Metrics |
|
Required Metrics |
The required metric must be listed in close proximity to any other measured data from the disclosure or any derived value. |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | In addition to the requirements that results not be compared to other benchmarks:
|
SPECvirt® Datacenter 2021 benchmark
SPEC.org Submission Requirements | Results must be reviewed and accepted by SPEC prior to public disclosure. |
SPEC Metrics | SPECvirt®_Datacenter-2021 |
Required Metrics | SPECvirt®_Datacenter-2021. The required metric must be listed in close proximity to any other measured data from the disclosure or any derived value. |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | In addition to the requirements that results not be compared to other benchmarks:
|
(Retired) SPECweb® 2005 benchmark
RETIRED | The SPECweb 2005 benchmark was retired on January 12, 2012.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Not allowed |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECweb® 2009 benchmark
RETIRED | The SPECweb 2009 benchmark was retired on January 12, 2012.
|
SPEC.org Submission Requirements | By location, as defined below |
SPEC Metrics |
|
Required Metrics | If any data from a full disclosure is used, then one of the following must also be included:
|
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | In addition to the requirements that results not be compared to other benchmarks:
|
(Retired) SPECwpc® V1.0 benchmark
RETIRED | The SPECwpc V1.0 benchmark has been retired.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently. |
SPEC Metrics | Media & Entertainment, Product Development, Life Sciences, Financial Services, Energy, General Operations |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
(Retired) SPECwpc® V2.0 benchmark
RETIRED | The SPECwpc V2.0 and 2.1 benchmarks have been retired.
|
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently. |
SPEC Metrics | Media & Entertainment, Product Development, Life Sciences, Financial Services, Energy, General Operations |
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECworkstation® 3 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECworkstation® 3.1 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SPECworkstation® 4.0 benchmark
SPEC.org Submission Requirements | None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently. |
SPEC Metrics |
|
Required Metrics | None |
Conditionally Required Metrics | None |
Use of Estimates | Estimates are allowed if clearly identified. |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
SERT® 1.0 Suite
SPEC.org Submission Requirements | These requirements are defined by the energy efficiency agencies that specify the use of the SERT for their programs (e.g. the United States Environmental Protection Agency’s ENERGY STAR program). SPEC does not dictate these requirements. |
SPEC Metrics | There is no single metric for the SERT 1.x suite. |
Required Metrics | SPEC does not require a metric to be published for the SERT 1.x suite. Agencies using the SERT 1.x suite for their energy efficiency programs may require specific information to be published. |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | In addition to the requirements that results not be compared to results from other benchmarks or tools:
|
SERT® 2.0 Suite
SPEC.org Submission Requirements | These requirements are defined by the energy efficiency agencies that specify the use of the SERT for their programs (e.g. the United States Environmental Protection Agency’s ENERGY STAR program). SPEC does not dictate these requirements. |
SPEC Metrics | SERT Efficiency Metric |
Required Metrics | SERT Efficiency Metric. Agencies using the SERT 2.x suite for their energy efficiency regulatory programs may require specific information to be published. |
Conditionally Required Metrics | |
Use of Estimates | Not allowed |
Disallowed Comparisons | In addition to the requirements that results not be compared to results from other benchmarks or tools:
|
Chauffeur® WDK tool
SPEC.org Submission Requirements | Not applicable |
SPEC Metrics | Information generated with worklets running under this harness may not be treated as a SPEC metric and is not endorsed by SPEC. |
Required Metrics | Not applicable |
Conditionally Required Metrics | Not applicable |
Use of Estimates | Not applicable |
Disallowed Comparisons | No additional requirements beyond the requirements that results not be compared to other benchmarks. |
III. Definitions
- Basis for Comparison
- Information from a compliant result may be used to define a basis for comparing a subset of systems, including but not limited to memory size, number of CPU chips, operating system version, other software versions, or optimizations used. Other information, not derived from SPEC, may also be used to define a basis, for example, cost, size, cooling requirements, or other system characteristics. The basis must be clearly disclosed.
- By Location
For benchmarks designated as having a submission requirement "By location", these requirements apply:
Each licensee test location (city, state/province and country) must measure and submit a single compliant result for review, and have that result accepted by the technically relevant subcommittee, before publicly disclosing or representing as compliant any result for the benchmark.
After acceptance of a compliant result from a test location, the licensee may publicly disclose future compliant results produced at that location without prior acceptance by the subcommittee.
The intent of this requirement is that the licensee test location demonstrates the ability to produce a compliant result.
Note that acceptance of a result for one SPEC benchmark does not relieve a licensee of the requirement to complete the procedure for any other SPEC benchmark(s) that also require initial acceptance by location.
- Close Proximity
- In the same paragraph or an adjacent paragraph for written materials; or visible simultaneously for visual materials. The font must be legible to the intended audience.
- Compliant Result
- The set of measurements, logs, full disclosure report pages, and other artifacts that are the output of a process that follows the run and reporting rules of a SPEC benchmark. Depending on the benchmark and its rules, the process may have many steps and many ingredients, such as specific software, hardware, tuning, documentation, availability of support, and timeliness of shipment. To find the rules for a specific benchmark, click its name in the tables above.
- A number within such set that is labelled as a SPEC metric.
Note that benchmark reporting pages include other types of information, such as the amount of memory on the system. It is not allowed to represent such other information as a SPEC metric, although it may be used to define a Basis for Comparison.
SPEC reviews results prior to publication on its web site, but the accuracy and compliance of the submission remains the responsibility of the benchmark licensee. See the disclaimer.
- Derived Value
A unit that is a numerical function of one or more SPEC Metrics, rather than the original metric. The function may be a constant divisor, to normalize performance to a comparison system of interest. The function may bring in quantities that are some other characteristic(s) of the system. Such other characteristics may include information from both SPEC result pages and from non-SPEC sources.
Examples:
"SPECint®_rate2006 per chip" (metric is divided by the number of chips reported on the SPEC disclosure)
"Cubic feet per SPECint®_rate2006" (a non-SPEC quantity is divided by the metric)
"Normalized SPECsfs®2008_cifs" (metric is divided by result for a comparison system)
"GamePerfMark", from the trademark section above.
This definition is intentionally broad, encompassing any function that includes a SPEC metric as one of the inputs.
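The examples above can be sketched as simple arithmetic. The following Python sketch is purely illustrative: the function names and all numeric values are invented for this example, and the outputs are derived values under this document's definition, not SPEC metrics.

```python
# Hypothetical sketch of Derived Values: each function applies a numerical
# function to a SPEC metric value, so its output is a derived value and
# must not be labelled as a SPEC metric. All numbers are invented.

def per_chip(metric: float, chips: int) -> float:
    # Constant divisor taken from the SPEC disclosure (e.g. chip count).
    return metric / chips

def normalized(metric: float, comparison_metric: float) -> float:
    # Metric divided by the result for a comparison system of interest.
    return metric / comparison_metric

def cubic_feet_per_metric(volume_cuft: float, metric: float) -> float:
    # A non-SPEC quantity (chassis volume) divided by the metric.
    return volume_cuft / metric

print(per_chip(400.0, 2))        # 200.0
print(normalized(400.0, 320.0))  # 1.25
```

Note that a claim built on such values must still satisfy the general requirements: the underlying compliant result and its source must be disclosed, and any required metrics must appear in close proximity.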
- Disallowed Comparisons
- As mentioned above, results of one benchmark may not be compared to a different benchmark, nor to a different major release of the same benchmark. Individual benchmarks may forbid other comparisons, typically where such comparisons are considered inherently misleading.
- Estimate
An estimate is an alleged value for a SPEC metric that was not produced by a run rule compliant test measurement.
For purposes of this definition, it does not matter whether the alleged value for the metric was produced by extrapolating from known systems, or by cycle accurate simulation, or by whiteboard or dartboard, or by normal testing with the exception of a single missing mandatory requirement (e.g. the 3 month availability window). If the alleged value is not from a rule-compliant run, then it is an estimate.
The usage of estimates is limited.
- Major Release
- For purposes of this fair use rule, the term "major release" references a change in the year component of a benchmark product name, for example SPECjvm 98 vs. SPECjvm 2008.
- Non-Compliant Number
A value for a SPEC metric that fails to meet all the conditions for a compliant result.
Usage Note: By the definition of Estimate, above, a non-compliant number is also an estimate; and, of course, an estimate does not comply with the run rules. Therefore, the terms are sometimes interchangeable. In practical usage, an estimate may bear no relationship to any measurement activity; whereas a non-compliant number is typically the product of running the SPEC-supplied tools in a manner that does not comply with the run rules. In such cases, the tools may print out numbers that are labelled with SPEC metric units, but the values that are printed are not compliant results. Such values are sometimes informally called "non-compliant results", but for the sake of clarity, this document prefers the term "non-compliant number".
- Required Metric
- A SPEC metric whose value must be supplied. The individual benchmark sections above list whether a benchmark has required metrics. If it does, then whenever any data from a full disclosure report is used, the values for those metrics must also be used.
- SPEC Metric
- A unit of measurement defined by a benchmark, such as response time or throughput for a defined set of operations. The available units for each benchmark are named in the tables above, and are defined within the benchmark run rules (which can be found by clicking the benchmark name in the tables above).
Example: SPECjvm®2008 Peak ops/m.
- A specific value measured by such a unit.
Example: 320.52 SPECjvm®2008 Peak ops/m.
Usage Note: Both senses are used in this document, and it is expected that the sense is clear from context. For example, the prohibition against calling a derived value by a SPEC metric name is sense (i): do not define your own unit of measurement and then apply SPEC's trademarks to that unit. As another example, the rules for SPECpower_ssj® 2008 require disclosure of SPECpower_ssj®2008 overall ssj_ops/watt, which is sense (ii): one is required to supply the value measured for a particular system.
A printed SPEC metric value is not necessarily a Compliant Result: SPEC provides tools that display values for SPEC metrics, such as the above example of "320.52 SPECjvm®2008 Peak ops/m". Although SPEC's tools help to enforce benchmark run rules, they do not and cannot automatically enforce all rules. Prior to public use, the licensee remains responsible for ensuring that all requirements for a compliant result are met. If the requirements are not met, then any printed values for the metrics are non-compliant numbers.
IV. Violations Determination, Penalties, and Remedies
SPEC has a process for determining fair use violations and appropriate penalties and remedies that may be assessed.
Change history
11 April 2011 - The SPEC Fair Use rule has been re-written to:
- Promote greater Fair Use consistency across SPEC benchmarks; and
- Where Fair Use rules differ among SPEC benchmarks, make it easier to find differences.
22 June 2011 - Editorial clarifications
- Emphasize that comparisons of non-compliant numbers must not be deceptive.
- Explain the term "major release" as used in the rule about comparisons.
- Clarify example for normalized historical comparisons.
- Clarify definition of Close Proximity.
- Prefer term "licensee" rather than synonyms.
- Minor editorial clarifications.
18 August 2011 - Add SPEC Sip_Infrastructure 2010. Correct the metrics list for SPECweb 2009.
7 February 2013 - Add SPECjbb 2013, SPEC OMP 2012.
25 February 2013 - Add SERT.
13 March 2014 - Add SPEC Accel, Chauffeur-WDK. Note retirement of SPECjAppServer 2004, SPECjbb 2005, SPECmail 2001 and SPECmail 2009, SPEC OMP 2001, SPEC VIRT_SC 2010, SPECweb 2005 and SPECweb 2009.
3 November 2014 - Add SPEC SFS 2014.
9 December 2014 - Note retirement of SPECjbb 2013
2 September 2015 - Note retirement of SPEC SFS 2008, correct SPEC SFS 2014 required metrics listing
23 September 2015 - Add SPECjbb 2015
05 May 2016 - Add SPEC Cloud IaaS 2016
27 September 2016 - Added recent versions and marked as retired older versions of SPECapc, SPECviewperf, and SPECwpc benchmarks.
27 March 2017 - Added SERT 2.0, structural rearrangement of document.
20 June 2017 - Added SPEC CPU 2017
27 June 2017 - Updated SPEC ACCEL with new metrics
07 November 2017 - Added SPECapc for Maya 2017, noted retirement of SPEC Sip_Infrastructure
19 December 2017 - Added new EDA workload and metric for SPEC SFS 2014 SP2
23 May 2018 - Added SPECviewperf 13, noted retirement of SPECviewperf 12.0 and SPECviewperf 12.1
15 August 2018 - Added SPECapc for Solidworks 2017
12 September 2018 - Added SPECjEnterprise 2018
19 October 2018 - Added SPECworkstation 3, noted retirement of SPECwpc V2.0/2.1
18 December 2018 - Added SPEC Cloud IaaS 2018
8 March 2019 - Added SPECviewperf 13 Linux Edition, added (retired) SPEC CPU 2000, updated SPEC CPU 2006 to note retirement
9 September 2019 - Updated for SPEC CPU 2017 V1.1
8 December 2020 - Updated for SPECstorage Solution 2020, SPECapc for Solidworks 2020, SPECviewperf 2020
16 March 2021 - Updated for SPECworkstation 3.1
02 September 2021 - Updated for SPECvirt Datacenter 2021
19 October 2021 - Updated for SPEChpc 2021
9 February 2023 - Retired SPECapc for Solidworks 2019, updated for SPECapc for 3dsmax 2020, SPECapc for Maya 2023, SPECapc for Solidworks 2021, SPECapc for Solidworks 2022
4 December 2024 - Updated for SPECworkstation 4.0, SPECapc for Solidworks 2024. Retired SPECapc for Solidworks 2020 and SPECapc for Solidworks 2021. Updated SPEC Cloud IaaS 2016 entry.