Submitting SPEC Benchmark Results
Why submit results to SPEC?
SPEC's mission is to develop benchmarking standards and the corresponding software tools for fairly evaluating modern computer performance, and to provide an open forum for the results. SPEC invites you to help populate its public database of benchmark results. Licensees of the SPEC benchmarks may run the tests and submit results to SPEC; SPEC reviews the results and, for a processing fee, publishes them on its website. The more results published on the SPEC website, the more information the community has for making comparisons. Help us level the playing field; submit your results to SPEC.
Who may submit results and how much does it cost?
Any individual or group holding a SPEC software license may submit results to SPEC. There is a publication fee for each result submitted, and payment is required before a reviewed result is published. SPEC members and associates may submit an unlimited number of results at no charge, one of the many benefits of membership in SPEC. For GWPG (SPECapc, SPECviewperf, SPECworkstation) results, there is a limit of eight non-member submissions per company over a 12-month period. If you expect to submit a large number of results in a one-year period, consider joining SPEC as a member or associate (a quick cost estimate follows the fee table below). For more info on SPEC membership, contact us!
| Benchmark | Versions Accepted | Publication Fee |
|---|---|---|
| SPEC Cloud IaaS 2018 | 1.0, 1.1 | $500 |
| SPEC CPU 2017 | 1.1, 1.1.5, 1.1.7, 1.1.8, 1.1.9 | $500 |
| SPECjbb 2015 | 1.04 | $500 |
| SPECjEnterprise 2018 Web Profile | 1.0.0 | $500 |
| SPECjEnterprise 2010 | 1.03 | $500 |
| SPECjvm 2008 | 1.0 | $500 |
| SPECpower_ssj 2008 | 1.12 | $500 |
| SPECstorage Solution 2020 | 1.0 | $1000 |
| SPECvirt Datacenter 2021 | 1.0 | $500 |
| SPEC VIRT_SC 2013 | 1.0, 1.1 | $500 |
| SPECaccel 2023 | 2.0.17 | $2000/$500 ** |
| SPEC ACCEL | 1.1, 1.2, 1.4 | $2000/$500 ** |
| SPEChpc 2021 | 1.03, 1.1.7 | $2000/$500 ** |
| SPEC MPI 2007 | 2.0, 2.0.1 | $2000/$500 ** |
| SPEC OMP 2012 | 1.0 | $2000/$500 ** |
| SPECapc for 3ds Max | 3ds Max 2020 | $1000 |
| SPECapc for Creo | Creo 9 | $1000 |
| SPECapc for Maya | Maya 2024 | $1000 |
| SPECapc for Solidworks | Solidworks 2024, 2022 | $1000 |
| SPECviewperf 2020 | 3.1 | $1000 |
| SPECworkstation | 4.0 | $1000 |
** The SPECaccel, SPEChpc, SPEC MPI and SPEC OMP benchmark result publication fee for non-profit organizations is $500.
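If you are weighing non-member publication fees against membership, a quick estimate using the table above can help. The sketch below is illustrative only: the per-result fees come from the table, while the planned submission counts are hypothetical.

```python
# Estimate annual non-member publication fees using fees from the table above.
# The planned submission counts below are hypothetical examples.
FEES = {
    "SPEC CPU 2017": 500,
    "SPECviewperf 2020": 1000,
    "SPEChpc 2021 (non-profit)": 500,
}

planned = {
    "SPEC CPU 2017": 6,        # six CPU results
    "SPECviewperf 2020": 2,    # two viewperf results
}

total = sum(FEES[name] * count for name, count in planned.items())
print(f"Estimated publication fees: ${total}")  # $5000 in this example
```

If an estimate like this runs well beyond the cost of membership, submitting as a member or associate is likely the better option.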
How to submit results?
Results are submitted to SPEC via email. Results may be submitted at any time on any day of the week, but they are reviewed and published on a specific schedule: each benchmark result undergoes a two-week review/publication cycle according to a posted review calendar.
In general, the result or raw result file should be attached to an email sent to the appropriate benchmark submission drop. GWPG results are instead uploaded as a zip file containing all tested configurations for a particular benchmark, as sketched below. For a list of result submission email drops and/or the GWPG result upload location, please send mail to info@spec.org.
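As an illustration of the GWPG packaging step, the following minimal sketch bundles a directory of result files into a zip for upload. The directory and archive names are hypothetical; the required contents and the upload location come from the benchmark documentation and info@spec.org, as noted above.

```python
# Minimal sketch: bundle GWPG result files for one benchmark into a zip
# for upload. Paths and names here are hypothetical examples; consult the
# benchmark's user guide for the required contents and naming.
import zipfile
from pathlib import Path

results_dir = Path("specviewperf2020-results")      # hypothetical directory
archive = Path("specviewperf2020-submission.zip")   # hypothetical name

with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
    for f in sorted(results_dir.rglob("*")):
        if f.is_file():
            # Store paths relative to the results directory so the
            # archive unpacks cleanly for the reviewers.
            zf.write(f, f.relative_to(results_dir))

print(f"Wrote {archive} ({archive.stat().st_size} bytes)")
```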
Additional and specific information on submitting results can be found in each benchmark's run and reporting rules and user guide documentation.
Submitting results and security mitigations
From time to time, concerns about security mitigations are reported. SPEC's current approach to security mitigations is described in the Security Q and A. Specific questions about submission can be addressed to info@spec.org.
Publication of results
Once submissions complete their review and are accepted for publication, they are published (typically within one US business day) on www.spec.org alongside all the other results for the same benchmark. See http://www.spec.org/results for a listing of where results are published.
Results for currently supported benchmarks are tagged with the date the result was first published at SPEC.org. This tag commonly appears as an "Originally published" string in the footer of each result page. Note: not all result page formats support post-processed footer updates, but the HTML format for each result does; if no such date string appears in a specific result file, check the HTML version of the same result. Several SPEC benchmarks have explicit rules based on a "Date Published" definition, and in those cases the result file incorporates the "Date Published" into the main content of the result page rather than only the page footer.
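As a rough way to check a downloaded result page for this tag, the sketch below scans the file for the "Originally published" string. The file name is a hypothetical example, and the exact markup around the string varies by benchmark and result format.

```python
# Minimal sketch: look for the "Originally published" tag in a downloaded
# HTML result page. "result.html" is a hypothetical file name, and the
# markup around the string varies by benchmark and result format.
import re
from pathlib import Path

html = Path("result.html").read_text(encoding="utf-8", errors="replace")

match = re.search(r"Originally published[^<]*", html)
if match:
    print(match.group(0).strip())
else:
    print("No 'Originally published' tag found; if this file is another "
          "format, check the HTML version of the same result.")
```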
More info?
To better understand the SPEC result review process, please consult the SPEC/OSG Policy Document, Section: Guidelines for Result Submission and Review.
Any organization or individual who chooses to make public comparisons using SPEC benchmark results should be sure to read and follow the Fair Use Policy. If you wish to make academic or research use of SPEC HPG benchmark results, please review the recommendations in the document "Guidelines for the Use of SPEC HPG Benchmarks in Research Publications".