
Standard Performance Evaluation Corporation


SPEC Blog

Screenshot of a model in the SPECapc for 3ds Max 2020 benchmark
New SPECapc® for 3ds Max Benchmark for Systems Running Autodesk 3ds Max 2020

By Trey Morton, SPECapc Committee Chair

Earlier this year, the SPEC® Application Performance Committee (SPECapc) released the SPECapc for 3ds Max 2020 benchmark for systems running the latest version of Autodesk 3ds Max. This version of the SPEC benchmark replaces the SPECapc for 3ds Max 2015 benchmark.

With 3ds Max 2015 retired, the Autodesk community needs an industry-standard benchmark for the current version of the application. 3ds Max is a key tool in gaming, visualization, architecture, and many other industry segments, and SPEC recognizes the importance of measuring its performance without gaps in coverage. The SPECapc for 3ds Max 2020 benchmark is an interim release that lets the community continue benchmarking their systems while SPEC undertakes a ground-up redesign of the benchmark software with new content. As with all SPECapc benchmarks, we are always on the lookout for community content that could be used in future versions.

Read more

 

Screenshot of a model in the SPECviewperf 2020 V3 benchmark
SPEC Releases SPECviewperf 2020 v3.0 Benchmark, Adds Linux Edition

By Ross Cunniff, SPECgpc Chair

The SPEC® Graphics Performance Characterization (SPECgpc) group put tremendous effort into updating the SPECviewperf® benchmarks over the last year, and I'm excited to discuss some of the new features today. The SPECviewperf® 2020 v3.0 benchmark and the SPECviewperf® 2020 v3.0 Linux Edition benchmark, industry-standard benchmarks based on professional applications, measure the 3D graphics performance of systems running under the OpenGL and DirectX application programming interfaces. The benchmark workloads, called viewsets, represent graphics content and behavior from actual workstation-class applications, without the need to install the applications themselves.

Read more

 

Desktop calendar; Photo by Eric Rothermel on Unsplash
SPEC 2021 Year in Review

By David Reiner, President

As all of us at SPEC continue into a very busy 2022, I'd like to reflect on what we accomplished during 2021, which, despite the continued headwinds from the pandemic, was an exciting and productive year at SPEC.

Read more

 

Computer screen with multi-colored code; Photo by Markus Spiske on Unsplash
Searching for Workloads — Help Shape the Next SPEC CPU Benchmark Suites

By James Bucek, SPEC CPU Chair

SPEC exists to develop benchmarks that vendors, businesses, and individuals can trust to make key purchasing decisions based on fair comparisons between different systems or solutions. One way we do this is by bringing together representatives from a variety of companies, including competitors, to design benchmarks that fairly represent real-world workloads.

For SPEC CPU, an equally important aspect of creating better benchmarks is that we base them on actual applications and workloads in use today, not just representative workloads. This separates SPEC CPU from many common micro-benchmarks and gives users more confidence in the workloads behind the benchmark, as well as greater assurance that the results produced are standardized and reproducible, allowing for more accurate comparisons between solutions.

Read more

 

Illustration of the five workloads that comprise a tile
New SPECvirt® Datacenter 2021 Benchmark, for Complex, Multi-Host Environments

By David Schmidt, Virtualization Committee Chair

The SPEC Virtualization Committee has released the SPECvirt® Datacenter 2021 benchmark, a new multi-host benchmark for measuring the performance of a scaled-out datacenter.

While distributed computing has become the dominant infrastructure in datacenters today – providing reliability, availability, serviceability and security – virtualization is the key to optimizing this infrastructure and providing increased flexibility and application availability, while also reducing costs through server and datacenter consolidation. As such, suppliers and buyers require a fair, vendor-agnostic tool for measuring the performance of the solutions they use to power these more complex, multi-host environments.

Read more

 

SPEC SERT Suite performance bar chart
Server Performance — At what cost?

By Klaus Lange, SPECpower Committee Chair

From their hardware and firmware to their software and applications, modern servers are increasingly complex. Given the incredible variety of computational and data-centric tasks that servers are designed and deployed to perform, these machines cannot be provisioned properly or completely without accounting for energy efficiency. Beyond the initial hardware and software investments, and beyond the costs to house these servers, what level of power-efficient performance can be expected, and how does it compare across different architectures and configurations?

Read more

 

SPECviewperf 2020 splash screen
SPECviewperf® 2020 v2.0, Keeping Pace With an Evolving Industry

By Ross Cunniff, SPECgpc Chair

The Graphics and Workstation Performance Group (GWPG) proudly announced the SPECviewperf® 2020 v2.0 benchmark in June, just eight months after the initial release of the 2020 version. In addition to ensuring that SPECviewperf keeps pace with our evolving industry, SPECviewperf 2020 v2.0 truly reflects the best of SPEC’s ability to bring competing members together to create and update a benchmark that benefits everyone: vendors, enterprise customers, and end users alike.

Read more