Q. Why was the APC project group formed?
A. Within the Graphics Performance Characterization (GPC)
group there was a strong belief that graphics performance should be
benchmarked using actual applications. Application-level
benchmarks exist, but they are not standardized and they often do not
fully represent the graphics functionality required by the actual
application. The APC group believes that end users will benefit from a
broad-ranging set of standardized benchmarks for graphics-intensive
applications.
Q. What companies are members of the APC project group?
A. Current members include Compaq, Digital Equipment Corp.,
Dynamic Pictures Inc., Hewlett-Packard, IBM, Intel, Intergraph, Real
3D, Silicon Graphics and Sun Microsystems.
Q. Is it appropriate for vendors to drive this
effort?
A. Industry vendors have the highest level of interest in
developing these benchmarks, but the APC group's development process
stems from interaction with user groups, publications, application
developers and others. There is sometimes a perception that
"vendor-driven" benchmarks are less objective. In fact, these types
of consortium-developed benchmarks are likely to be the most
objective, since there is a natural system of checks and balances due
to the specific market interests of different vendors.
Q. Why did the APC project group decide to join the GPC
Group?
A. This initiative stemmed from members of the GPC Group
seeing a common need for better application benchmarks. The GPC
Group, with its affiliation with SPEC (Standard Performance
Evaluation Corp.), already has systems in place for running such a
project and ensuring standardization of the workloads, measurement
criteria, review processes, and reporting formats. This saves a great
deal of administrative time and resources, allowing the project group
to concentrate more fully on developing and maintaining
benchmarks.
Q. How is the group identifying applications in various
market segments?
A. We are using existing market data, member company
knowledge, and ISV input to determine relevant market segments and
important applications in those segments. These applications will be
updated over time as user needs change and new applications become
available.
Q. Will the application benchmarks be required to run on all
platforms?
A. They should run on a reasonable number of platforms, but
they do not have to run on all platforms. Many applications
are targeted to a specific level of hardware, such as PCs or high-end
workstations. It would not be appropriate to require a vendor to run
a benchmark that is not designed for its platform.
Q. How is the group identifying appropriate workloads for
selected applications?
A. APC project group members are sponsoring applications
and working with end users, user groups, publications and ISVs to
select and refine workloads, which consist of data sets and benchmark
script files. Workload development is being driven by end users and
ISVs, not vendors. Workloads will evolve over time in conjunction
with end users’ needs.
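The actual APC workloads are application-specific scripts and data sets, but the measurement pattern described here (replay a scripted, graphics-intensive phase several times and check that the timings are repeatable) can be sketched generically. The sketch below is illustrative only, not APC tooling; `run_workload` and the `replay` callable are hypothetical names standing in for driving a real application through its script file.

```python
import statistics
import time

def run_workload(replay, iterations=5):
    """Replay a benchmark script several times and summarize the timings.

    `replay` stands in for driving the application through its workload
    script; repeating the run lets us check that results are repeatable.
    """
    timings = []
    for _ in range(iterations):
        start = time.perf_counter()
        replay()  # the graphics-intensive phase under test
        timings.append(time.perf_counter() - start)
    return {
        "runs": iterations,
        "mean_s": statistics.mean(timings),
        "stdev_s": statistics.stdev(timings) if iterations > 1 else 0.0,
    }

# Example: a trivial compute loop standing in for an application phase.
result = run_workload(lambda: sum(i * i for i in range(10000)), iterations=3)
```

A real harness would launch the application with its script file rather than call a Python function, but the repeat-and-summarize structure is the same.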
Q. What are the APC project group’s priorities in selecting
benchmarks?
A. Number one is to select benchmarks that are useful to
end users who are evaluating and selecting platforms. Secondly, the
applications must use graphics prominently in at least some phase of
user interaction. The workload will be selected to capture this
phase. There are several existing application benchmarks that are
called "graphics-oriented," but whose test results do not reflect
the benefits that come from improvements vendors are making in raw
graphics performance. The APC group focuses on benchmarks that tax
the graphics subsystem, including hardware, software and the OS.
Q. How will the group define, publish, measure, report and
review results?
A. This is where our GPC Group affiliation pays dividends.
We will adopt existing GPC Group processes as much as possible to
minimize work and reduce the learning curve. Performance measurements
must be accurate and repeatable across diverse computing
environments. Reporting schemes will be designed to meet end users’
needs.
Q. How will APC results be verified and published?
A. Again, we will take advantage of standardized practices
used within the GPC Group. We intend to develop a review process, and
a set of universal reporting and display tools for publishing results
on this Web site.
Q. Will benchmark tools be freely available to the
public?
A. Yes. We will make certain a wide audience has easy
access to the workloads and the benchmark results published by the
APC project group. These tools will be made available for free
downloading through this Web site, which will allow independent
reproduction of results and analysis of workloads. Obviously, anyone
who wants to run the benchmarks will need copies of the
applications.
Q. Applications often support different features on
different platforms. This is especially true on the high end, where
vendors seek to add value by providing advanced features. How will
the APC group accommodate advanced feature sets in its application
benchmarks?
A. The project group wants to make certain that we avoid
"least-common-denominator" tests. We are developing benchmarks based
on "core" functionality, which represents the end users’ "musts" for
running the application. All vendors reporting on a benchmark must
support this functionality, although it is the reporting vendor’s
choice about whether to implement the functions through hardware or
software. In addition to core functionality, we expect to provide an
"enhanced" aspect of the benchmark. The enhanced aspect would make
it clear that there is extended functionality and performance
available that has value to end users. An example of enhanced
functionality might be the ability to do 3D texture mapping.
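One way to picture the core/enhanced split is as a validity check on reported results: a vendor's report counts only if every core feature is supported (in hardware or software), while enhanced features are reported separately as extra value. The following is a minimal sketch under that reading; the feature names other than 3D texture mapping are invented for illustration.

```python
from dataclasses import dataclass, field

# Hypothetical feature lists; only "3d_texture" comes from the FAQ's example.
CORE_FEATURES = {"wireframe", "smooth_shading", "2d_texture"}
ENHANCED_FEATURES = {"3d_texture"}

@dataclass
class BenchmarkReport:
    vendor: str
    supported: set = field(default_factory=set)

    def valid(self):
        """A report is acceptable only if all core features are supported."""
        return CORE_FEATURES <= self.supported

    def enhanced(self):
        """Enhanced features the vendor additionally supports."""
        return self.supported & ENHANCED_FEATURES

report = BenchmarkReport(
    "ExampleCo",
    {"wireframe", "smooth_shading", "2d_texture", "3d_texture"},
)
```

Whether a core feature runs in hardware or software is the vendor's choice, which is why the check tests only for support, not for how it is implemented.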
Q. Where is more information about membership and
application benchmarking available?
A. Information is available through the APC project group’s e-mail alias: gpcapc-info@spec.org.