The Great Web Way:

The GPC News Publishing Procedures


These are the publishing procedures for the GPC News as of February 17, 1997:

1. Two weeks before vendor submissions are due, Cramblitt & Company sends an e-mail reminder to GPC, OPC, PLB and XPC aliases.

2. One week before vendor submissions are due, Cramblitt & Company sends another e-mail reminder to the committees.

3. Cramblitt & Company begins working on editorial updates and changes to the GPC News.

4. All submissions are uploaded to SPEC Member Website by the close of business on the submission date. The directory to which the submissions are uploaded is:

/home/stuff/gpc/[X]/incoming

Where [X] is one of the following: plb, opc or xpc

The file can be either .zip or .tar.Z format, but must have the following tree structure:

Figure 1.1:

For PLB and XPC submissions:

        plb.data (or xpc.data)
                  |
        [Member name, e.g. IBM]
                  |
           _______/ \_______
          /                 \
      rawdata             summary

For OPC submissions:

              opc.data
                  |
        [Member name, e.g. IBM]
                  |
           _______/ \_______
          /                 \
      rawdata             summary
      /  |  \             /  |  \
 [test1][test2]...  [test1][test2]...
     |      |           |       |
[rawoutput][rawoutput] [summary_files][summary_files]

Where [test1], [test2], etc. can be one of the following: AWadvs, Light, CDRS, DX, or DRV.

Under the rawdata directory, the actual output from the test runs should be stored. Vendors can store the raw data in any format they wish as long as all the data for the submitted tests are there. Vendors can use any naming convention they want for the files under the summary directory. The tests, however, will be published alphabetically.
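As a concrete illustration, the tree in figure 1.1 can be built and packaged as follows. This is only a sketch: the member name "Acme" and the file names are placeholders, and only the directory shape matters. Running compress on the tar file produces the required .tar.Z.

```shell
# Minimal sketch of a PLB submission package. "Acme" and the file
# names are placeholders; only the directory structure is required.
mkdir -p plb.data/Acme/rawdata/AWadvs plb.data/Acme/summary/AWadvs
echo "raw benchmark output"  > plb.data/Acme/rawdata/AWadvs/run.out
echo "AWadvs result summary" > plb.data/Acme/summary/AWadvs/AWadvs.sum
# Package the tree for upload:
tar -cf plb_submission.tar plb.data
# compress plb_submission.tar       # yields plb_submission.tar.Z
```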

5. Either the same day or one day after IBM has converted the submissions to HTML, Kathy Powers of SPEC assigns peer reviewers and posts an e-mail with the assignments. The data is reviewed by the members in the following fashion: Member A reviews Member B's data, Member B reviews Member C's data, and Member C reviews Member A's data.

If a vendor has no new submissions, the vendor's old numbers will be carried over automatically to the next update, unless the vendor's representative notifies the project group to discontinue publishing those numbers. Vendor representatives who want to publish a combination of new and previously published numbers must submit all numbers within a single package.

6. After the submission deadline, a preliminary version of the web pages is created using only the newly submitted data. This "review" version is put on SPEC Member Website (in /home/stuff/gpc/<SUBMISSION DATE>).

7. Once the submission data has passed the review procedure, the submissions from all three project groups are downloaded to (currently) IBM's local machine. After the submissions are downloaded, the data is ready to be converted into HTML (the language used to create web pages). Although the directory structure is important to the scripts, the location of the entire structure within the file system is variable. There should be a directory that contains the three data directories: plb.data, xpc.data, and opc.data. These directories should be in the same format as when they are submitted (figure 1.1).

A series of tools works together to produce three new trees: plb, xpc, and opc. These tools are described below:

mkrelease        Tars up the HTML, data, and static directories. (This script changes frequently.)

rebuild          Builds the opc, plb, and xpc directories by calling opctree2html.pl, plbtree2html.pl, and xpctree2html.pl.

opc2html.pl      Converts the OPC summary file on stdin to HTML on stdout. Puts a one-line summary on stderr.

opcsum2html.pl   Takes a file of one-line summaries for OPC and creates HTML tables sorted by price, vendor, and performance.

opctree2html.pl  Traverses the opc.data tree and calls opc2html.pl on the summary files to create the opc directory of HTML files.

plb2html.pl      Converts the PLB summary file on stdin to HTML on stdout. Puts a one-line summary on stderr.

plbsum2html.pl   Takes a file of one-line summaries for PLB and creates HTML tables sorted by price, vendor, and performance.

plbtree2html.pl  Traverses the plb.data tree and calls plb2html.pl on the summary files to create the plb directory of HTML files.

xpc2html.pl      Converts the XPC summary file on stdin to HTML on stdout. Puts a one-line summary on stderr.

xpcsum2html.pl   Takes a file of one-line summaries for XPC and creates HTML tables sorted by price, vendor, and performance.

xpctree2html.pl  Traverses the xpc.data tree and calls xpc2html.pl on the summary files to create the xpc directory of HTML files.
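The per-result converters all follow the same filter convention: summary file on stdin, HTML on stdout, one-line summary on stderr. The toy stand-in below (not the real opc2html.pl or plb2html.pl) only illustrates how the three streams are typically wired together:

```shell
# Toy stand-in for the *2html.pl filter convention (NOT the real tool):
# reads a summary on stdin, writes HTML to stdout, and emits a
# one-line summary on stderr.
to_html() {
    summary=$(cat)                     # whole summary file from stdin
    echo "$summary" | head -n 1 >&2    # one-line summary to stderr
    printf '<html><body><pre>\n%s\n</pre></body></html>\n' "$summary"
}

# Wire stdout to the HTML file and stderr to a file of one-line summaries:
echo "Acme AWadvs 12.3 frames/sec" | to_html > AWadvs.html 2> oneline.txt
```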

Creating the HTML directories for the three project groups should be as simple as running the rebuild script from the root of the tree (see figure 2.1). The other tools should be in the PATH environment variable.

Figure 2.1:

                         [root of tree]
              _________________|_________________
             /                 |                 \
         plb.data           xpc.data          opc.data
             |                 |                  |
       [Member name]     [Member name]      [Member name]
          ___/\___          ___/\___           ___/\___
         /        \        /        \         /        \
      rawdata  summary  rawdata  summary   rawdata  summary
                                             /\         /\
                                       [test][test] [test][test]
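A run of the build step might look like the following sketch. The tree layout here is illustrative ("Acme" is a placeholder member), and the rebuild invocation is shown only as a comment since the gpctools scripts live in a site-specific location:

```shell
# Sketch: lay out the figure 2.1 tree (placeholder member "Acme"),
# then confirm the three .data directories that rebuild expects exist.
for grp in plb xpc opc; do
    mkdir -p "$grp.data/Acme/rawdata" "$grp.data/Acme/summary"
done

for d in plb.data xpc.data opc.data; do
    [ -d "$d" ] && echo "found $d"
done
# With the gpctools directory on PATH, running "rebuild" from here
# would produce the plb, xpc, and opc HTML directories.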

Once all the HTML files have been built, a copy is tar'd and compressed. This .tar.Z is then uploaded to SPEC Member Website and put into a new directory called /home/stuff/gpc/<MonthYear>.

For example:

/home/stuff/gpc/December96

In this directory the tar.Z is expanded. The directories that should be present are listed in Table 3.1. In addition, an index.html file should be present to point into these directories.

Table 3.1:

Dir. Name    Description
---------    -----------
opc          OPC HTML pages
opc.static   OPC group information pages (not created w/gpctools)
plb          PLB HTML pages
plb.static   PLB group information pages (not created w/gpctools)
xpc          XPC HTML pages
xpc.static   XPC group information pages (not created w/gpctools)
publish      General GPC group information pages (not created w/gpctools)
image        Picture directory

TIP: To compress and tar a directory and/or file, do the following:

tar -cvf - [filename] [filename2] | compress > filename.tar.Z

For example, to compress and tar up the HTML pages after they've been built, execute this:

tar -cvf - opc xpc plb | compress > prebuilt.tar.Z

To uncompress and untar a file, use the following command:

uncompress < filename.tar.Z | tar -xvf -

This will take filename.tar.Z and recreate its contents in the present working directory.

Once the directories are set up on SPEC Member Website, the file access modes need to be changed to allow web browsers to access them. All files and directories should be given "read" and "execute" access. The easiest way to do this is to use the "find" command piped to xargs and chmod. Like this:

find . -print | xargs chmod ugo+rx

Execute this from the top of the tree (the directory created on pro to hold everything). Once this is done, Cramblitt & Company should be notified and they will notify the project groups.

8. Vendors are allowed approximately eight days to review the submissions and bring any discrepancies to the attention of the submitter. Vendors are also expected to fix any of these discrepancies and re-submit the 'fixes' during this time period.

9. During the review period, Cramblitt & Company e-mails to IBM in HTML format any additions, changes or corrections to editorial pages.

10. Approximately five days are allowed for a challenge period in which vendors can challenge any of the submissions.

11. Approximately seven days are allowed for an appeal period in which vendors can appeal any challenges that have been made to a submission.

12. Following the appeal period, all the resubmissions from the previous issue are integrated into the .data directories, and the release is rebuilt. This rebuilt version is then installed onto pro using the same procedure as when the preliminary version was installed (note: the top directory does not need to be re-made because it already exists). The old subdirectories (those listed in Table 3.1) are usually deleted and replaced by the new version.

13. IBM integrates editorial pages and report pages. The resulting "electronic proof" of the updated GPC News is posted on SPEC Member Website one day after the deadline for the appeal period. Cramblitt & Company reviews the proof, coordinates revisions with IBM, then sends an e-mail message to the project groups to notify them that the GPC News is available for review on pro.

14. Project group members are given two days to review the GPC News on pro and respond with any comments. As changes are made, the "rebuild, transport" process is repeated as needed.

15. IBM and Cramblitt & Company coordinate changes to the electronic proof. When all revisions are made, Cramblitt & Company conducts a final review and requests that IBM moves the GPC News from pro onto open.specbench.org/gpc. Cramblitt & Company then sends an e-mail message to the project groups notifying them that the GPC News is now available to the public.
