/tpu question

Chris Sharman Chris.Sharman@ccagroup.co.uk
Thu, 14 Jan 1999 09:03:22 +0000


>Here is how I have done my conversion.  Start with the old dictionary
>definition and write a QTP procedure to dump the data to a permanent
>subfile.  Modify the dictionary, create an empty version of the file and
>then write a QTP procedure to reload the file, adding '19' or 19000000
>to the date fields (depending on whether the data was originally character
>or numeric).  The catch is to test that there is a date in the field.  So
>the code is as follows:
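The century-windowing logic in that QTP reload, sketched in Python (the function names and the blank/zero "no date" conventions are my assumptions, not the original QTP code):

```python
def fix_date_char(d):
    """Prefix a 2-digit-year character date (YYMMDD) with '19' -> 'YYYYMMDD'."""
    if d and d.strip():          # guard: only convert when a date is present
        return "19" + d
    return d

def fix_date_num(d):
    """Add 19000000 to a numeric YYMMDD date -> 19YYMMDD."""
    if d:                        # guard: treat 0 as "no date", leave it alone
        return d + 19000000
    return d

print(fix_date_char("990114"))   # -> 19990114
print(fix_date_num(990114))      # -> 19990114
print(fix_date_num(0))           # -> 0 (empty date passes through unchanged)
```

The point of the guard is exactly the "catch" above: blank or zero date fields must pass through untouched, or you manufacture bogus dates.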

Switch off here if you don't use VMS - sorry.

That will work, but the reload will be slow, the resulting file badly tuned, and
any indices loaded out of order will be sub-optimal and may suffer severe bucket
splitting. It's fine for direct or sequential files, though.

Better for indexed files is to unload to a permanent (sequential) subfile (as you
have done), but making the record changes on the way. Quiz also does less
locking than QTP, so may be faster. Then modify the dictionary, and create an
FDL for the new file: either by creating an empty file and running $ anal/rm/fdl,
or by manually editing the old FDL. Then:

1$ conv/fdl=<newfdl> <data.sf> <trial.tmp> [/nosort if sf in primary key order]
2$ anal/rm/fdl <trial.tmp>
3$ edit/fdl/nointeractive/gran=double/anal=<trialfdl> <trialfdl>
4$ delete <trial.tmp;>
5$ conv/fdl=<trialfdl> <data.sf> <data.dat> [/nosort if sf in primary key order]

This will result in a well-tuned file, with all indices optimally loaded.
You should put the trial data file on a disk with the same cluster factor as
the data disk, otherwise the bucket sizes picked may be inappropriate.

Steps 2-3 are tuning, and are worth doing any time the data changes
significantly in content or number of records (can be done from the live file).

Step 5 is optimisation, and is worth doing fairly regularly (I do it every
Sunday for most active files).

The PowerHouse defaults are not at all good on our system: adequate for
trivial files of fewer than a hundred records, maybe, but I'd recommend the
practice above for anything bigger. The sub-optimal indexing is an inevitable
consequence of loading records one by one into RMS, rather than a fault of QTP,
which is why Digital provided CONVERT etc. to clean up files.

Regards,
Chris
______________________________________________________________________
Chris Sharman			Chris.Sharman@CCAgroup.co.uk
CCA Stationery Ltd, Eastway, Fulwood, Preston, Lancashire, PR2 9WS.
= = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = = =
Subscribe: "subscribe powerh-l" in message body to majordomo@lists.swau.edu
Unsubscribe: "unsubscribe powerh-l" in message to majordomo@lists.swau.edu
powerh-l@lists.swau.edu is gatewayed one-way to bit.listserv.powerh-l
This list is closed, thus to post to the list, you must be a subscriber.