QTP Performance Issue
Ron Burnett
ron@cryptic.rch.unimelb.edu.au
Tue, 10 Jul 2001 08:24:51 +1000
Hello everyone,
I have a really serious performance issue with QTP running on MPE/iX
(an HP928 with 383 MB memory).
I have a subfile of 240 MB (containing just under 100,000 records),
which I link to a trivial-sized CM KSAM reference table to convert
a particular code value. The pseudo-code is
access <subfile> link <old-code> to <old-code> of <ksam-file> optional
define t-new-code char*n = new-code of <ksam-file> &
    if record <ksam-file> exists else "999"
output <subfile> update
item <new-code> final <t-new-code>
set lock file update
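(For anyone unfamiliar with QTP, the logic of the step above is just a keyed lookup with a default. Since the reference table is trivial-sized, the equivalent done as an in-memory lookup — a generic Python sketch, with hypothetical field names, not actual QTP — looks like this:)

```python
def convert_codes(subfile_records, reference_table):
    """Fill in new-code for each record by looking up old-code in a
    small reference table, defaulting to "999" when no match exists
    (mirrors the 'if record ... exists else "999"' define above)."""
    out = []
    for rec in subfile_records:
        new = reference_table.get(rec["old-code"], "999")
        out.append({**rec, "new-code": new})
    return out

# Hypothetical stand-ins for the KSAM table and subfile:
reference_table = {"A1": "100", "B2": "200"}
subfile = [{"old-code": "A1"}, {"old-code": "ZZ"}]
print(convert_codes(subfile, reference_table))
```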
Now, this process takes over 10 hours on an otherwise unoccupied system!
I suspect a great deal of memory thrashing is going on, given the size of
the file relative to the total memory on the machine.
I also suspected the 'lock' statement might be inefficient, so I removed
it: I can take exclusive access to the system for this purpose, and can
guarantee there will be no other access to my data structures or
dictionary while this process is executing. But it's still taking an
unacceptably long time to run.
Any ideas on improving the run time? I've got around 900,000 records to
process, and this is only one of nine steps to complete. I can cut the
individual process 'chunks' to around 50,000 records each. Or would
converting the CM KSAM table to NM KSAM help? Or is there a 'lock'
statement that would be significantly more efficient, given the
exclusive-use circumstances?
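(For concreteness, the chunking I have in mind is just fixed-size batching of the input — a generic sketch in Python, not QTP, with the record list standing in for the subfile:)

```python
def chunks(records, size=50_000):
    """Yield successive fixed-size batches of records, so each QTP run
    touches only one batch instead of the whole 900,000-record file."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

# 900,000 dummy records split into 50,000-record batches:
batches = list(chunks(list(range(900_000)), 50_000))
print(len(batches))  # 18 batches
```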
Where's my supercomputer?
Cheers,
Ron Burnett
ron@cryptic.rch.unimelb.edu.au