QUICK procedure problem.
pickerij@norbord.com
Tue, 23 Feb 1999 17:32:47 -0500
Yikes!! (technical term expressing shock and dismay :-)
The thought of reading sequentially through 100,000 records with Quick
makes me gasp. Sounds like something one of those big expensive
proprietary RDBMSs would do behind your back :-)
But then I look at your code and you seem to be doing the "while
retrieving" using a key (MYKEY). The next question is "how many records
have the same value for THISKEY?" If the answer is "lots" (as defined by
your patience and machine size :-) then this task really should be taken
offline and given to QTP, PowerHouse's batch processor.
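A bulk "set this flag on every record on this chain" job is exactly what
QTP is for. Something along these lines would do it; I'm writing this from
memory, so treat the statement spellings as a sketch to check against the
QTP reference for your version. I'm assuming MYKEY is also the name of the
search item, and "THISKEY-VALUE" is just a placeholder for the key value
your screen currently feeds into THISKEY:

access MYFILE
select if MYKEY of MYFILE = "THISKEY-VALUE"
output MYFILE update
item NONKEYFIELD final "XYZ"
go

Because the update rewrites each record in place, there is no delete/put
pair and no chance of the run chasing its own tail.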
The next problem is your apparent loss of chain info. Just how is Quick
supposed to know what to do with MYFILEALIASED? You should read the
record in this file via a *unique* key, using the value from the designer
file. And the designer file really should have a select that keeps only
the records whose NONKEYFIELD is not yet "XYZ", so the records you have
just rewritten are discarded and you don't keep reading forever.
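In the screen's file section that would look roughly like the lines below.
I'm quoting the select and access clauses from memory, so check their
exact spelling and placement against the QUICK reference for 7.09, and
UNIQUEKEY here just stands in for whatever item uniquely identifies a
record in MYFILE:

file MYFILE designer open 1
select if NONKEYFIELD of MYFILE <> "XYZ"
file MYFILE alias MYFILEALIASED open 2 &
    access via UNIQUEKEY using UNIQUEKEY of MYFILE

That way the alias is positioned on exactly one record by a value Quick
already has in hand, instead of Quick being left to guess which chain you
meant.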
Although you indicate that you are updating non-key data, the only reason
to use two files with a put and a delete is because you are actually
updating key information, or information on which keys depend, such as
sort fields or even extended sort fields (back to the IMAGE manual with
you!). If it were truly an update on non-key data then one file would be
sufficient.
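In other words, the whole procedure collapses to something like this (a
sketch reusing your own constructs, and assuming that a put on a record
you have just retrieved in the loop rewrites it in place rather than
adding a new one; with only non-key items changing, IMAGE has no reason
to move the record):

file MYFILE designer open 1
...
procedure internal UPDATEMYFILE
begin
    while retrieving MYFILE viaindex MYKEY using THISKEY
    begin
        let NONKEYFIELD of MYFILE = "XYZ"
        put MYFILE
    end
end

Combine that with the select above and the loop can no longer trip over
its own output.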
Sigh, so many to teach, so little time,
John Pickering
Toronto
-----Original Message-----
From: shuckvale@cosworth-racing.co.u
Sent: Tuesday, February 23, 1999 4:37 PM
To: powerh-l@lists.swau.edu
Subject: QUICK procedure problem.
----------------------------------
All,
Can anyone help me with a piece of QUICK screen code (current version
7.09E on MPE/iX) which is causing me no end of problems? Currently I have
several screens running a 'while retrieving' sequential construct to
perform updates on selected records from an IMAGE dataset with over
100,000 records.
These screens were fine in the early days when the dataset in question
was smaller, but now they're a major drain on CPU time which I need to
plug. My current (non-)solution is the code below;
********************************************************
file MYFILE designer open 1
file MYFILE alias MYFILEALIASED open 2
...
procedure internal UPDATEMYFILE
begin
    while retrieving MYFILE viaindex MYKEY using THISKEY
    begin
        let NONKEYFIELD of MYFILEALIASED = "XYZ"
        put MYFILEALIASED reset
        delete MYFILE
        put MYFILE
    end
end
********************************************************
This seems to work fine until the 'while retrieving' loop hits the
records added via the alias and (I think) loops infinitely. Without the
alias I only get to update one record: as soon as I PUT to the dataset,
the record pointers get lost and the 'while retrieving' construct falls
through.
My apologies in advance if this is elementary stuff; I'm afraid it's
inherited code, and whilst I've done the 'Intro to PowerHouse' course I'm
still waiting to get on 'Advanced Part 1', which I believe covers QUICK
quite comprehensively. In the meantime I'd be most grateful if somebody
could point me in the right direction towards a solution.
Steve Huckvale
Cosworth Racing Ltd.