No subject
Mike Palandri
palandri@4j.lane.edu
Tue, 20 Apr 1999 15:26:52 -0700
At 03:56 PM 4/20/99 -0600, you wrote:
>Hello,
>
>I'm trying to use Qtp to delete duplicate records in a detail dataset
>(HP3000 TurboImage, MPE 5.5, PH 7.29c5). The data looks like this:
>
>id    exam-type  score-type  test-date  %score  raw-score  pass-fail
>1097  ENGL       1           04/15/99   77      77         P
>1097  ENGL       1           04/15/99   77      77         P
>1097  ENGL       1           04/15/99   77      77         P
>1097  ENGL       2           04/15/99   61      61         P
>1097  ENGL       2           04/15/99   61      61         P
>1097  ENGL       2           04/15/99   61      61         P
>
>etc. There should be only one record for each ID, Exam-Type,
>Score-type, Test-Date, Score, Raw-Score, and Pass-fail combination, but
>as it now stands, there are 8 or more.
>
>Here's the QTP we tried to use to delete all but the first record.
>
>acc TEST-SCORES
>sort on ID on EXAM-TYPE on SCORE-TYPE on DATE-TEST on SCORE &
> on RAW-SCORE on PASS-FAIL
>set process lim 80000
>set lock file update
>temp KNT
> item KNT count reset at PASS-FAIL
>output TEST-SCORES del if KNT > 1
>subfile tempsub include ID, KNT, EXAM-TYPE, SCORE-TYPE, DATE-TEST, &
> SCORE, RAW-SCORE, PASS-FAIL
>go
>
>What happened was that _all_ of the records in the TEST-SCORES dataset
>were deleted. Afterward, the tempsub subfile looks like this:
>
>id    knt  exam-type  score-type  test-date  %score  raw-score  pass-fail
>1097  1    ENGL       1           04/15/99   77      77         P
>1097  2    ENGL       1           04/15/99   77      77         P
>1097  3    ENGL       1           04/15/99   77      77         P
>1097  1    ENGL       2           04/15/99   61      61         P
>1097  2    ENGL       2           04/15/99   61      61         P
>1097  3    ENGL       2           04/15/99   61      61         P
>
>The count field looks like it's incrementing properly, so we thought
>that records 2 and 3 from each control group would be deleted, leaving
>us with record 1 in the dataset.
>
>Why did Qtp delete all the records in the dataset? Am I missing
>something obvious?
>
>Ponderingly yours...
>----------------------------------------------------------------------
> John MacLerran
> IT Systems Analyst email: macljohn@isu.edu
> Idaho State University V(208) 236-2872
> http://www.isu.edu/~macljohn F(208) 236-3673
>----------------------------------------------------------------------
>
Hi,

This is sort of the brute-force approach, but it should work for you. The
first request saves one record from each group of duplicates to a subfile
and deletes all records in TEST-SCORES; the second request loads the
contents of the subfile back into TEST-SCORES.
run no-dupes

request no-dupes-unload
  acc TEST-SCORES
  sort on ID on EXAM-TYPE on SCORE-TYPE on DATE-TEST on SCORE &
       on RAW-SCORE on PASS-FAIL
  subfile tempsub at PASS-FAIL keep include TEST-SCORES
  output TEST-SCORES delete

request no-dupes-load
  access *tempsub
  output TEST-SCORES add
  set process lim 80000
  set lock file update

build
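
In case it helps to see the logic spelled out, here is a rough Python sketch
of what the two requests amount to: keep the first record of each sorted
group and drop the rest. It is purely illustrative; the real work happens in
QTP against TurboImage, and the field names below are just stand-ins for the
TEST-SCORES items.

# Illustrative only -- the actual job is the QTP run above.
def dedupe(records, key_fields):
    """Keep the first record of each key combination, drop the rest."""
    seen = set()
    kept = []
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        if key not in seen:        # first occurrence in the group: keep it
            seen.add(key)
            kept.append(rec)
    return kept

# Three identical rows, as in the sample data above.
sample = [{"id": 1097, "exam_type": "ENGL", "score_type": 1,
           "test_date": "04/15/99", "score": 77, "raw_score": 77,
           "pass_fail": "P"}] * 3

print(dedupe(sample, ("id", "exam_type", "score_type", "test_date",
                      "score", "raw_score", "pass_fail")))
# prints a list containing a single record
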
Mike