Performance issue when processing a large number of records in a PageList
Hi, I am processing almost 3000 records via an Excel upload. In post-processing I have to perform a validation (I have to make sure the combination of 3 properties is unique for each record), for which I am using the IsInPageListWhen function. The problem is that with this many records I hit a performance issue: I loop over all 3000 records, and for each row I validate its data again against the entire PageList, which is effectively a quadratic number of comparisons.
You can use the pyRemoveDuplicatesFromPagelist function on this page list. Once the duplicates have been removed, you can determine whether all entries were unique by comparing the page list count before and after.
This function takes the following parameters:
1.) PageListName - name of the source page list
2.) DuplicateCheckProperties - the names of the properties, separated by commas (e.g. if an entry in the page list is considered a duplicate based on two properties, say Prop1 and Prop2, the parameter value will be "Prop1,Prop2"). For a single property (say Prop1), the value is just Prop1.
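The idea behind the function can be sketched outside Pega as well. Below is a minimal plain-Python illustration (not the actual Pega implementation): records are modelled as dicts, the property names are hypothetical, and only the first occurrence of each key combination is kept, so comparing the counts before and after tells you whether duplicates existed.

```python
def remove_duplicates(page_list, duplicate_check_properties):
    """Keep only the first occurrence of each property-combination key."""
    seen = set()
    deduped = []
    for page in page_list:
        # Build the key from the configured properties, like "Prop1,Prop2".
        key = tuple(page.get(p) for p in duplicate_check_properties)
        if key not in seen:
            seen.add(key)
            deduped.append(page)
    return deduped

# Hypothetical property names, for illustration only.
records = [
    {"Region": "EU", "Product": "A", "Date": "2020-04-01"},
    {"Region": "EU", "Product": "A", "Date": "2020-04-01"},  # duplicate
    {"Region": "US", "Product": "A", "Date": "2020-04-01"},
]
unique = remove_duplicates(records, ["Region", "Product", "Date"])
print(len(records), len(unique))  # → 3 2  (counts differ, so duplicates existed)
```

Because the seen-keys lookup is a set, this is a single pass over the list rather than a scan of the whole list per row.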
Posted: 15 Apr 2020 21:27 EDT
Abhinav Chaudhary (Abhinav_Chaudhary)
Senior Pega Developer
Thanks @DevangS for your valuable input. I did think of this approach, but in my use case the business doesn't want me to remove the duplicate records; they want to display an error message with the record details and record index.
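Since the records must be kept and reported rather than removed, the per-row IsInPageListWhen scan can be replaced by a single pass that remembers which key combinations were already seen and at which index. A minimal sketch in plain Python, with hypothetical property names (in Pega this would typically live in a utility function or activity over the clipboard page list):

```python
def find_duplicates(records, key_properties):
    """Return (index, first_index, record) for every record whose
    property combination already appeared earlier in the list."""
    first_seen = {}   # key tuple -> index of its first occurrence
    duplicates = []
    for index, record in enumerate(records):
        key = tuple(record.get(p) for p in key_properties)
        if key in first_seen:
            duplicates.append((index, first_seen[key], record))
        else:
            first_seen[key] = index
    return duplicates

# Hypothetical property names, for illustration only.
records = [
    {"Region": "EU", "Product": "A", "Date": "2020-04-01"},
    {"Region": "US", "Product": "B", "Date": "2020-04-02"},
    {"Region": "EU", "Product": "A", "Date": "2020-04-01"},  # duplicate of row 0
]
for index, first_index, record in find_duplicates(records, ["Region", "Product", "Date"]):
    print(f"Row {index} duplicates row {first_index}: {record}")
```

This visits each record once and does a constant-time dictionary lookup per row, so validating ~3000 rows stays linear, and the returned tuples carry both the record details and the indexes needed for the error message.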