We are migrating a legacy client/server application (written in PowerBuilder) to the Pega platform (currently version 8.1.1). Our concern is the possibility of needing to handle very large data pages. "Very large" would mean upwards of 70,000 records, and it would not be atypical to return 5,000-10,000 records fairly routinely. In addition, the user community for this application is close to 1,000 users (only about half of them concurrent).
While we would use paging (whether normal or progressive) to present only a limited number of records in the end-user GUI, the data page fetch (via a Connect REST service) would likely pull all of the data into the clipboard. This approach is memory intensive.
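For context, what we would like instead is something closer to server-side paging, where each request pulls only one page of records rather than the full result set. The sketch below illustrates the pattern we have in mind; the `fetch_page` function is a hypothetical stand-in for a paged REST call (the real service does not currently expose offset/limit parameters), and the 70,000-record backend is simulated:

```python
def fetch_page(offset, limit):
    """Hypothetical stand-in for a paged Connect REST call.
    Simulates a backend holding 70,000 records."""
    TOTAL = 70_000
    end = min(offset + limit, TOTAL)
    return [{"id": i} for i in range(offset, end)]

def iter_records(page_size=500):
    """Lazily yield records one page at a time, so at most one
    page is held in memory instead of the full 70,000 records."""
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        if not page:
            break
        yield from page
        offset += page_size

# Only the current page (here, 500 records) is materialised at
# any moment, rather than the entire result set.
records = iter_records(page_size=500)
first = next(records)
```

This is roughly the behavior we would want the data page to have, rather than loading everything into the clipboard up front.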
My questions are:
1. Is there a prescribed pattern that Pega recommends for handling and presenting large data pages to the end user?
2. Are there alternative Pega design strategies that have worked for others when dealing with large data sets?
The only technical constraint is that all data interactions are via Connect REST services. We currently use no Report Definitions because this data is not stored in the Pega data tables.