I have a requirement wherein a large file needs to be received in Pega, each line of the file parsed, and a case created per line. The file is approximately 50MB in size and is expected to contain more than 500K lines. Each line has 500 characters, with approximately 50 words to parse. This process also needs to complete within 8 hours. Is a Pega listener capable of processing such a huge volume?
If your file is really 50MB, wouldn't 100,000 lines (records) be a more accurate estimate assuming 1 char = 1 byte?
In any case, Pega can certainly handle large files and volumes. However, you will get better overall throughput if you can split the file so that multiple listeners (on multiple nodes) process the data in parallel.
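As a rough sketch of that pre-split step (done outside Pega, e.g. in a small script run before the listeners pick the files up), the function below divides a large file into N smaller files, one per listener. The function name, chunk file names, and output directory are illustrative assumptions, not Pega conventions:

```python
import os

def split_file(src_path, num_chunks, out_dir):
    """Split src_path into num_chunks files, distributing lines
    round-robin so each chunk gets a near-equal share of records.
    Each chunk can then be dropped into a separate listener's
    source directory for parallel processing."""
    os.makedirs(out_dir, exist_ok=True)
    # Hypothetical chunk naming scheme; adjust to match the file
    # masks configured on your listeners.
    outs = [open(os.path.join(out_dir, "chunk_%d.txt" % i), "w")
            for i in range(num_chunks)]
    try:
        with open(src_path) as src:
            for i, line in enumerate(src):
                outs[i % num_chunks].write(line)
    finally:
        for f in outs:
            f.close()
    return [f.name for f in outs]
```

Round-robin distribution keeps chunk sizes balanced even if record lengths vary; if record ordering matters within a chunk, contiguous splitting would be the alternative.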
Posted: 14 Dec 2015 9:26 EST
Raviteja Annam (RavitejaA)