Question
Acel Solutions Pvt Ltd
IN
Last activity: 5 Jun 2023 1:10 EDT
Bulk commit when reading and processing large files
Hi,

I have a requirement to read a large file, process it, and update multiple tables. How can we avoid a separate commit for each row?

At the end of every month we receive a large CSV file with around 5 million records. I am using a Data Flow: inside it, a Data Set reads the records, then we apply business rules and update multiple tables (almost 5 different tables). We have written an activity after the Data Set (which reads the file), and in that activity I do an Obj-Save and Commit per record.

Is it advisable to commit like this, or what other options do we have? Any suggestions would be appreciated.
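To make the question concrete: the alternative to committing once per row is committing once per batch of N rows. This isn't Pega activity code, just a minimal sketch of the batching idea using Python's sqlite3 module; the table name `records` and the `batch_size` value are assumptions for illustration only.

```python
import sqlite3

def load_rows(conn, rows, batch_size=1000):
    """Insert rows, committing once per batch instead of once per row."""
    cur = conn.cursor()
    pending = 0
    for row in rows:
        cur.execute("INSERT INTO records (id, value) VALUES (?, ?)", row)
        pending += 1
        if pending >= batch_size:
            conn.commit()  # one commit per batch_size rows, not per row
            pending = 0
    if pending:
        conn.commit()      # flush the final partial batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER, value TEXT)")
load_rows(conn, [(i, f"row-{i}") for i in range(2500)], batch_size=1000)
print(conn.execute("SELECT COUNT(*) FROM records").fetchone()[0])  # 2500
```

With 2,500 rows and a batch size of 1,000, this issues three commits (1000, 1000, 500) instead of 2,500, which is the kind of reduction in transaction overhead the question is after.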