Question
amazon.com
US
Last activity: 21 Jun 2016 17:09 EDT
Is the Pega database Java API using Hibernate under the hood?
We have some custom code that reads from a spreadsheet and inserts rows in batches of 1,000.
Performance gets progressively worse the more batches we insert.
When we invoke the code again (say, to upload a second file), performance returns to normal for the first few batches and then slowly degrades just as before.
We have heard that this kind of behavior can be related to the way Hibernate manages memory...
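One cheap way to confirm the degradation pattern before blaming any layer is to time each batch and watch for a steady upward trend. A minimal sketch (not the actual upload code; `insertBatch` is a hypothetical callback standing in for whatever does the real insert):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.IntConsumer;

public class BatchTimer {
    // Runs insertBatch once per batch and returns the per-batch elapsed
    // times in nanoseconds, so a progressive slowdown is easy to spot.
    public static List<Long> timeBatches(int totalRows, int batchSize, IntConsumer insertBatch) {
        List<Long> timings = new ArrayList<>();
        for (int start = 0; start < totalRows; start += batchSize) {
            long t0 = System.nanoTime();
            insertBatch.accept(start); // hypothetical callback doing the real insert
            timings.add(System.nanoTime() - t0);
        }
        return timings;
    }

    public static void main(String[] args) {
        // 64k rows in batches of 1k, as in the scenario above => 64 timings
        List<Long> t = timeBatches(64_000, 1_000, start -> {});
        System.out.println(t.size()); // 64
    }
}
```

If the timings grow roughly linearly with the batch index within one run but reset on the next invocation, something is accumulating per-run state (a cache, a session, an uncleared collection) rather than the database itself slowing down.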
Accepted Solution
amazon.com
US
We posted our code, and as a benchmark we wrote some pure JDBC code with no Hibernate at all. The results were stunning:
the pure JDBC code took about 4 seconds to insert 64k rows in batches of 1k,
while our Pega database API code takes about 18-20 minutes to insert the same 64k rows using save-deferred.
Pure JDBC (no DAO/HIBERNATE):
logger.error("start get case");
int rowCount = 1;
try (ResultSet rs = ps.executeQuery()) {
    // Convert the result set into the creaturePOJO list object...
    // Commented out for TEST, do not commit
    allLines = convertRSItems(rs);
    logger.error("end get case");
    logger.error("start insert");
    PreparedStatement insertPreparedStatement = conn.prepareStatement(INSERTCREATURE);
    while (rs.next()) {
        rowCount++;
        for (int i = 1; i <= columnCount; i++) { // loop bound and body truncated in the original post
        }
    }
}
amazon.com
US
It seems Stack Overflow recommends calling session.flush() and session.clear() after every commit when using Hibernate, but those methods aren't directly exposed by the Pega database Java API.
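For reference, the flush/clear cadence that Stack Overflow recommends looks like the sketch below. Since the Pega API does not expose the Hibernate session, `BatchSession` here is a stand-in interface with the two relevant methods of `org.hibernate.Session`, not a real Pega or Hibernate type. The point is that flushing and clearing every N saves keeps the first-level cache from growing with the total row count, which is the classic cause of a progressive per-batch slowdown:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for the Hibernate Session methods relevant here; assumed,
// not an API the Pega platform actually exposes.
interface BatchSession {
    void save(Object entity);
    void flush();  // push pending inserts to the database
    void clear();  // evict persisted entities from the first-level cache
}

public class BatchInsertPattern {
    // Saves every row, flushing and clearing once per batchSize rows
    // plus once for any trailing partial batch; returns the flush count.
    public static int saveAll(BatchSession session, List<?> rows, int batchSize) {
        int flushes = 0;
        for (int i = 0; i < rows.size(); i++) {
            session.save(rows.get(i));
            if ((i + 1) % batchSize == 0) {
                session.flush();
                session.clear();
                flushes++;
            }
        }
        session.flush(); // trailing partial batch
        session.clear();
        return flushes + 1;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 2500; i++) rows.add(i);
        BatchSession noop = new BatchSession() {
            public void save(Object e) {}
            public void flush() {}
            public void clear() {}
        };
        System.out.println(saveAll(noop, rows, 1000)); // 2 full batches + trailing flush = 3
    }
}
```

Whether anything equivalent can be triggered through the Pega API is exactly the open question in this thread.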
Pegasystems Inc.
US
I do not believe Pega uses Hibernate at all. Can you share your custom code? Which database Java APIs are you using?
amazon.com
US
Thanks for confirming that it is *NOT* using Hibernate under the hood. Our next approach will be to tune the underlying database, since we think we have eliminated the code as the root cause: running the same code against a different database instance did not reproduce the issue.
amazon.com
US
We tried tuning the DB and could not get any relief from that, so here is the custom insert code:
int datatRowStartNum = tools.getParamAsInteger(ImmutablePropertyInfo.MODE_JAVAPROPERTY, "datatRowStartNum");
oLog.infoForced("Parse Activity Started");
int headerRowNum = tools.getParamAsInteger(ImmutablePropertyInfo.MODE_JAVAPROPERTY, "headerRowNum");
//int dbThreashold = tools.getParamAsInteger(ImmutablePropertyInfo.MODE_JAVAPROPERTY, "dbThreashold");
int dbThreashold = 2000;
String xlFileName = tools.getParamValue("XLFileName");
String pageClassName = tools.getParamValue("pageClassName");
String pageName = tools.getParamValue("pageName");
//tools.getPrimaryPage().addMessage("File Name: "+xlFileName+"\npageClassName: "+pageClassName+"\npageName: "+pageName);
String caseID = tools.getParamValue("CaseID");
boolean removeItemsFromDB = tools.getParamAsBoolean(ImmutablePropertyInfo.MODE_JAVAPROPERTY, "removeOldData");
if (caseID == null || caseID.trim().length() == 0) {
    tools.getPrimaryPage().addMessage("Could not retrieve CASE Id. Contact Support by raising TT.");
    return false;
}
String[] property2ColumnMap = {"VendorName","Pubcode","VendorNum","VendorSiteCode","InvoiceNum","InvoiceDate","Description","InvoiceType",
    "InvoiceAmount","CurrCode","Paygroup","TermsName","ImageNumber","ReturnID","LineType","LineAmount","TaxCode","TaxRecoveryRate",
    "GLString","LineDesc","PONumber","POLineNumber","POShipNumber","PODistNumber","POReleaseNumber","QtyInvoiced","NetUnitPrice",
    "GLDate","PaymentMethod","DFFXCHGExpenseAccount"};
//String[] mandatoryValues = {"VendorNum","VendorSiteCode","InvoiceNum","InvoiceDate","Description","InvoiceType","InvoiceAmount","LineType","LineAmount"};
// Construct map of mandatory values
java.util.Map mandatoryColumnSet = new java.util.HashMap();
mandatoryColumnSet.put("VendorNum", "Vendor Num");
mandatoryColumnSet.put("VendorSiteCode", "Vendor Site Code");
mandatoryColumnSet.put("InvoiceNum", "Invoice Num");
mandatoryColumnSet.put("InvoiceDate", "Invoice Date");
mandatoryColumnSet.put("Description", "Description");
mandatoryColumnSet.put("InvoiceType", "Invoice Type");
mandatoryColumnSet.put("InvoiceAmount", "Invoice Amount");
mandatoryColumnSet.put("LineType", "Line Type");
mandatoryColumnSet.put("LineAmount", "Line Amount");
mandatoryColumnSet.put("GLString", "GL Account");
mandatoryColumnSet.put("CurrCode", "Currency Code");
// mandatoryColumnSet.put("TermsName", "Terms Name");
com.pega.apache.poi.ss.usermodel.Workbook excelWorkbook = null;
Database dbConnection = null;
oLog.infoForced("Database Cleanup started");
try {
    dbConnection = tools.getDatabase();
    if (removeItemsFromDB) {
        ClipboardPage clipboardPage = tools.createPage(pageClassName, pageName + "ToDelete");
        dbConnection.executeRDB("DELETE FROM {class:" + pageClassName + "} where CASE_ID='" + caseID + "'", clipboardPage);
        clipboardPage.removeFromClipboard();
    }
    excelWorkbook = amznaprvlfw_spreadsheetlibrary.GetWorkBookWithFileRestriction(xlFileName, 10);
    if (excelWorkbook == null) {
        return false;
    }
    oLog.infoForced("Database Cleanup Ended");
    oLog.infoForced("Parse Excel Started");
    com.pega.apache.poi.ss.usermodel.Sheet ofaSpreadSheet = excelWorkbook.getSheetAt(0);
    if (ofaSpreadSheet == null) {
        tools.getPrimaryPage().addMessage("Could not find file: " + xlFileName + ", Please upload again");
        return false;
    }
    int lastRow = ofaSpreadSheet.getLastRowNum();
    //tools.getPrimaryPage().addMessage("Line Items not found in the spreadsheet "+lastRow+" datarowstart "+datatRowStartNum+" physical "+ofaSpreadSheet.getPhysicalNumberOfRows());
    /*
    if (lastRow < datatRowStartNum) {
        tools.getPrimaryPage().addMessage("Line Items not found in the spreadsheet "+lastRow+" datarowstart "+datatRowStartNum+" physical "+ofaSpreadSheet.getPhysicalNumberOfRows());
        return false;
    }
    */
    // for (int ctr = datatRowStartNum - 1; ctr ... (loop condition and body truncated in the original post)
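The core of the parse step above is positional: cell N of each spreadsheet row is written to property N of `property2ColumnMap`. That mapping can be sketched in isolation (a minimal illustration, not the actual activity code; `mapRow` is a hypothetical helper):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RowMapper {
    // Maps one spreadsheet row (cells in column order) onto the property
    // names from a column map like property2ColumnMap above; rows shorter
    // than the map simply omit the trailing properties.
    public static Map<String, String> mapRow(String[] propertyNames, String[] cells) {
        Map<String, String> row = new LinkedHashMap<>();
        for (int i = 0; i < propertyNames.length && i < cells.length; i++) {
            row.put(propertyNames[i], cells[i]);
        }
        return row;
    }

    public static void main(String[] args) {
        String[] props = {"VendorName", "Pubcode", "VendorNum"};
        Map<String, String> row = mapRow(props, new String[]{"Acme", "P1", "42"});
        System.out.println(row.get("VendorNum")); // 42
    }
}
```

Note that none of this mapping work explains the slowdown; per-row parsing cost is constant, which again points at whatever accumulates across the deferred saves.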