Question
PayPal
IN
Last activity: 30 Nov 2022 12:04 EST
Reading BLOB data from pzPVStream column with Java
Hi Team,
We have a requirement where we need to read the pzPVStream column (in one of our tables) from a Java application.
As the column is compressed and encrypted by Pega, is there any detailed documentation or suggestion on how this can be done, i.e., how we can decompress and read the BLOB data from the pzPVStream column?
At the DB level, we can use PR_READ_FROM_STREAM, but it fetches only one particular property from inside the BLOB. We want to read the entire BLOB at once for a record.
Can you please help with this? Thanks!
***Edited by Moderator Marije to add Capability tags***
NCS Pte. Ltd
SG
Hi @ashwinkumars: I see two things in the query. The title says to read stream data using Java. If you are referring to Java code within a Pega application, you can make use of the Database interface and its open function to open a work object.
Sample: tools.getDatabase().open(<Page>, false)
The details say a Java application. If you are referring to an external Java application:
- You can make use of the pr_read_from_stream function if you are using a SQL query to fetch data (consider the performance impact; alternatively, you could implement a reporting table/database)
- Try using an API to get the data, which will be more flexible
I don't see an option to read the entire BLOB from an external Java application. As mentioned earlier, an index / reporting table / API is more recommended.
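To illustrate the pr_read_from_stream route from an external JDBC client, here is a minimal sketch. The table name, property reference, and key below are hypothetical, the UDF must actually be installed in your Pega database, and the exact argument order can vary by database platform, so treat this as an assumption to verify against your environment, not a definitive implementation:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class StreamReader {

    // Builds a query using the Pega pr_read_from_stream UDF.
    // Note: the property reference is concatenated here only for brevity;
    // in real code, validate it or bind it as a parameter as well.
    static String buildQuery(String table, String propertyRef) {
        return "SELECT pr_read_from_stream('" + propertyRef + "', pzInsKey, pzPVStream) "
             + "AS propValue FROM " + table + " WHERE pzInsKey = ?";
    }

    // Reads a single property out of the BLOB for one record.
    static String readProperty(Connection conn, String table,
                               String propertyRef, String insKey) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(buildQuery(table, propertyRef))) {
            ps.setString(1, insKey);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString("propValue") : null;
            }
        }
    }
}
```

Note this only extracts one property per call, which is exactly the limitation described in the question; it does not return the whole BLOB in decompressed form.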
Thanks.
Updated: 28 Nov 2022 7:02 EST
PayPal
IN
@ArulDevan Thanks Arul. To clarify, I was referring to an external Java application.
From the external application, I need to read the pzPVStream column of a table. Normally we would be able to read a BLOB column from a Java application, but since the data is compressed before being stored in this column, we need more details on how we can decompress and read it.
Any help on this is much appreciated. Thanks!
Agriculture Victoria
AU
@ashwinkumars A better solution would be to use Pega indexes to keep the data extracted from the BLOB in index table(s), or, if you need a complete flat RDBMS structure for the BLOB data, you may want to consider Pega BIX. Another solution would be to expose a Pega service that provides the required data on demand via an API, and to use that API in your Java application.
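On the Java side, the service-API route could look like the sketch below. The endpoint path and response shape are purely illustrative assumptions; the actual URL depends on the service you expose in Pega:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CaseApiClient {

    // Hypothetical endpoint layout; replace with your real Pega service URL.
    static HttpRequest buildRequest(String baseUrl, String caseId) {
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/cases/" + caseId))
                .header("Accept", "application/json")
                .GET()
                .build();
    }

    // Fetches the case data as a JSON string from the exposed service.
    static String fetchCase(HttpClient client, String baseUrl, String caseId)
            throws Exception {
        HttpResponse<String> resp = client.send(buildRequest(baseUrl, caseId),
                HttpResponse.BodyHandlers.ofString());
        return resp.body();
    }
}
```

This keeps the decompression inside Pega, where it belongs, and hands your application plain JSON.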
PayPal
IN
@PraveenPuri_old Thanks for the suggestion, Praveen. But per our use case, we need to read the complete BLOB data, and we will not be able to integrate with a Pega service. Hence we are trying to read the data from the database as a last option.
Is there a way to decompress the data and fetch it in a readable format in the Java application from our database?
Updated: 30 Nov 2022 3:15 EST
JPMC
IN
@ashwinkumars Since (as stated) consuming the data via a REST service is not an option, the other option (with a lot of overhead) I can think of is to create a separate table/column and write an activity (assuming there are very few cases) that browses through the cases, opens each case onto a page, converts the page into XML or JSON (depending on your use case), and stores the result in the newly created column or table.
Masking Technology
NL
Depending on your exact requirements, you could use BIX to extract the full contents of a work object and export it as XML. Then you can read the XML contents using a Java application (this still requires a file transfer to work).
But I don't know whether you just want to query items in the BLOB 'at random', or whether you want all the information available at some point in time.
Trying to inflate the BLOB yourself won't work unless you have the exact code Pega uses to do it internally.
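For contrast, this is what a standard zlib round trip looks like with java.util.zip. The point of the sketch is what it does NOT solve: the pzPVStream bytes are in Pega's own proprietary format, so feeding them straight into a standard Inflater like this will not recover the clipboard data.

```java
import java.util.Arrays;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class ZlibRoundTrip {

    // Compresses input with plain zlib (Deflater).
    static byte[] deflate(byte[] input) {
        Deflater d = new Deflater();
        d.setInput(input);
        d.finish();
        byte[] buf = new byte[input.length * 2 + 64];
        int n = d.deflate(buf);
        d.end();
        return Arrays.copyOf(buf, n);
    }

    // Decompresses plain zlib data; this is where a Pega BLOB would fail,
    // because its bytes are not a standard zlib stream.
    static byte[] inflate(byte[] compressed, int originalLength) {
        try {
            Inflater inf = new Inflater();
            inf.setInput(compressed);
            byte[] out = new byte[originalLength];
            int n = inf.inflate(out);
            inf.end();
            return Arrays.copyOf(out, n);
        } catch (DataFormatException e) {
            throw new RuntimeException("Not a standard zlib stream", e);
        }
    }
}
```

Running inflate() on raw pzPVStream bytes would typically end in that DataFormatException branch, which is why the supported extraction paths (BIX, pr_read_from_stream, or a service API) are recommended instead.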
PayPal
IN
Thanks.
Yes, we want to query the data at random, so using BIX to extract all the cases might not work, as the number of cases is high and the BLOB contents are also large.
PayPal
IN
Can anyone here, or from Pega, help with this? Would it be possible to learn how we can decompress the BLOB data and read it from our on-premise tables? Thanks.