Question


prathyusha engineering college
IN
Last activity: 31 Jan 2025 7:17 EST
Google GCP bigquery Integration with pega
Hi, has anyone done a Google GCP BigQuery integration with Pega?
Accepted Solution
Updated: 25 Mar 2024 7:41 EDT


Pegasystems Inc.
GB
@ShanthoshKumarkK We do not offer any additional integration from the cloud beyond what the platform provides.
@rygug as SME, please provide confirmation.
Updated: 15 Mar 2024 10:16 EDT


Pegasystems Inc.
GB
@ShanthoshKumarkK please clarify your requirement.
Did you already find this post: Google bigquery Integration?


prathyusha engineering college
IN
@MarijeSchillern As part of a POC, I'm trying to save data from Pega to a GCP BigQuery table. I can't find any steps for connecting to a BigQuery table. Kindly share any input.
With regards,
Shanthosh k


LTIMindtree
IN
Prerequisites:

Google Cloud Project:
- Set up a Google Cloud Project in the Google Cloud Console.
- Enable the BigQuery API for your project.

Service Account:
- Create a service account in the Google Cloud Console with appropriate BigQuery permissions.
- Download the service account key as a JSON file.

BigQuery Dataset and Table:
- Create a dataset and table in BigQuery where you want to store or retrieve data from Pega.

Steps for Integration:

Create a Pega Data Page:
- In Pega, create a Data Page that will be used to interact with Google BigQuery.
- Configure the Data Page to use a Connect-SQL rule to connect to Google BigQuery.

Configure the Connect-SQL Rule:
- In the Connect-SQL rule, define the SQL statement that interacts with Google BigQuery.
- Use the service account credentials and the details of your BigQuery dataset and table.

Implement Data Transformations:
- If needed, implement data transformations in Pega to ensure the data exchanged between Pega and BigQuery is in the expected format.

Test the Integration:
- Test the integration by running the Data Page or other Pega components that use the Connect-SQL rule.
- Verify that data is correctly exchanged between Pega and Google BigQuery.

Handle Authentication:
- Ensure that the service account key is securely stored and managed in Pega.
- Implement proper authentication mechanisms, such as securely managing and rotating credentials.

Hope this helps. Create the Google account, enable the API, and consume it in Pega. Have fun :).
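To make the Connect-SQL step above more concrete: the rule ultimately sends an INSERT statement to BigQuery. A minimal sketch of what building such a statement could look like, where the project, dataset, table, and column names are all invented for the example (they are not from Pega or the original post):

```python
# Sketch: render the kind of single-row INSERT DML a Connect-SQL rule
# would issue against BigQuery. All identifiers below are hypothetical.

def format_value(value):
    """Quote strings (escaping single quotes); pass numbers through as literals."""
    if isinstance(value, str):
        escaped = value.replace("'", "\\'")
        return f"'{escaped}'"
    return str(value)

def build_insert_sql(project, dataset, table, row):
    """Render a BigQuery standard-SQL INSERT for one row (dict of column -> value)."""
    columns = ", ".join(row.keys())
    values = ", ".join(format_value(v) for v in row.values())
    return f"INSERT INTO `{project}.{dataset}.{table}` ({columns}) VALUES ({values})"

sql = build_insert_sql(
    "my-gcp-project", "pega_poc", "cases",
    {"case_id": "C-1001", "status": "Open", "amount": 250.0},
)
print(sql)
```

In a real Connect-SQL rule you would use parameterized values rather than string concatenation; the sketch only shows the shape of the DML BigQuery expects.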


prathyusha engineering college
IN
Thanks for your valuable input; it will help once the database connection is established. What we are expecting, though, is a way to establish the connection to the BigQuery table itself, such as a JDBC URL or any other simple way to connect to a Google BigQuery table from Pega.
kindly help me on this @burrs @kerrb @Vinoth
Thanks...
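One commonly used option for exactly this (not something Pega ships) is Google's Simba-based BigQuery JDBC driver, whose connection URL carries the project and service-account details. Below is a sketch of assembling such a URL, assuming service-account key authentication (OAuthType=0); the project ID, email, and key path are placeholders, and you should verify the exact property names against your driver version's installation guide:

```python
# Sketch: assemble a JDBC URL for the Simba-based Google BigQuery JDBC driver.
# Property names follow the driver's documented service-account flow (OAuthType=0);
# the project id, service-account email, and key path are placeholders.

def bigquery_jdbc_url(project_id, service_account_email, key_path):
    host = "jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443"
    props = {
        "ProjectId": project_id,
        "OAuthType": "0",  # 0 = authenticate with a service-account key file
        "OAuthServiceAcctEmail": service_account_email,
        "OAuthPvtKeyPath": key_path,
    }
    return host + ";" + ";".join(f"{k}={v}" for k, v in props.items()) + ";"

url = bigquery_jdbc_url(
    "my-gcp-project",
    "pega-poc@my-gcp-project.iam.gserviceaccount.com",
    "/opt/keys/bq-service-account.json",
)
print(url)
```

The resulting string is what you would paste into the database/JDBC configuration, with the driver's JAR files made available to the application server.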


Alamaticz
IN
@Soham_Chanda1107 Thanks a lot for the information. I also want to know how to connect AWS Glue with Pega BIX. Are the following steps correct?
1. Pega: generate data in JSON format.
2. JSON: store the JSON data in a file or send it directly to AWS.
3. AWS S3: store the JSON file in an S3 bucket.
4. AWS Glue Crawler: detect the schema and create a metadata table.
5. AWS Glue: transform and process the JSON data into a structured format.
6. AWS Redshift: load the transformed data into a Redshift cluster.
7. Database (Redshift): store the data in the Redshift database.
8. Power BI: connect to the Redshift database and create visualizations and reports.
You can reply at Proprietary information hidden
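On the JSON side of a pipeline like this, Glue crawlers generally infer schemas most cleanly from newline-delimited JSON (one flat object per line) rather than a single JSON array. A small sketch of serializing extracted records into that shape before uploading to S3; the field names are invented for the example:

```python
import json

# Sketch: serialize extracted records as newline-delimited JSON (NDJSON),
# the layout S3 + Glue crawlers typically infer a consistent schema from.
# Field names below are invented for the example.

def to_ndjson(records):
    """One JSON object per line, with sorted keys for a stable column order."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)

records = [
    {"case_id": "C-1001", "status": "Open"},
    {"case_id": "C-1002", "status": "Resolved"},
]
print(to_ndjson(records))
```

The resulting text is what step 3 of your list would place in the S3 bucket for the crawler in step 4 to scan.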