
Pega GenAI™

Frequently Asked Questions

General

Which products support Pega GenAI?
For a list of Pega GenAI supported products, see Pega GenAI in Pega Cloud documentation.
Who can use Pega GenAI, and how do I enable it?
Pega GenAI capabilities are available for Pega Cloud clients with active subscriptions and for clients that connect to Pega Cloud for Pega GenAI services. To learn how to enable Pega GenAI, see Enabling Pega GenAI in Pega Cloud documentation.
Can I customize Pega GenAI Rules?
Yes, you can use customized Pega GenAI Rules. To learn how to customize Pega GenAI Rules, see Creating a Connect Generative AI Rule in the Pega Platform documentation.
How does Blueprint use generative AI?
Generative AI is used to provide suggested templates throughout the Blueprint experience to give you a starting point for your design and to unlock new ideas for how to approach workflow problems. For example, generative AI can suggest the following items:
  • Case types based on application description
  • Case lifecycles based on case type and application descriptions
  • Case data models based on case type and application descriptions
  • Data objects based on application descriptions
  • Personas based on application descriptions
When objects (for example, case types, lifecycles, data, and personas) are edited, they are not re-sent to generative AI models. For more information on building an application with Blueprint, see Creating a new application from a Blueprint.
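Blueprint's own prompting is internal to Pega, but the general pattern it describes, sending an application description to an LLM and interpreting the structured response as suggested case types, can be sketched generically. The example below is a hypothetical illustration only: it assumes the openai Python SDK, an OpenAI-compatible endpoint, and a placeholder model name, and it is not Pega's implementation.

```python
# Hypothetical sketch of "suggest case types from an application description".
# Not Pega's implementation: assumes the openai Python SDK, an OpenAI-compatible
# endpoint, and a placeholder model name.
import json

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def suggest_case_types(application_description: str) -> list[dict]:
    """Ask the model for case-type suggestions and parse them as JSON."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        temperature=0.2,      # low temperature keeps suggestions predictable
        messages=[
            {
                "role": "system",
                "content": (
                    "Suggest workflow case types for the described application. "
                    "Respond only with a JSON array of objects that have "
                    "'name' and 'description' fields."
                ),
            },
            {"role": "user", "content": application_description},
        ],
    )
    return json.loads(response.choices[0].message.content)

if __name__ == "__main__":
    suggestions = suggest_case_types(
        "An insurance application for filing and tracking auto claims"
    )
    for case_type in suggestions:
        print(f"{case_type['name']}: {case_type['description']}")
```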
What other types of AI does Pega offer besides generative AI?

Artificial Intelligence (AI) at Pega is multifaceted, encompassing various categories designed to optimize workflows, enhance productivity, and transform legacy systems. Many years before generative AI transformed the way we think about what is possible with AI, Pega established a reputation as a market leader in analytical AI technologies like real-time decisioning and process AI. For more information on the different categories of AI at Pega, along with the specific Pega capabilities that use each type of AI, see Overview of Pega AI in Pega Cloud.

Data Security and Privacy

How does Pega keep client data safe when using Pega GenAI?
Pega keeps client data safe through a few different strategies, including:
  1. Enabling clients to mask sensitive data elements to mitigate the possibility that they are sent to a public large language model (LLM), as sketched below.
  2. Using public LLMs that are stateless and that do not use client data to train their models.
  3. Using industry-standard security best practices for encryption of data in transit and data at rest.
For more information about data safety practices, see Using Pega GenAI securely in Pega Cloud.
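As a loose illustration of the first strategy, the sketch below masks a few common kinds of sensitive data before they could be placed in a prompt. The patterns and placeholder tokens are hypothetical and far simpler than a production masking capability; this is not Pega's implementation.

```python
# Hypothetical sketch of masking sensitive data elements before they are
# included in an LLM prompt (illustrative only, not Pega's implementation).
import re

MASK_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def mask_sensitive(text: str) -> str:
    """Replace recognizable sensitive values with placeholder tokens."""
    for label, pattern in MASK_PATTERNS.items():
        text = pattern.sub(f"<{label}>", text)
    return text

prompt_context = "Customer jane.doe@example.com (SSN 123-45-6789) filed a claim."
print(mask_sensitive(prompt_context))
# prints: Customer <EMAIL> (SSN <SSN>) filed a claim.
```

A real masking capability would also cover names, account numbers, and other client-specific identifiers; the point here is only that masking happens before the prompt leaves the client's environment.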
Is my application shared with the large language model?
The large language models (LLMs) do not understand Pega's meta-application language, so the actual application is never shared. Depending on the design-time use case, Pega can, at the client's discretion, send prompts in support of use cases like "suggest case types," "suggest data objects," or "suggest a picklist." These prompts include context derived from your applications, and the responses returned from the LLM are interpreted by Pega and turned into the associated Pega application artifacts. Existing application workflows, or segments of these business processes, are never sent to the LLM in support of these design-time features. As covered in Pega GenAI data security in Pega Cloud, the LLM is stateless, and Pega has taken additional steps to ensure that prompts submitted by Pega are not stored for review by our LLM provider.
Is client data used to train large language models?

No, client data is never used by Pega or large language model (LLM) providers to train models.

Does Pega store prompts, completions, or responses?
No, Pega does not store the prompts and completions data generated by client applications and sent to Pega GenAI, nor does it store the responses provided by large language model (LLM) providers. For more information, see Pega GenAI data security in Pega Cloud.
How is data secured in transit?
Data in transit is secured using Transport Layer Security (TLS), and all connections are authenticated and authorized using industry-standard best practices, including time-based tokens (JSON Web Tokens). Pega GenAI resources are secured using keys and are integrated into Pega's Identity Provider to prevent unauthorized access. For more information about data security with Pega GenAI, see Pega GenAI data security in Pega Cloud.
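For readers who want to see the general pattern, the sketch below shows short-lived JSON Web Tokens sent to an HTTPS (TLS-protected) endpoint. It is a generic, hypothetical example built on the PyJWT and requests libraries, with a placeholder URL and signing key; it does not reflect Pega's actual token format or key management.

```python
# Generic sketch of short-lived JSON Web Tokens over TLS: a hypothetical
# illustration of the pattern, not Pega's actual authentication scheme.
import datetime

import jwt       # PyJWT
import requests

SECRET = "replace-with-a-real-signing-key"   # placeholder

def issue_token(subject: str, ttl_seconds: int = 300) -> str:
    """Create a signed token that expires after a short time window."""
    now = datetime.datetime.now(datetime.timezone.utc)
    claims = {
        "sub": subject,
        "iat": now,
        "exp": now + datetime.timedelta(seconds=ttl_seconds),
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")

def call_service(token: str) -> dict:
    """Call an HTTPS (TLS-protected) endpoint with the bearer token."""
    response = requests.post(
        "https://genai.example.com/v1/generate",   # placeholder URL
        headers={"Authorization": f"Bearer {token}"},
        json={"prompt": "Suggest case types for an onboarding app"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# The receiving service rejects expired or tampered tokens; for example,
# jwt.decode(token, SECRET, algorithms=["HS256"]) raises ExpiredSignatureError
# once the "exp" claim has passed.
```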
Which security methodologies does Pega GenAI follow?
Pega GenAI uses the following methodologies:
  1. Microsoft AI Red Team Methodology
  2. OpenAI Safety Best Practices
  3. Microsoft required mitigations
  4. OWASP Top 10 for Large Language Model Applications
  5. OWASP Cloud-Native Application Security Top 10
For more information about data security with Pega GenAI, see Pega GenAI data security in Pega Cloud.
Are exported Blueprints encrypted?
Yes, Blueprints are encrypted at export time by using a key known to Pega Platform, and they can only be decrypted by Pega at import time. For more information about data security with Pega GenAI, see Pega GenAI data security in Pega Cloud.
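The encrypt-at-export, decrypt-at-import flow can be illustrated with a generic symmetric-encryption sketch. The example below uses the cryptography library's Fernet recipe with a locally generated key; Pega's actual Blueprint format, cipher, and key management are not public, so treat this purely as an illustration of the pattern.

```python
# Generic illustration of encrypting an artifact at export time and decrypting
# it at import time. Not Pega's actual Blueprint format or key management.
from cryptography.fernet import Fernet

# In the real service the key is known only to the platform; here we simply
# generate one locally for the sketch.
key = Fernet.generate_key()
cipher = Fernet(key)

def export_blueprint(blueprint_json: str) -> bytes:
    """Encrypt the blueprint payload before it leaves the platform."""
    return cipher.encrypt(blueprint_json.encode("utf-8"))

def import_blueprint(encrypted_blob: bytes) -> str:
    """Decrypt the payload at import time; fails if the blob was tampered with."""
    return cipher.decrypt(encrypted_blob).decode("utf-8")

blob = export_blueprint('{"caseTypes": ["Claim", "Appeal"]}')
assert import_blueprint(blob) == '{"caseTypes": ["Claim", "Appeal"]}'
```

Fernet stands in here for any authenticated symmetric cipher; the point is only that the exported payload is unreadable and tamper-evident to anything that does not hold the key.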

Bias, Content Management, and Model Performance

How does Pega mitigate bias and hallucination?
Pega employs a number of concepts and practices to mitigate the impact of bias and hallucination, some of which are described below. Note that Pega cannot eliminate all possibility of bias and hallucination, and clients should keep this in mind when applying Pega GenAI to their own use cases.

Most client use cases of Pega generative AI include a human-in-the-loop component. Human in the loop is an effective safeguard that can mitigate the possible impact of bias or hallucination from large language models (LLMs).

Pega also tunes "temperature," a hyperparameter used in some natural language processing models, including ChatGPT, to control the level of randomness or "creativity" in the generated text. Pega sets these values per use case to deliver optimal results for clients while minimizing the randomness and creativity of the answer, which makes the response more predictable.

In addition, Pega uses Retrieval-Augmented Generation (RAG), which draws relevant information (non-parametric knowledge) from a client-provided knowledge base and includes it in the input prompt. This grounds and aligns the generative nature of large language models with a fact-based search, which helps mitigate hallucination problems. For more background on how RAG is implemented in Pega Knowledge Buddy, see How Knowledge Buddy works. For more information about data safety practices, see Using Pega GenAI securely in Pega Cloud.
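To make the temperature and RAG techniques concrete, the sketch below combines a low temperature setting with a toy retrieval step that pulls relevant passages from a small knowledge base and prepends them to the prompt. It assumes the openai Python SDK and a placeholder model name, and it is only a hypothetical illustration, not how Knowledge Buddy is built.

```python
# Hypothetical sketch of low temperature plus retrieval-augmented generation
# (RAG). A toy illustration, not the Knowledge Buddy implementation.
from openai import OpenAI

client = OpenAI()

KNOWLEDGE_BASE = [
    "Refunds are processed within 5 business days of approval.",
    "Claims above $10,000 require a second-level review.",
    "Customers can track claim status in the self-service portal.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Naive keyword retrieval standing in for a real vector search."""
    words = set(question.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE,
        key=lambda doc: len(words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def answer(question: str) -> str:
    context = "\n".join(retrieve(question))
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # placeholder model name
        temperature=0.1,       # low temperature: more predictable, less "creative"
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer using only the provided context. If the answer is "
                    "not in the context, say you do not know.\n\n"
                    f"Context:\n{context}"
                ),
            },
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(answer("How long do refunds take after approval?"))
```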
Is content filtering applied to prompts and completions?
Yes, the public large language model (LLM) supporting Pega GenAI runs all prompts and completions through content filtering: the system detects and takes action on specific categories of potentially harmful content in both input prompts and output completions. The content filtering models for hate, sexual, violence, and self-harm categories have been specifically trained and tested on the following languages: English, German, Japanese, Spanish, French, Italian, Portuguese, and Chinese. For more information about data safety practices, see Using Pega GenAI securely in Pega Cloud.
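The screening pattern, checking both the input prompt and the output completion, can be sketched generically. In the example below, classify() is a hypothetical stand-in for a provider's content-filtering models; the category names mirror those listed above, but this is not the actual filtering pipeline used by Pega GenAI.

```python
# Generic sketch of filtering both prompts and completions for harmful content.
# classify() is a hypothetical placeholder for a provider's filtering models;
# this is not the content-filtering pipeline actually used by Pega GenAI.
HARM_CATEGORIES = ("hate", "sexual", "violence", "self_harm")

def classify(text: str) -> dict[str, float]:
    """Placeholder: return a severity score per category (0.0 = harmless)."""
    return {category: 0.0 for category in HARM_CATEGORIES}

def is_blocked(text: str, threshold: float = 0.5) -> bool:
    """Block the text if any category score meets the threshold."""
    scores = classify(text)
    return any(scores[category] >= threshold for category in HARM_CATEGORIES)

def filtered_generate(prompt: str, generate) -> str:
    """Screen the prompt, call the model, then screen the completion."""
    if is_blocked(prompt):
        raise ValueError("Prompt rejected by content filtering")
    completion = generate(prompt)
    if is_blocked(completion):
        raise ValueError("Completion rejected by content filtering")
    return completion
```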
How does Pega manage model versions and drift?

Pega adopts specific model versions provided by our large language model (LLM) provider(s) and updates to later versions on a fixed schedule, for example, with the release of a new Pega Infinity version. Because deployed model versions are not actively trained, which is the leading cause of drift, this approach safeguards deployed solutions against model drift.

Data Localization and Compliance

In which regions is Pega GenAI deployed?
In all Pega GenAI deployments, the client-selected deployment region is served by one or more regions from the available global Microsoft Azure deployment regions, known as Pega GenAI Service Deployment Regions. At deployment time, Pega endeavors, where possible, to use a Pega GenAI Service Deployment Region that is comparable to the client-selected deployment region. For example, Pega pairs a client-selected deployment region within the US or EU with a comparable Pega GenAI deployment region also in the US or EU. For more information on Pega Cloud deployment regions, see Deployment regions for Pega Cloud. For clients using Pega GenAI PremBridge to connect to Pega GenAI, the client-selected deployment region used for Pega GenAI PremBridge follows the same pairing and geography model described above. For more information on Pega GenAI PremBridge, see Integrating client-managed deployments with Pega GenAI.
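The pairing logic can be pictured as a simple lookup from a client-selected deployment region to a comparable Pega GenAI service region in the same geography. The region names and mapping below are invented examples, not Pega's actual region list.

```python
# Hypothetical illustration of pairing a client-selected deployment region with
# a comparable Pega GenAI Service Deployment Region in the same geography.
# Region names and the mapping itself are made-up examples.
GEOGRAPHY_OF_REGION = {
    "us-east-1": "US",
    "us-west-2": "US",
    "eu-west-1": "EU",
    "eu-central-1": "EU",
}

GENAI_REGIONS_BY_GEOGRAPHY = {
    "US": ["eastus", "westus"],
    "EU": ["westeurope", "francecentral"],
}

def pair_genai_region(client_region: str) -> str:
    """Pick a GenAI service region in the same geography as the client region."""
    geography = GEOGRAPHY_OF_REGION.get(client_region)
    if geography is None:
        raise ValueError(f"Unknown client region: {client_region}")
    return GENAI_REGIONS_BY_GEOGRAPHY[geography][0]

print(pair_genai_region("eu-west-1"))   # prints: westeurope
```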
Who owns the results generated by Pega GenAI?

Pegasystems does not own results from the client's use of Pega GenAI. The client understands and acknowledges that generative AI systems, including Pega GenAI, may produce similar responses to similar prompts or queries from multiple individuals, and that clients' rights in results might not be enforceable against third parties. To learn more, work with your Pega representative to sign the Pega GenAI Addendum and enable Pega GenAI. For more information, see Enabling Pega GenAI on Pega Cloud.
