GPT-4-turbo API Integration Not Working in Pega
Hello,
I am experiencing an issue with the GPT-4-turbo model integration in my Pega application. The REST connector was configured using the correct OpenAI API endpoint (https://api.openai.com/v1/completions), and I am passing the necessary headers, including the API key.
However, when attempting to make a request to the GPT-4-turbo model, I receive the following error:
You exceeded your current quota, please check your plan and billing details. For more information on this error
Steps I have followed:
- Created the REST connector with the GPT-4-turbo API endpoint.
- Added the necessary API key for authorization.
- Configured the input parameters such as model, prompt, temperature, and max_tokens (a sketch of the equivalent raw request follows this list).
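For context, here is a rough sketch of the raw request I believe the connector is effectively sending, shown in Python with the requests library purely for illustration. The API key, prompt, and parameter values below are placeholders, not the actual values used in my application.

```python
import requests

API_KEY = "sk-..."  # placeholder; in Pega this is supplied via the connector's authentication profile

# Approximation of the call configured in the REST connector
response = requests.post(
    "https://api.openai.com/v1/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "model": "gpt-4-turbo",   # model configured in the connector
        "prompt": "Hello",        # example prompt
        "temperature": 0.7,       # example value
        "max_tokens": 256,        # example value
    },
    timeout=30,
)

print(response.status_code)
print(response.json())
```

Running an equivalent request outside of Pega returns the same quota error, so I am not sure whether the problem is with my OpenAI account/plan or with how the connector is set up.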
Despite this, the model is not responding as expected.
Could you please advise on troubleshooting steps or check if there is an issue with the API integration on the Pega side?
Thanks in advance!