Question
Hitachi
JP
Last activity: 15 May 2024 10:54 EDT
About EFS as the log output destination for Pega and Stream
Please tell me about the need for clustering in Elasticsearch, and about EFS as the output destination for Pega and Stream logs.
Because multiple development environments are required for testing, we assume the namespace is split and two sets of pega-web and pega-stream are built.
The premise is to build one EFS for Pega log output and one for Stream log output.
For such a configuration, how many EFS file systems should I build?
I would like to build one EFS for Pega log output and one for Stream log output, and separate the environments into folders by specifying the log output destination. Is this possible?
Please let me know if any additional steps are required.
Also, do I instead need to build two EFS file systems each for Pega log output and Stream log output?
If so, what is the reason?
Finally, please tell me the steps to configure log output when two EFS file systems are used.
***Edited by Moderator Rupashree S. to add Capability tags***
@HideoS17115325 did you do any research on the forum and on our Documentation server?
Elasticsearch clustering is important for ensuring high availability and scalability: data is split into shards that are replicated across multiple nodes, so queries can be served in parallel and the failure of a single node does not cause data loss.
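As a rough illustration, a multi-node Elasticsearch cluster is formed by pointing each node's `elasticsearch.yml` at the same set of seed hosts. The cluster name, node names, and hosts below are placeholders, not values from your deployment:

```yaml
# elasticsearch.yml — minimal sketch for one node of a three-node cluster.
# All names/hosts are hypothetical examples.
cluster.name: pega-search
node.name: es-node-1
network.host: 0.0.0.0
# Nodes that new members contact to discover the cluster:
discovery.seed_hosts: ["es-node-1", "es-node-2", "es-node-3"]
# Only needed when bootstrapping a brand-new cluster:
cluster.initial_master_nodes: ["es-node-1", "es-node-2", "es-node-3"]
```

Each of the three nodes would carry the same `cluster.name` and seed list, differing only in `node.name`.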
As for the Amazon Elastic File System (EFS), it can be used as a persistent volume for storing Pega and Stream logs. You can indeed separate folders by specifying the log output destination.
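To give a concrete idea of the folder-separation approach on a single EFS, the sketch below exposes one EFS file system as a Kubernetes PersistentVolume via the AWS EFS CSI driver and mounts a per-environment subdirectory with `subPath`. The file-system ID, names, and mount path are placeholders, not values from your environment:

```yaml
# Hypothetical sketch: one EFS file system shared by multiple environments,
# with each environment writing logs to its own folder via subPath.
apiVersion: v1
kind: PersistentVolume
metadata:
  name: pega-log-pv
spec:
  capacity:
    storage: 5Gi            # EFS is elastic; this value is nominal
  accessModes:
    - ReadWriteMany          # multiple pods may write logs concurrently
  persistentVolumeReclaimPolicy: Retain
  csi:
    driver: efs.csi.aws.com
    volumeHandle: fs-0123456789abcdef0   # placeholder EFS file-system ID
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: pega-log-pvc
spec:
  accessModes:
    - ReadWriteMany
  storageClassName: ""       # bind directly to the pre-created PV
  volumeName: pega-log-pv
  resources:
    requests:
      storage: 5Gi
```

In the pod spec, a `volumeMounts` entry such as `subPath: dev-env1` would then keep each environment's logs in a separate folder on the same EFS.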
In terms of the number of EFSs to build, it depends on your specific requirements and the complexity of your environment. If you have multiple environments and you want to separate the logs for each, then building separate EFSs for each environment could be a good approach.
To configure log output when two EFSs are used, you would expose each EFS as its own persistent volume and claim, then point the Pega log directory at one mount path and the Stream log directory at the other in your logging configuration.
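As a sketch of the two-EFS case, the pod-spec fragment below mounts two EFS-backed claims at separate paths. The claim names and mount paths are assumptions for illustration, not the actual directories your Pega or Stream images use:

```yaml
# Hypothetical pod-spec fragment: two EFS-backed PVCs, one per log type.
containers:
  - name: pega-web
    volumeMounts:
      - name: pega-logs
        mountPath: /usr/local/tomcat/logs   # assumed Pega log directory
      - name: stream-logs
        mountPath: /opt/stream/logs         # assumed Stream log directory
volumes:
  - name: pega-logs
    persistentVolumeClaim:
      claimName: pega-log-pvc               # backed by the first EFS
  - name: stream-logs
    persistentVolumeClaim:
      claimName: stream-log-pvc             # backed by the second EFS
```

The same pattern applies whether the claims point at two separate EFS file systems or at two access points on one file system.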
Please note that the exact steps can vary depending on your specific setup and requirements.
⚠ This is a GenAI-powered tool. All generated answers require validation against the provided references.
Configuring logging for Pega Platform deployments
Pega Infinity Deployment Changes FAQs
Third-party externalized services Deployment Changes FAQs
File and content storage
Uploading and Downloading from file repositories (Amazon s3 as example)
Multi-datacenter setup in 8.6.1 with two Openshift clusters
What AWS storage services supported for Kafka streaming data / persistence
Streaming Pega logs to an external Amazon S3 bucket
ECS Format JSON Logs in Pega 8.5
Kubernetes Cloud Environment - How to Deploy and Manage the PODs