How to configure Connect-File for AWS S3? (Pega 7.4)
Is it possible to use AWS S3 with Connect-File? I tried to set it up with the following DSS entries, the same way as for a File Listener, but the test connection failed.
Owning Ruleset: Pega-Engine
- storage/class/awsstore:/type = aws-s3
- storage/class/awsstore:/bucket = mybucketname
- storage/class/awsstore:/accesskeyid = myaccesskeyid
- storage/class/awsstore:/secretaccesskey = mysecretaccesskey
- storage/class/awsstore:/rootpath = foldername
(I configured bucketname, accesskeyid, secretaccesskey, and foldername with the actual values from my S3 setup.)
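As an aside, the DSS purpose keys above all follow the pattern storage/class/<storename>:/<setting>. A minimal Python sketch of that naming pattern (purely illustrative; this helper is not part of any Pega API):

```python
def build_storage_dss(store_name, settings):
    """Build DSS purpose keys of the form storage/class/<store>:/<setting>.

    Illustrative helper only; not a Pega API.
    """
    return {f"storage/class/{store_name}:/{key}": value
            for key, value in settings.items()}

# The settings from the question above:
dss = build_storage_dss("awsstore", {
    "type": "aws-s3",
    "bucket": "mybucketname",
    "accesskeyid": "myaccesskeyid",
    "secretaccesskey": "mysecretaccesskey",
    "rootpath": "foldername",
})
print(dss["storage/class/awsstore:/type"])  # aws-s3
```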
On the Connect-File rule, I entered "file://awsstore:/" as the Destination Path. On a File Listener, the same path in Source Location works (Test connectivity returns success).
Does Connect-File support S3? If so, how should it be configured? (It would most likely work if I mounted S3 with goofys, but is that the only way?)
Accepted Solution
Hi tsunm,
Yes, your outline is correct. Just adding two more points for your consideration:
1. If you use pyContent, you need to set it to the Base64 string of the intended content of your file. If you have the option, please consider using pyStream instead, since that allows the platform to stream the file content to the repository instead of loading it all into memory.
2. You can call a savable data page from a smart shape or flow action as well as from an activity; please consider those if your use case allows.
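On point 1, the Base64 string can be produced outside Pega like this (Python used purely for illustration; the byte string stands in for the file's raw contents):

```python
import base64

# pyContent must hold the Base64 encoding of the file's raw bytes
# (per point 1 above). A stand-in payload instead of a real file:
raw = b"hello from Pega"
encoded = base64.b64encode(raw).decode("ascii")
print(encoded)  # aGVsbG8gZnJvbSBQZWdh
```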

Thank you for your comments. It should work. Actually, I can successfully create a file on S3 with TntDrive from Windows. However, I would like to know whether there is a way to connect a file to S3 without mounting a bucket as a local folder through a 3rd-party tool such as TntDrive, goofys, etc. Since File Listener and attachments support S3, I guess there could be a way to do it directly from Pega.
To connect to an S3 bucket and interact with it, use the repository features in 7.4:
AWS S3 is supported OOTB. You will need to define a new repository record and then use the repository data page APIs to interact with files in it. If you are looking to store pulse or case attachments, these can be configured on the application rule form.
regards,
Mayran
- Create an Authentication Profile for your S3 account (access key id, secret access key)
- Create a Repository rule with your S3 information (bucket, path and the authentication profile)
- Create an Activity to save file to S3
In the activity, the steps should be as below:
- Load-DataPage: load "D_pxNewFile", passing the repository name and file path as parameters
- Property-Set: set D_pxNewFile[repositoryName:"repository",filePath:"filepath"].pyContents; the value should be an input stream
- Save-DataPage
- Commit (if necessary)
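The Property-Set step above uses a parameterized data page reference of the form D_pxNewFile[param:"value",...]. A small Python sketch of how such a reference string is composed (illustrative only; this is string formatting, not a Pega API):

```python
def data_page_ref(name, **params):
    """Compose a parameterized data page reference string, e.g.
    D_pxNewFile[repositoryName:"myrepo",filePath:"out.txt"].

    Illustrative helper only; not a Pega API.
    """
    args = ",".join(f'{key}:"{value}"' for key, value in params.items())
    return f"{name}[{args}]"

# The reference used in the Property-Set step above:
ref = data_page_ref("D_pxNewFile",
                    repositoryName="repository",
                    filePath="filepath")
print(ref)  # D_pxNewFile[repositoryName:"repository",filePath:"filepath"]
```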
If I misunderstood it, please let me know.
Regards,

Hello,
So if I understand correctly, you did not use a Connect-File in the end?
I have a PegaCloud repository available in my System Settings at file://pegacloudrepository:/, but if I set it as the destination path, Test connectivity gives me a "No such Directory or Folder: file://pegacloudrepository:/" error.

Connect-File is not supported on cloud and would not work with S3. Please use the repository data pages, or the OOTB integration with case and pulse attachments.