Question
Pegasystems Inc.
BR
Last activity: 22 Dec 2021 4:04 EST
Log4j zero-day vulnerability
Hi, the link https://docs-previous.pega.com/security-advisory/security-advisory-apache-log4j-zero-day-vulnerability says to delete a row from the pr_engineclasses table in the RULES schema. However, we've identified that other schemas also have this table. Should the row be deleted from every schema that has it, or just from the RULES one?
-
Eric Rietveld
Pegasystems Inc.
NL
Hi @KOFER89, in regular setups you should only find that table in your RULES schema. Perhaps you have made a backup to another schema, or you have multiple environments sharing the same database and pointing to different schemas. In any case, to be sure, please remove this entry from the pr_engineclasses table in every schema you have. After that, restart all nodes in your cluster(s).
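For example, here is a minimal sketch of that cleanup, assuming a PostgreSQL database and the psycopg2 driver; the connection string and the pr_engineclasses column name below are illustrative assumptions, so take the authoritative DELETE statement from the Security Advisory linked above:

# Minimal sketch: remove the log4j JndiLookup entry from every schema
# that contains a pr_engineclasses table. Assumes PostgreSQL + psycopg2;
# the pzclass column and its value are illustrative assumptions -- use
# the exact DELETE statement from the Pega Security Advisory.
import psycopg2

conn = psycopg2.connect("dbname=pega user=pega")  # hypothetical connection string
with conn, conn.cursor() as cur:
    # Find every schema that has its own copy of pr_engineclasses.
    cur.execute(
        "SELECT table_schema FROM information_schema.tables "
        "WHERE lower(table_name) = 'pr_engineclasses'"
    )
    for (schema,) in cur.fetchall():
        # Illustrative WHERE clause; the advisory gives the authoritative one.
        cur.execute(
            f'DELETE FROM "{schema}".pr_engineclasses '
            "WHERE pzclass LIKE '%JndiLookup%'"
        )
        print(f"{schema}: deleted {cur.rowcount} row(s)")
conn.close()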
-
Niladri Mandal
TCS
US
@Eric Rietveld we are on an on-premises Pega Platform, and I see Pega has provided hotfixes for each version. If we install that hotfix, do we still need to perform the deletion of the log4j-related class from the pr_engineclasses table, or is that not needed?
Eric Rietveld
Pegasystems Inc.
NL
@Niladri if you've already successfully installed the hotfix, there's no need to remove the JNDI class with the database delete command.
If, however, you need more time to obtain and install the hotfix, we recommend first quickly removing the class as outlined.
-
Niladri Mandal
Cigna
US
@Eric Rietveld, we have removed the JNDI class and updated the Log4j jars to version 2.16. Now we see that version 2.17 is available. Do we need to update the log4j jars to this latest version? Please advise.
Do we still need to install the hotfixes if we have these workarounds in place?
-
Updated: 21 Dec 2021 12:26 EST
Eric Rietveld
Pegasystems Inc.
NL
Hi @FathimaShaik. The Log4J library continues to receive a lot of attention, with more vulnerable edge cases being uncovered and addressed.
If you have customized your Pega logging subsystem in a particular way, there's a risk of a Denial of Service (DoS) attack, which is resolved in 2.17. This is likely not the case, so this new vulnerability probably doesn't impact you; nevertheless, we will continue to package later versions of Log4J.
Most important: Be sure to install a hotfix containing log4j version 2.15 (or later) or manually remove the JNDI class, in case you haven't done this yet.
Please continue to monitor our update page. With our current understanding, there's little risk left and you can wait for the next Pega patch release, instead of installing the additional log4j hotfixes.
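If you want to double-check the manual mitigation on a node, the following is a minimal stdlib-only sketch; the search root is a placeholder for your actual Pega installation directory:

# Minimal sketch: scan a directory tree for log4j-core jars and report
# whether each still contains the vulnerable JndiLookup class.
# The search root is a placeholder -- point it at your Pega install.
import pathlib
import zipfile

SEARCH_ROOT = pathlib.Path("/opt/pega")  # hypothetical install directory
JNDI_ENTRY = "org/apache/logging/log4j/core/lookup/JndiLookup.class"

for jar in SEARCH_ROOT.rglob("log4j-core*.jar"):
    with zipfile.ZipFile(jar) as zf:
        still_there = JNDI_ENTRY in zf.namelist()
    print(f"{jar}: {'STILL CONTAINS JndiLookup' if still_there else 'clean'}")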
Cognizant Technology Solutions
US
@Eric Rietveld I have a question here. We updated the log4j files on the stream nodes to 2.15 and removed the JNDI lookup from the DB as well. We are working on installing the hotfixes shared for both the platform and the stream service. I saw that Pega is working on building 2.17-compatible hotfixes as well. So in this case, should we wait until that hotfix is published, and do we need to upgrade log4j to version 2.17 as well?
Or are we good with log4j 2.15 and the currently available hotfixes installed? Please advise.
Social Security Administration
US
I have the same question. It is all confusing. The earlier discussion mentioned version 2.16, but the hotfix includes 2.15.
Eric Rietveld
Pegasystems Inc.
NL
Hi @MendusC9, the situation has been evolving over the last week with the release of new log4j versions. This means that we have released hotfixes which include 2.15, 2.16, or 2.17.
If you're unsure what to do, you can keep installing the latest hotfixes as suggested in our Security Advisory, until they are all based on 2.17. Note that, given the dynamic situation, we could see a log4j 2.18 next week, which would update the advice.
Hope this helps.
Eric Rietveld
Pegasystems Inc.
NL
Hi @ANOOPS46, the hotfixes that we ship include the updated log4j library, so when you install those, you don't need to also manually update the log4j jars. If you're not running on a container platform, or you're storing the Kafka binaries in a persistent volume, it could be that old, unused log4j jars are left over on the filesystem of your stream nodes. You can remove those old Kafka binary directories, as they are no longer used.
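To spot such leftovers, here is a minimal stdlib-only sketch; the Kafka root path is a placeholder for wherever your stream nodes unpack the Kafka binaries:

# Minimal sketch: list every log4j jar left under the stream node's Kafka
# directories, with the version read from the jar's manifest when present.
# The root path is a placeholder for your actual Kafka binary location.
import pathlib
import zipfile

KAFKA_ROOT = pathlib.Path("/opt/pega/kafka")  # hypothetical location

for jar in KAFKA_ROOT.rglob("log4j*.jar"):
    version = "unknown"
    with zipfile.ZipFile(jar) as zf:
        try:
            manifest = zf.read("META-INF/MANIFEST.MF").decode("utf-8", "replace")
        except KeyError:
            manifest = ""
        for line in manifest.splitlines():
            if line.startswith("Implementation-Version:"):
                version = line.split(":", 1)[1].strip()
                break
    print(f"{jar} -> log4j version {version}")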
As mentioned above, the risk of the 2.16 and 2.17 vulnerabilities materializing in a Pega context is low, but depending on your exact situation and risk appetite, you can continue to apply newer log4j hotfixes. At the moment we're providing Stream hotfixes which include 2.17. See: https://collaborate.pega.com/discussion/stream-security-advisory-apache-log4j-217-vulnerability-hotfixes
Hope this helps to clarify the situation.