Member since: 07-29-2019
Posts: 640
Kudos Received: 114
Solutions: 48
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 14447 | 12-01-2022 05:40 PM |
| | 3295 | 11-24-2022 08:44 AM |
| | 4954 | 11-12-2022 12:38 PM |
| | 1795 | 10-10-2022 06:58 AM |
| | 2581 | 09-11-2022 05:43 PM |
01-03-2022
08:22 AM
Thanks @ask_bill_brooks for the reply, I'll check.
01-02-2022
08:15 PM
1 Kudo
@noamsh_88, to recap:
You started out the thread saying that you are "using Cloudera V5.1.1 with log4j v1.2.17" and asked how you could upgrade to the latest version of log4j on CDH V5.1.1.
@GangWar replied that CDH 5.x is not, and will not be, tested with a later version of log4j, as CDH 5.x has reached End of Support (open that link and then expand the section labeled "Cloudera Enterprise products" under Current End of Support (EoS) Dates), so if you tried it, you would be on your own.
He also wrote that CDH 5 was not impacted by the log4j vulnerability described in log4j2 CVE-2021-44228.
You replied on 2 Jan that you ran the "patch for log4j provided at https://github.com/cloudera/cloudera-scripts-for-log4j" and asked:
how can we verify our env is out from log4j risk?
is there some java classes we should verify inside?
The very first sentence of the README.md file that renders in the browser automatically when one visits the URL you shared earlier for the cloudera-scripts-for-log4j reads:
This repo contains scripts and helper tools to mitigate the critical log4j vulnerability CVE-2021-44228 for Cloudera products affecting all versions of log4j between 2.0 and 2.14.1.
Emphasis added.
As @GangWar indicated, your environment, based on CDH 5.x, should not have had a version of log4j between 2.0 and 2.14.1 installed, and therefore should not have been vulnerable to the log4j vulnerability described in log4j2 CVE-2021-44228. This is because, as you yourself pointed out in your original post on 23 Dec, you only had log4j v1.2.17 installed in your environment, and log4j v1.2.17 is not a version of log4j between 2.0 and 2.14.1.
This also explains why, after you ran the script intended for systems using log4j versions between 2.0 and 2.14.1 on a system using log4j v1.2.17, the log4j V1 jars were not removed.
But since you ran the script provided at https://github.com/cloudera/cloudera-scripts-for-log4j anyway and presumably still have it handy, you could manually check for log4j .jar files in your environment in much the same way the script does, and verify for yourself that none of those files still contains JndiLookup.class. That would confirm your environment is not at risk from the vulnerability described in log4j2 CVE-2021-44228 (this information is also in the same README.md file on GitHub from which the script you ran is distributed).
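As a minimal sketch of such a manual check (the scan path below is illustrative, not your actual install location; the official script in the repo above remains the authoritative reference), you could walk a directory tree and flag any jar that still contains JndiLookup.class:

```python
#!/usr/bin/env python3
"""Sketch: flag jar files that still contain JndiLookup.class,
the class removed by the CVE-2021-44228 mitigation scripts."""
import os
import zipfile


def find_vulnerable_jars(root):
    """Yield paths of jar files under `root` containing JndiLookup.class."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".jar"):
                continue
            jar_path = os.path.join(dirpath, name)
            try:
                with zipfile.ZipFile(jar_path) as jar:
                    # A jar is just a zip; list entries without extracting.
                    if any(entry.endswith("JndiLookup.class")
                           for entry in jar.namelist()):
                        yield jar_path
            except (zipfile.BadZipFile, OSError):
                pass  # unreadable or corrupt archive; skip it


if __name__ == "__main__":
    # Illustrative scan root; point this at your parcel/lib directories.
    for jar in find_vulnerable_jars("/opt/cloudera"):
        print("VULNERABLE:", jar)
```

An empty result means no scanned jar still carries the vulnerable lookup class; on a CDH 5.x system with only log4j v1.2.17 you would expect it to find nothing.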
12-08-2021
04:39 AM
Hello @smdas Thank you very much! That is exactly the confirmation I needed. Thanks! Best regards, Olek
11-18-2021
04:53 AM
A Cloudera to Databricks migration is happening for us. Since there is a code freeze in Dec-Jan, we cannot uninstall Cloudera during this time. The migration project is not yet in production. Our Cloudera infrastructure is in Azure. If we cannot access Cloudera Manager after Jan 1st, how about deleting the Azure Resource Group for Cloudera? Would this delete everything related to Cloudera? Or is there an option to go to each node and delete the data, agent, and server?
11-09-2021
02:17 PM
Thank you @ask_bill_brooks for your fast reply. Yes, that is exactly what I want to do, and it is a repetitive task: it involves many database tables and will need to be repeated. Do you have any solution in mind, @ask_bill_brooks? Thank you for your time.
10-21-2021
01:40 PM
Hi @PrernaU, it seems the logged-in user does not have permission to access the interpreter page. Authorization depends on how you have configured your shiro authentication. If you have integrated your AD with Zeppelin, make sure the role mapping has been done correctly and that the correct users/groups are defined in the shiro roles section.
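As a hedged illustration (the role name and group DN below are placeholders, not your actual configuration), the relevant pieces of Zeppelin's conf/shiro.ini typically look like:

```ini
[roles]
# Map a Zeppelin role name to an AD/LDAP group (DN is a placeholder).
admin = "CN=zeppelin-admins,OU=Groups,DC=example,DC=com"

[urls]
# Restrict the interpreter settings page/API to the admin role.
/api/interpreter/** = authc, roles[admin]
/** = authc
```

If the logged-in user's AD group does not map to a role granted on /api/interpreter/**, they will be denied access to the interpreter page.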
10-19-2021
08:09 PM
Hi @Sipping1n0s I think I can help.

First, the current Enterprise Data Platform product offered by Cloudera as of Oct 2021 is Cloudera Data Platform (CDP); Cloudera is the name of the company that markets CDP.

Second, in its on-premises "form factor", Cloudera Data Platform Private Cloud, you can download and install a "free trial" which expires after 60 days, for use in a non-production environment for demonstration and proof-of-concept use cases without obtaining a license. You can read over the operating system requirements for installing the CDP Private Cloud Base trial (which is the easiest way to install CDP Private Cloud) here: Operating System Requirements

I am not aware of a Docker image available for download from Cloudera that would enable you to create a CDP cluster on your desktop and also test the containers on multiple clouds, although I am certain that someone with the requisite knowledge of Docker and sufficient skill with the various required development tools could create one. Indeed, some member of the Cloudera Community may have already done so and be willing to share their method in response to your question.

Prior to its merger with Cloudera in 2018, Hortonworks, Inc. distributed a Docker image of its distribution called the HDP Sandbox, and that still happens to be available for download here: Deploying Hortonworks Sandbox on Docker (among other places) …along with a tutorial which provides detailed steps to install that combination on Linux, Mac OS X, and MS Windows. That, however, could in no way be called up-to-date or equivalent to what Cloudera markets as an Enterprise Data Platform product today (the Sandbox is based on a version of the base distribution which is nearing end-of-support status). The Sandbox is intended as a pre-configured learning environment for developers who are just getting started.
Getting installations of the HDP Sandbox running on multiple clouds would be challenging, but possible, again assuming a developer knowledgeable about the various required development tools.
10-11-2021
02:19 PM
Hi @Yogesh771 It would help members of the community offer possible answers to your question if you posted a link to the tutorial you are presumably following, which includes the source code for this Python program, describes how to run it, and states the expected output.
10-04-2021
07:44 AM
I am following up here on one part of the original question, regarding the Apache Sqoop project being retired, just for the record (which is to say, for the benefit of people who might arrive at this thread via search engine at some point in the near-to-medium-term future). I still stand behind what I previously wrote about the relative strengths/weaknesses of using these two tools to extract data from SQL Server and ingest it into HDFS, but I do want to clarify that while Sqoop was moved to the Apache Attic in June 2021, the software will continue to be supported by Cloudera and shipped as part of CDP Public Cloud and CDP Private Cloud. See Cloudera's statement of support on this matter here: Apache Sqoop Support on Cloudera Data Platform