Member since: 07-29-2019
Posts: 640
Kudos Received: 114
Solutions: 48
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 14469 | 12-01-2022 05:40 PM |
| | 3296 | 11-24-2022 08:44 AM |
| | 4955 | 11-12-2022 12:38 PM |
| | 1796 | 10-10-2022 06:58 AM |
| | 2589 | 09-11-2022 05:43 PM |
10-03-2021
04:15 PM
Hello @xyz123 It may be true that the HDP 2.6.5 sandbox requires less than the 10 GB required by 3.0.1, but that does not mean you'll be able to get everything working on a Mac with 8 GB of RAM using VirtualBox. IIRC, you still need to allocate 8 GB of RAM to the virtual machine, and if that is not possible on a Mac running Big Sur with only 8 GB of RAM total, then you are going to have to do without some services at a minimum (and it still may not be possible with the RAM available). Some of the services you describe when you say you "end up with a dashboard full of red flags" simply don't start by default when running in a memory-constrained environment. If you're looking for pointers on troubleshooting the HDP Sandbox, I strongly recommend closely reading the tutorial Learning the Ropes of the HDP Sandbox.
09-29-2021
01:56 PM
1 Kudo
Hi @Jaspal The more detail you provide, the better community members can assist with your question. I think you'll need to provide a tad more detail about what you mean when you say you "… want to mask these data sources" or "mask Hive tables". What does the result look like when you have the "mask" you desire in place?
09-28-2021
10:24 PM
In June 2021, Apache Sqoop was retired and moved to the Apache Attic. While Sqoop will no longer be maintained at Apache, Cloudera CDP Public and Private Cloud customers can still expect full support, including patches, hotfixes, and prompt consideration of enhancement requests. For more information, please see the following resources:
- Migrating Data Using Sqoop in CDP Public Cloud
- Apache Sqoop changes after upgrading from CDH to CDP Private Cloud Base
- Apache Sqoop's apache.org page
- Apache Software Foundation's Board Resolution Terminating the Apache Sqoop Project, 16 June 2021
- Apache Sqoop in the Apache Attic
09-27-2021
03:46 PM
Hi @vijaysahu You didn't provide the version of Impala you're targeting, but assuming that what you are using is fairly recent, I would answer: yes, it is possible, but probably not in any automated fashion.

The closest Impala equivalent to SQL Server stored procedures is user-defined functions (UDFs). Starting in Impala 1.2, you can code your UDFs in C++ and/or Java instead of the proprietary Transact-SQL (T-SQL) language commonly used in MS SQL Server. You can start reading up on Impala user-defined functions here: User-Defined Functions (UDFs).

I am not aware of any translator or "converter" that will take T-SQL code and transform it into the equivalent Java (for example) that can be used in Impala; perhaps another member of the Cloudera Community is aware of one and will weigh in here with a pointer. Even if one is available, I think you'd be better off sitting down, analyzing what the original procedure with more than 500 lines of code actually does, and then writing a new UDF or set of UDFs that does the equivalent thing in Impala (there are important limitations on the functionality of UDFs that don't apply to the corresponding T-SQL procedures). Any competent software developer should be able to accomplish that task in a reasonable amount of time, given reasonable complexity of the original code, and probably in less calendar time than it would take to acquire and learn the quirks of an automated translator. That person might also decide that there are better ways of satisfying the requirements than using UDFs, and that the original stored procedures never should have been written at all.
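For illustration only (none of this comes from the original thread): a minimal sketch of what a Java UDF loadable by Impala might look like, written against the Hive UDF interface that Impala accepts. The package, class name, and masking logic below are hypothetical and only stand in for whatever the original T-SQL procedure computes.

```java
// Hypothetical scalar UDF: masks all but the leading characters of a string.
// Written against the Hive UDF interface that Impala can load; building it
// requires the hive-exec (and hadoop-common) jars on the compile classpath.
package com.example.udf;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class MaskLastFour extends UDF {

  // Impala invokes evaluate() once per input row; nulls pass through as nulls.
  public Text evaluate(Text input) {
    if (input == null) {
      return null;
    }
    String value = input.toString();
    int keep = Math.max(0, value.length() - 4);
    StringBuilder masked = new StringBuilder(value.substring(0, keep));
    for (int i = keep; i < value.length(); i++) {
      masked.append('*');
    }
    return new Text(masked.toString());
  }
}
```

Once compiled into a jar and copied to HDFS, a function like this is registered in Impala with a CREATE FUNCTION statement that points at the jar's HDFS location and the class name; the exact syntax, and the limitations relative to T-SQL procedures, are covered in the User-Defined Functions (UDFs) documentation linked above.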
09-16-2021
04:24 AM
Hi William, Please try the URL below; note that I have not verified the code myself. https://github.com/cdarlint/winutils Regards, Jay
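For anyone finding this later, here is a minimal sketch (not part of the original reply) of the usual way those winutils binaries are used for local Hadoop development on Windows: extract winutils.exe into a bin folder, point hadoop.home.dir at its parent, and check that the Hadoop client classes initialize. The C:\hadoop path and class name are assumptions; adjust them to your setup.

```java
// Hypothetical smoke test: verifies the Hadoop client libraries can start on
// Windows once winutils.exe is available under %HADOOP_HOME%\bin.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WinutilsSmokeTest {
  public static void main(String[] args) throws Exception {
    // Must be set (or HADOOP_HOME exported) before any Hadoop class loads;
    // the directory is expected to contain bin\winutils.exe.
    System.setProperty("hadoop.home.dir", "C:\\hadoop");

    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.getLocal(conf); // local FS is enough for the check
    System.out.println("Hadoop initialized; working dir = " + fs.getWorkingDirectory());
    System.out.println("C:\\tmp exists? " + fs.exists(new Path("C:/tmp")));
  }
}
```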
09-13-2021
03:32 AM
@ask_bill_brooks Thanks for your reply, Bill! Although the threads for DROP and ADD partition are separate, I didn't find any race condition/issue in the hive-server2 logs when this error occurred; the DROP partition had finished executing before the ADD partition command started processing. Also, DROP partition is just a precautionary step in our application (only helpful in case of reruns or duplicate processing), since we receive a new file once daily and a new partition gets created for each new file. Hence, I am pretty sure this is not the actual reason. I assume this has something to do with Hive internally retrying the ADD partition and failing on one of the retries, but I don't have any proof to establish this theory (there is nothing in the hive-server2 logs to suggest this is the reason).
09-02-2021
09:22 AM
Hi @Algo This situation may become less confusing when you learn that, before the Kudu 1.11 RPMs were released, Cloudera changed its download policy and made that set of RPMs available only from a private repository. To download the software from Cloudera's private repository, you now need a valid Cloudera subscription. Please see the announcement here: Transition to private repositories for CDH, HDP and HDF. If you do not yet have a valid subscription and are interested in purchasing one, please contact the Cloudera sales team.
08-18-2021
10:26 PM
Hi @cdhkant1 I think I can help a little bit. First, the current enterprise data platform offered by Cloudera as of summer 2021 is Cloudera Data Platform (CDP), which has two "form factors": Private Cloud and Public Cloud. The latter runs on public cloud service providers (CSPs) such as AWS, Azure, and GCP. The on-premises offering was named CDP Data Center at its initial release back in November 2019 and is now called CDP Private Cloud, so CDP Data Center is simply the original name for the product now called CDP Private Cloud.
08-18-2021
05:14 AM
Hi, In the case where we have set up a local repository, is there any way to obtain the allkeys.asc file from somewhere and place it in the cloudera-repos/cm6 directory? We have a Cloudera cluster that was installed and running from before February 2021 (unfortunately it was shut down for a few months); is it possible to use the packages/parcels already installed on the Cloudera Manager machine and the host machines to properly set up the local repository? Thank you in advance, Alexander
08-15-2021
11:55 AM
Hi @igorufrn The short answer is that there is no "free version of CDH" available from Cloudera for download. For the details, you can read the responses posted when a question similar to yours was previously asked and answered here: Ho[w] to subscribe [to] CDH 6.3.2 express

The credentials required to access the private repository from which all legacy versions of Cloudera's distribution are now available are generally not the same ones used to access Cloudera's website or the Cloudera Community. In other words, the personal email address you use for logging into the Cloudera Community isn't going to work as a userid for accessing the private repository at archive.cloudera.com. Instead, people with a valid Cloudera subscription can generate repository credentials from a CDH license key, and the community thread I mentioned above includes a link to an announcement, which in turn has links to extensive documentation on how to download and install certain earlier versions of CDH.

If you are genuinely looking to evaluate a current data platform for use within your company, you can currently do so without a preexisting Cloudera subscription by downloading and installing the Trial Version of CDP Private Cloud Base Edition of Cloudera Data Platform.