Member since: 02-27-2020
Posts: 157
Kudos Received: 38
Solutions: 43
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 48 | 05-20-2022 09:46 AM |
| | 53 | 05-17-2022 08:42 PM |
| | 91 | 05-06-2022 06:50 AM |
| | 125 | 04-18-2022 07:53 AM |
| | 106 | 04-12-2022 11:17 AM |
05-23-2022
03:26 PM
@yagoaparecidoti Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future. Thanks!
05-19-2022
11:32 PM
Hi Alex, thank you for having confirmed that. I'll proceed as you suggest. Regards Andrea
05-11-2022
10:22 AM
@theano Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future. If you are still experiencing the issue, can you provide the information @GangWar has requested? Thanks!
05-11-2022
12:01 AM
Hi @aakulov, we are using an on-prem (bare-metal) cluster with Cloudera Manager version 7.6.1 and Cloudera Runtime 7.1.7 (Parcels). We configured the AWS credentials the same way as in the links you shared, but we still get "unable to load AWS credentials" when a directory is included in the s3a URL (e.g. "s3a://test/directory").
05-10-2022
12:45 PM
1 Kudo
This likely needs attention from Cloudera support. Please open a case through your mycloudera portal. Regards, Alex
05-06-2022
04:39 PM
Thanks for that! That's helpful.
04-19-2022
01:03 PM
Happy to hear, @Data1701 . Accept reply as Solution if the answer helped.
04-18-2022
08:01 AM
Hi @Jaguare , What you likely want is not Sqoop but HDFS's native DistCp command. There are guides on the internet on how to do this; a quick search turned up this one: https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/apache-hadoop-on-premises-migration-best-practices-data-migration Hope this helps. Regards, Alex
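In case a concrete command helps, here is a minimal DistCp sketch. The HDFS path, storage account, and mapper count below are placeholders (my assumptions, not taken from this thread), and the command needs a Hadoop client and cluster to actually run, so it is only assembled and printed here:

```shell
# Sketch only: DistCp copies data between Hadoop-compatible filesystems
# using parallel map tasks. The paths below are placeholders.
SRC="hdfs://active-namenode:8020/user/hive/warehouse/sales"
DST="wasbs://backups@examplestore.blob.core.windows.net/sales"

# -update copies only files that are missing or changed at the destination;
# -m 20 caps the job at 20 parallel mappers.
CMD="hadoop distcp -update -m 20 $SRC $DST"
echo "$CMD"
```

Run the printed command from a host with the Hadoop client configured and credentials for the target storage in place.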
04-15-2022
12:34 AM
Hi @aakulov Welcome and thank you very much for the reply. Regards, Ragav
04-12-2022
03:24 PM
@THR-Mario , if you want to use Grafana as a general-purpose data visualization tool, then unfortunately that's not its purpose. Grafana is meant for time-series visualization, and in the context of CDP is used for metrics monitoring. In this task Grafana is great! If you want to use Grafana to query Hive, Impala, or other large datasets, you will have a bad time. Sorry, that's not what you want to hear, but that's the reality. Regards, Alex
04-11-2022
01:02 AM
The license file has a valid expiration date, currently December 2022. It is a development license. The license was recognized in a previous installation attempt, via the installer, on another VM running CentOS 7.
03-21-2022
02:02 PM
1 Kudo
If you are talking about the little icons that show up next to the table names in the table list on the left-hand side of your Hue editor, then yes: the "table" icon is a proper physical table, and the "eye" icon is a view defined on top of some statement. Hope this helps, Alex
03-21-2022
07:53 AM
2 Kudos
Hi @rahuledavalath , Generally, having internet access on the nodes where you are installing CDP Private Cloud simply saves you a few manual steps. However, it is absolutely acceptable to have nodes without internet access and perform the installation in an "air-gapped" environment (e.g. see here: https://docs.cloudera.com/cdp-private-cloud-experiences/1.3.3/installation/topics/cdppvc-installation-airgap.html). As for license activation, there is no need for internet access either. The license you get from Cloudera is a text file; as long as you upload it to Cloudera Manager in the appropriate installation step, there is no internet validation that needs to happen. Everything is self-contained on your hosts. Hope this helps, Alex
02-20-2022
09:19 PM
@banshidhar_saho, Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
02-19-2022
02:42 AM
Hello, This command is not working. The binaries live under the parcel installation and are not registered with systemctl. I've tried to execute the binary from /opt/cloudera/parcels/CDH-7.1.6-1.cdh7.1.6.p0.10506313/lib/kudu/sbin/kudu-tserver but it doesn't work properly.

root@server:~# sudo /opt/cloudera/parcels/CDH-7.1.6-1.cdh7.1.6.p0.10506313/lib/kudu/sbin/kudu-tserver --help
kudu-tserver: Warning: SetUsageMessage() never called

Flags from ../../../../../src/kudu/cfile/block_cache.cc:
-block_cache_capacity_mb (block cache capacity in MB) type: int64 default: 512
-block_cache_type (Which type of block cache to use for caching data. Valid choices are 'DRAM' or 'NVM'. DRAM, the default, caches data in regular memory. 'NVM' caches data in a memory-mapped file using the memkind library. To use 'NVM', libmemkind 1.8.0 or newer must be available on the system; otherwise Kudu will crash.) type: string default: "DRAM"
-force_block_cache_capacity (Force Kudu to accept the block cache size, even if it is unsafe.) type: bool default: false
[...]
02-13-2022
09:22 PM
1 Kudo
The drivers are provided with Cloudera licenses for the purpose of download and use with Cloudera Products. If there is a need for any other use of these drivers, you will have to engage directly with Simba (the driver provider) and agree with their licensing terms.
02-13-2022
03:32 PM
1 Kudo
Hi @Melon , It sounds like you want to send an email alert (e.g. when a CDSW job finishes) via your corporate SMTP server. Is that correct? To configure the CDSW SMTP connection you will need at least the CDSW Admin role; go to Admin > Settings (tab). There, under SMTP, provide the following:
- SMTP host address
- SMTP port
- SMTP username
- SMTP password
Optionally, you may check "Use TLS" if your SMTP server requires it. As for the VPN (Citrix Gateway), that is something to check with your IT organization; you should be able to upload and download project files in CDSW via the browser while connected through the VPN. Hope this helps, Alex
02-11-2022
01:11 AM
Hi @aakulov , thanks for your reply. In the last few days we got it fixed, thanks to Cloudera support's help. Not sure why, but they decided to reinstall Hive from scratch and replace it with Hive on Tez. The sqoop commands now seem to run fine after updating the --hs2-url parameter accordingly (and after regenerating the Kerberos tickets for Hive). Thanks anyway for your suggestions; hope my answer will be useful to someone. Kind regards, gr
02-01-2022
12:05 AM
Is it possible for you to post your sample sqoop job? Thanks @sass
01-21-2022
07:15 AM
Better than nothing, although the full information (with metadata) would have to be reconstructed from the associated PostgreSQL database (not possible in our case, as CDSW and kubectl are down).
01-19-2022
11:28 AM
Hi @Chhavi , the discussion of HIPAA certification is an intricate one. I would recommend talking to your Cloudera account representative, as there are many details to go through. Thank you, Alex
10-27-2021
12:58 AM
@Alexios, Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
08-10-2021
12:00 PM
Could you provide the output of the DESCRIBE command for your table in Kudu?
07-13-2021
10:42 AM
Hi @Anyy , Your SQL statement seems to be malformed: the "from" keyword is duplicated. Please fix it and try running again. If that helps, please accept this as the solution. Regards, Alex
07-08-2021
03:03 PM
1 Kudo
Strange... As a test, can you replace your hard-coded topic name with something like ${tableName}, or just try a different hard-coded string, and see if that gets you further along? Not saying that's a solution, but it helps eliminate causes. Also, you mentioned you've implemented other flows already. Could it be that the topic name BUBB-MX_ALL_SOFT is already taken in your Kafka cluster, and that's why the GoldenGate handler can't create the topic (though I would expect it to just write to the existing topic instead of throwing an error)? I also found this on the Oracle site that may have your answer; you'll need an Oracle account to check whether there is a solution: https://support.oracle.com/knowledge/Middleware/2512462_1.html Could this also be an encoding or trailing-character issue in the props file between Unix and AIX? Overall this sounds like a GoldenGate error that may be better answered by the Oracle community. Regards, Alex
07-08-2021
02:24 PM
Hi @roshanbi , You can simply put your sqoop command in a shell file and then have Cron run that script according to your desired schedule (depending on how frequently you expect the data to be updated in the Oracle source table). Alternatively, you can use Oozie to orchestrate the running of the sqoop job. Regards, Alex
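To make the cron route concrete, here is a minimal sketch. The connect string, credentials, table, and schedule are all placeholders (my assumptions, not from this thread), and /tmp is used only so the sketch is self-contained; in practice you would keep the script somewhere permanent.

```shell
# Sketch: wrap the sqoop import in a script, then schedule it with cron.
# All connection details below are placeholders -- adapt to your Oracle source.
cat > /tmp/sqoop_import.sh <<'EOF'
#!/bin/sh
sqoop import \
  --connect jdbc:oracle:thin:@//oraclehost:1521/ORCL \
  --username etl_user \
  --password-file /user/etl/.ora_pass \
  --table SALES \
  --target-dir /data/sales
EOF
chmod +x /tmp/sqoop_import.sh

# Example crontab entry (add via `crontab -e`) to run it daily at 02:00:
#   0 2 * * * /tmp/sqoop_import.sh >> /var/log/sqoop_import.log 2>&1
```

Using --password-file (an HDFS path readable only by the job user) avoids putting the database password on the command line or in the crontab.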
06-23-2021
05:32 AM
@roshanbi, have you resolved your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
06-21-2021
01:36 AM
Hi @sakitha , This seems to be a known issue. Is the topic whitelist set to "*"? Can you please try it with a dot: ".*"? Let us know if that works for you. Regards. ~ If the above answers your question, please give a thumbs up and mark the post as an accepted solution.