Member since: 07-18-2017
Posts: 14
Kudos Received: 0
Solutions: 0
11-10-2022
07:11 AM
We have a business requirement to maintain Hadoop clusters as active-standby. The solution we are expecting is: when data is written, sync it between the two clusters, and when reading, read from the standby if the active cluster is down. If there is a solution for this, please provide the related links as well. Thanks & Regards, Rahul
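For the data-sync part, one common building block is a scheduled HDFS DistCp run from the active to the standby cluster. A minimal sketch, assuming placeholder NameNode hosts (active-nn, standby-nn), a placeholder path, and hypothetical snapshot names (s1, s2); it does not cover automatic read failover:

# full sync of a directory from the active cluster to the standby
hadoop distcp -update -delete hdfs://active-nn:8020/data hdfs://standby-nn:8020/data
# incremental variant based on two HDFS snapshots of the source directory
hadoop distcp -diff s1 s2 -update hdfs://active-nn:8020/data hdfs://standby-nn:8020/data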
03-21-2022
12:29 AM
Hi, I want to install the latest CDP Private Cloud in our environment. Do the CDP nodes need internet access for license activation? (1-year support license.) Thanks & Regards, Rahul
Tags: CDP
Labels: Cloudera Data Platform (CDP)
03-16-2022
09:17 PM
Hi @shehbazk, thanks for your reply. I am not clear about the answer below. To my question 2, "Our production servers do not have internet connectivity; is it mandatory to have connectivity to Cloudera servers for license activation?", you answered "Yes you can". Are you saying internet access is not required to activate the license when uploading it to Cloudera Manager?
03-15-2022
10:02 PM
Hi @shehbazk, my team is almost at the final stage of getting the license. I have some queries I did not find answered in the given link.
1. With the same license, can I have different clusters under different Cloudera Manager instances? (For example, we are asking for a 15-node license but want 3 clusters of 5 nodes each. There won't be any connectivity between these environments, so we need a separate Cloudera Manager for each environment under the same license.) Is that possible?
2. Our production servers do not have internet connectivity; is it mandatory to have connectivity to Cloudera servers for license activation?
3. If I use the license in our R&D environment to test the above cases, can I reuse the same license for the production clusters afterwards, or will it say the license is already in use?
Please assist with this. Thanks & Regards, Rahul
03-15-2022
01:16 AM
If I have 3 environments, e.g. SIT, UAT, and PROD, each with 5 nodes (15 nodes in total), I need 3 independent CDP clusters of 5 nodes each. If I take a 15-node CDP license, can I create the 3 independent clusters using the same license, or do I need to take a separate license for each? Please assist with this. Thanks & Regards, Rahul P
09-30-2021
01:03 AM
I have an HDP Hadoop cluster. I took a snapshot of a Phoenix table from this cluster and cloned that table to a CDP cluster. The HBase table was created from the snapshot and the HFiles were imported into the table's data directory. When I select from the table in the Phoenix shell it shows empty data, but if I do a scan from the HBase shell the data is displayed (encoded). Is there any workaround for this issue? Thanks
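One workaround often suggested for this symptom is to recreate the Phoenix table definition on the target cluster so that Phoenix maps onto the restored HBase table, keeping the schema and column encoding consistent with the source (Phoenix 4.7 wrote non-encoded column names, while newer Phoenix versions encode them by default). A minimal sketch to run in the target cluster's sqlline; MY_TABLE, ID, and COL1 are hypothetical names, and this is not a verified fix:

CREATE TABLE IF NOT EXISTS MY_TABLE (
    ID VARCHAR NOT NULL PRIMARY KEY,
    COL1 VARCHAR
) COLUMN_ENCODED_BYTES = 0;  -- assumes the source table used non-encoded column names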
08-08-2018
06:03 AM
Dear Sandeep, from this image I thought that the default is Tez and that LLAP and Spark are also supported: https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.0.0/hive-overview/concepts/images/hive-sql-query-execution-process.png
08-07-2018
06:42 AM
I am using HDP 3.0. In previous versions, hive.execution.engine could be set to mapreduce or tez. In HDP 3.0 that selection drop-down is no longer there and the default is Tez. Somewhere I read that Hive in HDP 3.0 supports Tez (default), LLAP, and Spark. Is there any provision to set Spark directly as the execution engine? Thanks & Regards, Rahul
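For reference, the property can still be inspected and set per session from Beeline; this is only a sketch of how to check it, not a confirmation that Spark is an accepted value in HDP 3.0:

SET hive.execution.engine;      -- show the current engine
SET hive.execution.engine=tez;  -- set it for the current session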
08-02-2018
12:57 PM
While upgrading from HDP 2.6.x to HDP 3.0 I am getting the following error in Ambari: "Operating System matching redhat6 could not be found." My Linux version is CentOS 6.x and my Ambari version is 2.6.x. How can I fix this issue? Is there no support for Linux versions lower than 7?
06-29-2018
09:44 AM
It appears only while MapReduce is running, right? If so, it will be a map/reduce process. Please grep the process ID and check what the process actually is. If it is a map/reduce process, one option is to recheck your MapReduce code. If that is also fine, YARN cgroups may help. If you want to kill this process, you need to kill the YARN job associated with it. To get the running YARN applications, use the RM UI or the yarn application -list command. To kill the job: yarn application -kill application_id. A concrete example is sketched below.
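A sketch of those steps, assuming a placeholder PID and application ID:

# check what the process actually is (12345 is a placeholder PID)
ps -ef | grep 12345
# list the running YARN applications
yarn application -list -appStates RUNNING
# kill the owning YARN application (placeholder application ID)
yarn application -kill application_1530000000000_0001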
06-29-2018
08:32 AM
In the latest HDP 2.6.5, the HBase version maintained is HBase 1.1.2. I agree that a lot of bug fixes from later releases are backported into this version. Some major improvements are not included, I think because they involve a lot of major class changes (e.g., related to indexing). The Phoenix version that comes with HDP is 4.7, while the latest available Phoenix is 4.14. Most new Phoenix improvements and bug fixes are not being integrated into HDP's Phoenix version, which is why it has many bugs (mainly in the Hive, HBase, and Spark integrations). I can't manually test the newer Phoenix with HDP's HBase because it expects HBase 1.1.9 (some classes changed from HBase 1.1.2). Can you please tell us whether/when we can expect an update of HBase/Phoenix? Please treat this as a priority. Thanks & Regards, Rahul P