Member since: 04-13-2016
Posts: 422
Kudos Received: 150
Solutions: 55
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1086 | 05-23-2018 05:29 AM |
| | 3531 | 05-08-2018 03:06 AM |
| | 900 | 02-09-2018 02:22 AM |
| | 1887 | 01-24-2018 08:37 PM |
| | 4535 | 01-24-2018 05:43 PM |
08-10-2016
09:44 PM
@Aditya Konda Yes, we can: first upgrade Ambari to the latest version, and then upgrade HDP.
08-10-2016
09:02 PM
1 Kudo
@Aditya Konda Hope this link helps: http://docs.hortonworks.com/HDPDocuments/Ambari-2.2.1.0/bk_upgrading_Ambari/content/_upgrading_hdp_stack.html
08-10-2016
07:52 PM
@Josh Elser Thanks for the quick response. Yes, that's true, we need to set export HBASE_CONF_PATH=HBASE_CONFIG_DIR, where HBASE_CONF_PATH=/etc/hbase/conf, but I have seen that the link uses both. I'm familiar with setting it in /etc/profile, but I don't want to make any OS-level config changes. Is there any way we can set it from Ambari?
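A minimal sketch of what an Ambari-managed, cluster-level setting could look like, assuming the line is appended to the hbase-env template (Ambari > HBase > Configs > Advanced hbase-env) and HBase is restarted afterwards so Phoenix Query Server inherits it; the paths are the defaults mentioned in this thread and may differ on your cluster:

```shell
# Hypothetical snippet for the hbase-env template in Ambari;
# after saving, restart HBase so PQS picks up the variable.
export HBASE_CONF_PATH=/etc/hbase/conf:/etc/hadoop/conf
echo "$HBASE_CONF_PATH"
```

This avoids touching /etc/profile, since Ambari pushes the template to every HBase host.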
08-10-2016
07:47 PM
@Sunile Manjee Thanks for the information. I have seen that, but we are using HDP 2.3, so I would like to know whether there is a better way than symbolic links.
08-10-2016
07:12 PM
Hi, to configure Phoenix to run in a secure Hadoop cluster, I need to set HBASE_CONF_PATH=/etc/hbase/conf:/etc/hadoop/conf every time. Is there any way to set that path at the cluster level, instead of running export HBASE_CONF_PATH=/etc/hbase/conf:/etc/hadoop/conf before every JDBC connection through PQS? FYI, I have tried setting it in hbase-env-template.sh, but it doesn't work. Any help is highly appreciated; thanks in advance.
Labels:
- Apache HBase
- Apache Phoenix
08-09-2016
09:43 PM
1 Kudo
@Gulshad Ansari
Perform the actions below as the hdfs user. The output of fsck will be very verbose, but it will mention which blocks are corrupt. We can do some grepping of the fsck output so that we aren't "reading through a firehose":

hdfs fsck / | egrep -v '^\.+' | grep -v replica | grep -v Replica

or

hdfs fsck hdfs://ip.or.hostname.of.namenode:8020/ | egrep -v '^\.+' | grep -v replica | grep -v Replica

This lists the affected files without the wall of dots, along with files that currently have under-replicated blocks (which isn't necessarily an issue). The output should include lines like these for each affected file:

/path/to/filename.fileextension: CORRUPT blockpool BP-1016133662-10.29.100.41-1415825958975 block blk_1073904305
/path/to/filename.fileextension: MISSING 1 blocks of total size 15620361 B

The next step is to determine the importance of the file: can it just be removed and copied back into place, or does it contain sensitive data that needs to be regenerated? If it's easy enough to replace the file, that's the route I would take. To remove the corrupted file from your Hadoop cluster, this command moves it to the trash:

hdfs dfs -rm /path/to/filename.fileextension
hdfs dfs -rm hdfs://ip.or.hostname.of.namenode:8020/path/to/filename.fileextension

Or you can skip the trash to permanently delete it (which is probably what you want to do):

hdfs dfs -rm -skipTrash /path/to/filename.fileextension
hdfs dfs -rm -skipTrash hdfs://ip.or.hostname.of.namenode:8020/path/to/filename.fileextension

Alternatively, as the hdfs user, the command below deletes every file with corrupt blocks in one shot, instead of handling them individually as above:

hdfs fsck / -delete
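The filter chain above can be tried offline without a cluster. The sample lines below are fabricated stand-ins for real fsck output, used only to show which lines the egrep/grep chain keeps and which it drops:

```shell
# Fabricated fsck-style output: a dots line, a CORRUPT line, a MISSING
# line, and an under-replicated line. Only CORRUPT/MISSING should survive.
sample='............
/path/to/filename.fileextension: CORRUPT blockpool BP-1016133662-10.29.100.41-1415825958975 block blk_1073904305
/path/to/filename.fileextension: MISSING 1 blocks of total size 15620361 B
/some/file: Under replicated blk_2. Target Replicas is 3 but found 2 replica(s).'
# Same filter chain as in the answer: strip dot-only progress lines,
# then drop replica-related noise.
filtered=$(printf '%s\n' "$sample" | egrep -v '^\.+' | grep -v replica | grep -v Replica)
printf '%s\n' "$filtered"
```

The dots line matches `^\.+` and the under-replicated line contains "replica", so only the two lines naming broken files remain.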
08-05-2016
02:24 PM
@Sushant Bharti Try increasing the virtual machine's RAM as shown in the link below: https://hortonworks.com/wp-content/uploads/2015/05/Import_on_Vbox_5_11_2015.pdf
08-05-2016
03:24 AM
Hi all, has anyone installed and configured Kylin with a secured (Kerberos) cluster on either HDP 2.3 or 2.4? If yes, can you please share the complete process/steps? I would also like to know how to connect to HBase using Kylin. Does Hortonworks support Kylin configuration? Thanks in advance.
Labels:
- Apache HBase
08-04-2016
03:08 AM
@Amila De Silva If you are looking for application logs:

yarn logs -applicationId <application ID>

Example:

yarn logs -applicationId application_1470266735999_002
08-04-2016
02:59 AM
@Anil Khiani Can you please check whether setting export PIG_CLASSPATH=$HADOOP_HOME/conf/ helps?
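A quick sanity check of that setting, assuming HADOOP_HOME points at your Hadoop install; the path used here is an illustrative stand-in, not necessarily where Hadoop lives on your nodes:

```shell
# Stand-in value for HADOOP_HOME; substitute your real install path.
HADOOP_HOME=/usr/lib/hadoop
# Export PIG_CLASSPATH as suggested above and confirm it resolves.
export PIG_CLASSPATH=$HADOOP_HOME/conf/
echo "$PIG_CLASSPATH"
```

If the echoed path is empty or wrong, HADOOP_HOME was not set in the shell that launches Pig.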