Member since: 02-02-2016
Posts: 583
Kudos Received: 518
Solutions: 98

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 4188 | 09-16-2016 11:56 AM |
|  | 1748 | 09-13-2016 08:47 PM |
|  | 6940 | 09-06-2016 11:00 AM |
|  | 4170 | 08-05-2016 11:51 AM |
|  | 6244 | 08-03-2016 02:58 PM |
05-22-2016
11:00 PM
@ammu ch Can you please share the error logs you're getting while running the query through Beeline? Also, please try increasing the HiveServer2 heap size and see if you still hit the same issue.
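For example, a rough sketch of capturing the Beeline error output (the JDBC URL, user, and query are placeholders):

```
# Run the failing query through Beeline with verbose output and keep the logs.
beeline -u "jdbc:hive2://<hs2-host>:10000/default" -n <user> \
        --verbose=true \
        -e "SELECT COUNT(*) FROM <your_table>;" 2>&1 | tee beeline-error.log
```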
05-22-2016
09:31 PM
@Anandha L Ranganathan Setting hive.server2.enable.doAs=false will always work, since jobs will then run as the user who owns the HS2 process. If everything on the configuration side looks correct with regard to impersonation, then, if possible, it is worth restarting the cluster processes to see whether that resolves the issue.
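As a quick sanity check, something like the following (connection details are placeholders) prints the value HS2 is actually using:

```
# Print the effective impersonation setting from a Beeline session.
beeline -u "jdbc:hive2://<hs2-host>:10000/default" -n <user> \
        -e "set hive.server2.enable.doAs;"
```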
05-21-2016
10:55 PM
@Montrial Harrell Please check for core-site.xml as well: grep -iR core-site.xml /usr/hdp/&lt;version&gt;/pig/ If you don't see any jar file in the output, then I think you can try copying these two XML files from /etc/hadoop/conf to the pig/conf directory and see if that resolves the issue.
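A minimal sketch of those two steps, assuming the standard HDP layout (the exact pair of XML files to copy is whatever the thread identified; the names below are illustrative):

```
# Check whether any Pig jar bundles its own core-site.xml.
grep -iR core-site.xml /usr/hdp/<version>/pig/

# If nothing turns up, copy the cluster conf files into Pig's conf dir.
cp /etc/hadoop/conf/core-site.xml /usr/hdp/<version>/pig/conf/
cp /etc/hadoop/conf/hdfs-site.xml /usr/hdp/<version>/pig/conf/
```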
05-21-2016
10:42 PM
@Montrial Harrell Thanks for sharing that info. Can you please run the command below and see which Pig jar contains these conf files? grep -iR hadoop-site.xml /usr/hdp/&lt;version&gt;/pig/
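If the grep reports a jar as a binary match, listing that jar's contents confirms whether the conf file is actually bundled inside it (the jar path is a placeholder):

```
# grep prints "Binary file ... matches" for jars; list the jar to confirm.
unzip -l <path-to-matching-jar> | grep -i site.xml
```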
05-21-2016
10:05 PM
2 Kudos
@Montrial Harrell It seems Pig doesn't know where to find the conf files. Can you please set the env properties below inside pig-env.sh (see the sketch after this post) and run it again? export HADOOP_CONF_DIR=$HADOOP_CONF_DIR:/etc/hadoop/conf export PIG_CLASSPATH=$PIG_CLASSPATH:$HADOOP_CONF_DIR Also, please let us know your HDP version and whether you changed any properties in the cluster recently; I don't think this should be the default behavior.
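For reference, this is what the additions to pig-env.sh would look like (paths assume the default HDP conf location):

```
# pig-env.sh: point Pig at the cluster's Hadoop client configuration.
export HADOOP_CONF_DIR=$HADOOP_CONF_DIR:/etc/hadoop/conf
export PIG_CLASSPATH=$PIG_CLASSPATH:$HADOOP_CONF_DIR
```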
05-21-2016
02:33 PM
I'm saying the Ambari server and agent processes will be started by an init.d script, and I think this was implemented long ago: https://issues.apache.org/jira/browse/AMBARI-1492 . The rest of the HDP component services, like the NN, SNN, DN, etc., should be started manually through the Ambari UI or APIs.
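You can verify that on the Ambari host itself, for example (chkconfig is the RHEL/CentOS tool; adjust for your OS):

```
# Confirm the init.d scripts are registered to start on boot.
chkconfig --list ambari-server
chkconfig --list ambari-agent

# Or check the processes directly.
service ambari-server status
service ambari-agent status
```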
05-21-2016
01:43 PM
1 Kudo
@Sridhar Bandaru
The Ambari server and agent should start after an OS reboot. As far as the Hadoop services are concerned, we recommend starting them through the Ambari UI or APIs. Also, some services that collect metrics from the OS and Hadoop will start automatically. Hope that makes sense. Thanks
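If you do want the services started programmatically after a reboot, the Ambari REST API can do it; a rough sketch (host, credentials, and cluster name are placeholders):

```
# Ask Ambari to start all services in the cluster via the REST API.
curl -u admin:<password> -H "X-Requested-By: ambari" -X PUT \
  -d '{"RequestInfo":{"context":"Start All Services"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  http://<ambari-host>:8080/api/v1/clusters/<cluster-name>/services
```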
05-21-2016
10:34 AM
@Sunile Manjee You can put all the variables inside a .hiverc file, which is sourced automatically when the Hive shell starts. And yes, you have to replicate this file to every user's home directory or to the directory from which they run the Hive shell. Moreover, I don't think the modified value of a variable in one Hive shell session is visible in another (I haven't tested it).
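A minimal sketch of such a .hiverc (the variable names, values, and table are only examples):

```
# Create a per-user .hiverc; the Hive CLI sources it automatically on startup.
cat > ~/.hiverc <<'EOF'
SET hivevar:env=dev;
SET hive.cli.print.header=true;
EOF

# In a new hive shell session the variable is then available as ${hivevar:env},
# e.g.:  SELECT * FROM ${hivevar:env}_db.sample_table LIMIT 5;
```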