Member since: 05-02-2017
Posts: 88
Kudos Received: 173
Solutions: 15
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 7405 | 09-27-2017 04:21 PM |
| | 3392 | 08-17-2017 06:20 PM |
| | 3081 | 08-17-2017 05:18 PM |
| | 3660 | 08-11-2017 04:12 PM |
| | 5145 | 08-08-2017 12:43 AM |
06-06-2017 05:35 AM
@arjun more You can use the commands below to add a user directory in HDFS:

su - <HDFS_USER>
hdfs dfs -mkdir /user/yarn
hdfs dfs -chown -R yarn:hdfs /user/yarn

This will create the yarn user's home directory in HDFS.
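A quick way to verify the result afterwards, a minimal sketch assuming nothing beyond the standard hdfs dfs client:

```bash
# Confirm the directory exists and is owned by yarn:hdfs
hdfs dfs -ls /user | grep yarn
```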
06-06-2017 05:04 AM
5 Kudos
@arjun more You can follow the steps below to resolve it (see the sketch after this list):

- Add the following classpath entry in "/etc/sqoop/conf/sqoop-env.sh": export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/current/hive-client/lib/*
- Make sure the Hive client is installed on the node where the Oozie server is running.
- Add the hive-exec jar to the Oozie lib.
- Also make sure the yarn user's home directory is present in HDFS (/user/yarn).

Then restart the Oozie server and try again.
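A minimal sketch of the classpath and sharelib steps, assuming an HDP-style layout where the Oozie sharelib lives under /user/oozie/share/lib; the lib_<timestamp> directory name and <OOZIE_HOST> are placeholders that vary per install:

```bash
# Append the Hive client jars to Sqoop's classpath on the Oozie server node
echo 'export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/current/hive-client/lib/*' >> /etc/sqoop/conf/sqoop-env.sh

# Copy the hive-exec jar into the sqoop sharelib (path is an assumption; adjust to your sharelib)
su - oozie -c "hdfs dfs -put /usr/hdp/current/hive-client/lib/hive-exec-*.jar /user/oozie/share/lib/lib_<timestamp>/sqoop/"

# Ask Oozie to pick up the updated sharelib (replace <OOZIE_HOST> with your Oozie server)
oozie admin -oozie http://<OOZIE_HOST>:11000/oozie -sharelibupdate
```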
06-06-2017 03:55 AM
3 Kudos
@white wartih Can you check whether your Ambari Metrics Collector process is running or not? On the host where the Metrics Collector is installed, execute the following command:

# netstat -tulpn | grep 6188

If the command returns nothing, please check the Ambari Metrics Collector logs in the /var/log/ambari-metrics-collector/ directory. Also try restarting the Ambari Metrics Collector and try again.
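A short sketch of that check, assuming the collector's default port (6188) and the usual log file name, which may differ slightly per version:

```bash
# Is the Metrics Collector listening on its default port?
netstat -tulpn | grep 6188

# If nothing is returned, inspect the collector log for the failure reason
tail -n 200 /var/log/ambari-metrics-collector/ambari-metrics-collector.log

# Then restart the Ambari Metrics Collector (via the Ambari UI) and re-check the port
```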
06-01-2017 09:03 AM
@Jay SenSharma Thanks.
06-01-2017 07:37 AM
4 Kudos
@arjun more
This might help you, please try it. HBase keeps a table named "namespace" for maintaining table information, and as your error states, it already exists. When the HMaster process starts, it creates the namespace directory under /hbase, which is why it is showing the TableExistsException. You have to manually repair the HBase metadata using the offline command:

$HBASE_HOME/bin/hbase org.apache.hadoop.hbase.util.hbck.OfflineMetaRepair

Then start HBase. If that does not work, use the ZooKeeper client on the HBase Master host to remove the /hbase znode:

$ZK_HOME/bin/zkCli.sh -server <ZK_SERVER>
-> ls /
-> rmr /hbase
-> exit
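A quick way to confirm the cleanup, assuming the standard zkCli and hbase shell clients are available; paths and <ZK_SERVER> are placeholders:

```bash
# After restarting HBase, the HMaster should have recreated the /hbase znode
$ZK_HOME/bin/zkCli.sh -server <ZK_SERVER> ls /hbase

# Basic sanity check that the cluster is back up
echo "status" | hbase shell
```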
05-31-2017 03:30 AM
@PJ This might be a problem with your network, due to which the heartbeat is being interrupted. You can decommission the node and commission it back, one node at a time.
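If the heartbeat in question is a DataNode heartbeat, a general sketch for checking node state before decommissioning (not specific to the original thread; <NODE_HOST> is a placeholder):

```bash
# Show live/dead DataNodes and their "Last contact" times
hdfs dfsadmin -report

# Check basic connectivity to the affected node
ping -c 5 <NODE_HOST>
```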
05-29-2017 08:33 AM
3 Kudos
@ed day The error "Access denied for user 'hive'@'s2.royble.co.uk' (using password: YES)" itself states that you are entering the wrong password for the hive user created in MySQL. To solve this, log in to MySQL and drop the hive database:

mysql> drop database hive;

Then retry the installation of the Hive service in Ambari.
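A quick way to confirm whether the credentials are actually the problem, assuming a MySQL 5.x server; the password placeholder must match what was configured in Ambari:

```bash
# Try logging in as the hive user with the password configured in Ambari
mysql -u hive -p -h s2.royble.co.uk

# If that fails, reset the password from the root account so it matches Ambari's configuration
mysql -u root -p -e "SET PASSWORD FOR 'hive'@'s2.royble.co.uk' = PASSWORD('<HIVE_DB_PASSWORD>');"
```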
05-29-2017 02:11 AM
@Mathi Murugan It will not result in data loss. Can you try setting the above properties and check again?
05-29-2017 02:09 AM
@punit If your Hive JDBC problem is solved, please accept the answer.
05-26-2017 12:28 PM
3 Kudos
@Hugo Felix Can you update the values of the following properties in job.properties and try again?

Current values in job.properties:

job-tracker=hdfs://192.168.0.73:8021
jobTracker=hdfs://192.168.0.73:8021

Replace them with:

job-tracker=192.168.0.73:<PORT>
jobTracker=192.168.0.73:<PORT>

where <PORT> is the value of yarn.resourcemanager.address from the Ambari UI -> YARN. Then try again; a sketch of the corrected file follows below.
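A minimal job.properties sketch with the corrected entries; <PORT> is whatever yarn.resourcemanager.address reports in Ambari, and any other entries in your file stay as they are:

```
# job.properties (sketch): jobTracker should be the ResourceManager host:port, not an hdfs:// URL
jobTracker=192.168.0.73:<PORT>
job-tracker=192.168.0.73:<PORT>
```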