Member since: 03-14-2016
Posts: 4721
Kudos Received: 1111
Solutions: 874
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2461 | 04-27-2020 03:48 AM
 | 4910 | 04-26-2020 06:18 PM
 | 3986 | 04-26-2020 06:05 PM
 | 3239 | 04-13-2020 08:53 PM
 | 4950 | 03-31-2020 02:10 AM
10-19-2018
05:11 AM
@Shivam Aggarwal The message "Not a valid jar" usually indicates a corrupted JAR. The easiest way to verify whether the JAR is corrupted is to check if the "jar" command can list its contents:

# $JAVA_HOME/bin/jar -tvf /path/to/your_jar.jar

If you see the same message while running the above command as well, then your JAR is corrupt and you should obtain a correct JAR. The other option is to compare the "md5sum" of the JAR in the working and non-working environments (to verify whether the two JARs are identical bit by bit). Example:

# md5sum /path/to/your_jar.jar
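As a minimal sketch that combines both checks in one place (the JAR path and the hostname "working-host" below are placeholders for your environment):

# JAR=/path/to/your_jar.jar
# $JAVA_HOME/bin/jar -tvf "$JAR" > /dev/null && echo "JAR OK" || echo "JAR corrupt"
# md5sum "$JAR"
# ssh working-host "md5sum $JAR"

The two md5sum outputs should match exactly; if they differ, re-copy the JAR from the working environment.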
10-18-2018
10:23 PM
@Prashant Gupta We can see that Ambari triggered the following command, which did not get executed successfully, so there may be some additional logging inside the ResourceManager logs. I suggest you check and share the RM logs; they should show the cause of the command execution failure. Another option to isolate the issue is to try restarting the ResourceManager manually using the same command, to see if it works:

# su - yarn
# ulimit -c unlimited; export HADOOP_LIBEXEC_DIR=/usr/hdp/3.0.0.0-1634/hadoop/libexec && /usr/hdp/3.0.0.0-1634/hadoop-yarn/bin/yarn --config /usr/hdp/3.0.0.0-1634/hadoop/conf --daemon start resourcemanager

If you face any issue then please share the logs.
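To locate the RM log itself, the path below assumes the default HDP log directory for YARN (adjust it if your cluster overrides the YARN log directory):

# ls -lrt /var/log/hadoop-yarn/yarn/
# tail -n 200 /var/log/hadoop-yarn/yarn/*resourcemanager*.log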
10-18-2018
10:18 PM
1 Kudo
@Lok! Reddy We see the failure as follows: resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install ambari-infra-solr-client' returned 1. Error: Nothing to do This can happen if, for some reason, "/etc/yum.repos.d/ambari.repo" is missing on the host where the installation is failing. So can you please check whether the repo file exists?

# cat /etc/yum.repos.d/ambari.repo
# yum clean all
# yum info ambari-infra-solr-client

After validating the ambari.repo file (and its baseurl access from the host), you can try running the install command manually on your own (or continue your previous operation from the Ambari UI):

# yum install ambari-infra-solr-client
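To confirm that the baseurl in ambari.repo is actually reachable from this host, a quick check like the following can help (the URL is a placeholder; substitute the exact baseurl value from your own ambari.repo):

# grep baseurl /etc/yum.repos.d/ambari.repo
# curl -I <baseurl_from_ambari_repo>/repodata/repomd.xml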
10-18-2018
11:38 AM
1 Kudo
@HENI MAHER Please make sure that you are entering the correct password for the "root" user:

# mysql -u root -p
Enter Password: <Enter_root_password>

After that you will need to enter the following:

mysql> use mysql;
mysql> GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY '<Enter_root_password>';
mysql> GRANT ALL PRIVILEGES ON *.* TO 'root'@'worker1.hadoop.com' IDENTIFIED BY '<Enter_root_password>';
mysql> GRANT ALL PRIVILEGES ON *.* TO 'root'@'worker2.hadoop.com' IDENTIFIED BY '<Enter_root_password>';
mysql> FLUSH PRIVILEGES;

You can check the "user" table in the "mysql" database to see whether the "root" user has access from host 'worker1.hadoop.com' by running the following query:

mysql> SELECT User, Host FROM user;
+------+--------------------+
| User | Host               |
+------+--------------------+
| hive | %                  |
| root | 127.0.0.1          |
| root | ::1                |
| root | worker1.hadoop.com |
| root | worker2.hadoop.com |
| root | localhost          |
+------+--------------------+
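Once the grants are in place, you can verify them end to end by logging in from the worker host itself (this assumes the mysql client is installed there; the database server hostname is a placeholder):

# ssh worker1.hadoop.com
# mysql -u root -p -h <mysql_server_hostname> -e "SELECT 1;"

If that query returns without an "Access denied" error, the grant for 'root'@'worker1.hadoop.com' is working.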
10-18-2018
05:09 AM
@Aditya Sirna As the stack trace is coming from the Hadoop APIs directly, it will be better to isolate the issue first (whether the problem is on the Spark config side or in HDFS itself). Error: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby So have you tried running simple HDFS commands to see whether those also return the same exception or a different one?

# su - hdfs -c "hdfs dfs -ls /user"

If you see the same message, then try restarting the HDFS service once and test again. If you notice the same issue again (like "if it fails to start when namenode on host1 is standby"), then the NameNode and ZKFC logs might give us more insight. It will also be good to check whether the NameNodes have enough RAM and their heap set up properly.
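To quickly confirm which NameNode is active and which is standby, you can also query the HA state directly (the service IDs "nn1" and "nn2" below are common defaults; check dfs.ha.namenodes.<nameservice> in your hdfs-site.xml for the actual values):

# su - hdfs -c "hdfs haadmin -getServiceState nn1"
# su - hdfs -c "hdfs haadmin -getServiceState nn2"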
10-18-2018
02:51 AM
@Surendra Ravella Reference: https://community.hortonworks.com/content/supportkb/49610/kerberised-namanode-fails-with-connection-refused.html
10-18-2018
02:47 AM
@Surendra Ravella After how much time do you see the "Receive timed out" message? In your Java code, can you enable the Kerberos debug option?

# java -Dsun.security.krb5.debug=true YourJavaCode

Also, can you please check whether you are passing the correct details about the KDC? Are you using the correct "/etc/krb5.conf" in the path? Can you also check whether you are setting the following parameter in your Kerberos config? It forces the communication channel to use TCP instead of UDP: udp_preference_limit = 1 If possible, can you please share the code snippet? Also share the output of the following command from the host where you are running the Java code (88 is the standard KDC port):

# telnet $KDC_HOST 88
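For reference, udp_preference_limit belongs in the [libdefaults] section of krb5.conf. A minimal sketch (the realm name and KDC hostname below are placeholders):

[libdefaults]
  default_realm = EXAMPLE.COM
  udp_preference_limit = 1

[realms]
  EXAMPLE.COM = {
    kdc = kdc.example.com
  }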
10-18-2018
02:37 AM
@Sivakumar Mahalingam Notice the "-rw-r--r--" permission string, which indicates that the supplied path is a file (not a directory). That is the reason you got the earlier error: hdfs://quickstart.cloudera:8020/etc/hive/data/weather is not a directory or unable to create one You have the mentioned PATH as a "file"; it should be a directory. Try running the following command to confirm:

# hdfs dfs -ls /etc/hive/data
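One way to fix this (a sketch; adjust the file name, paths, and permissions to your environment) is to move the existing file aside, create the directory, and then move the data file into it:

# hdfs dfs -mv /etc/hive/data/weather /etc/hive/data/weather.dat
# hdfs dfs -mkdir /etc/hive/data/weather
# hdfs dfs -mv /etc/hive/data/weather.dat /etc/hive/data/weather/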
10-18-2018
02:22 AM
@Sivakumar Mahalingam Additionally, there are a few syntax errors in your Hive query. Also, "date" is a reserved keyword, so you should not use it; otherwise you will have to set an additional Hive parameter to allow the use of reserved keywords, like the following: set hive.support.sql11.reserved.keywords=false; Create the directory on HDFS:

# su - hdfs -c "hdfs dfs -mkdir -p /user/hive/data/weather"
# su - hdfs -c "hdfs dfs -chown -R hive:hadoop /user/hive/data/weather"
# su - hdfs -c "hdfs dfs -chmod -R 777 /user/hive/data/weather"

NOTE: The above are just example directory-creation instructions; you should change the permissions based on your requirements. Then try creating the table as follows:

CREATE TABLE weather (wban INT, date1 STRING, precip INT)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LOCATION '/user/hive/data/weather';
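Once the table is created, a quick sanity check (the sample file /tmp/weather.csv is a placeholder for your actual data file) is to copy a CSV into the table location and query it:

# su - hdfs -c "hdfs dfs -put /tmp/weather.csv /user/hive/data/weather/"
hive> SELECT * FROM weather LIMIT 5;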
10-18-2018
02:01 AM
@Sivakumar Mahalingam The PATH is not valid. Hive is expecting the path to be on HDFS, so please verify whether the PATH exists on HDFS:

# su - hdfs -c "hdfs dfs -ls /etc/hive/data/weather"
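If the path does not exist yet, you can create it first (a sketch; set the ownership and permissions to match your setup):

# su - hdfs -c "hdfs dfs -mkdir -p /etc/hive/data/weather"
# su - hdfs -c "hdfs dfs -chown hive:hadoop /etc/hive/data/weather"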