Member since: 03-14-2016
Posts: 4721
Kudos Received: 1111
Solutions: 874
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2829 | 04-27-2020 03:48 AM |
| | 5502 | 04-26-2020 06:18 PM |
| | 4682 | 04-26-2020 06:05 PM |
| | 3712 | 04-13-2020 08:53 PM |
| | 5618 | 03-31-2020 02:10 AM |
09-02-2019
01:45 AM
@CoPen Your ambari-agent.ini file is pointing to an incorrect Ambari server; the `hostname` value should not contain an IP address and hostnames together. You have the following entry:

```
[server]
hostname=192.168.56.101 master master.hadoop.com
```

Ideally it should be the following:

```
[server]
hostname=master.hadoop.com
```
09-02-2019
01:41 AM
@peter_svarc You can open this link: https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.4/index.html Then, at the top-left corner, you can switch to the desired version ("3.1.4 / 3.1.0", etc.). If your question is answered, please mark the reply as the accepted solution. If you find a reply useful, say thanks by clicking the thumbs-up button.
09-01-2019
10:14 PM
1 Kudo
@yukti You seem to be using an incorrect version of the "hive-exec" / "hive-metastore" JAR. Can you please tell me where you got the 3.1.0.3.1.0.0-78 version of the JARs? As your Kafka JAR version is "kafka-handler-3.1.0.3.1.0.0-78.jar", it looks like you might have downloaded that JAR from an HDP 3.1 installation. Is that correct? If yes, then please also take the hive-metastore JAR of the same version and try again. You can see the difference between the "hive-metastore-3.1.0.3.1.0.0-78.jar" and "hive-metastore-0.9.0.jar" JARs below, taken from my HDP 3.1 installation:

```
# ls -l /usr/hdp/3.1.0.0-78/hive/lib/hive-metastore.jar
lrwxrwxrwx. 1 root root 35 Feb 22 2019 /usr/hdp/3.1.0.0-78/hive/lib/hive-metastore.jar -> hive-metastore-3.1.0.3.1.0.0-78.jar

# /usr/jdk64/jdk1.8.0_112/bin/javap -cp /usr/hdp/current/hive-metastore/lib/hive-standalone-metastore-3.1.0.3.1.0.0-78.jar org.apache.hadoop.hive.metastore.DefaultHiveMetaHook
Compiled from "DefaultHiveMetaHook.java"
public abstract class org.apache.hadoop.hive.metastore.DefaultHiveMetaHook implements org.apache.hadoop.hive.metastore.HiveMetaHook {
  public org.apache.hadoop.hive.metastore.DefaultHiveMetaHook();
  public abstract void commitInsertTable(org.apache.hadoop.hive.metastore.api.Table, boolean) throws org.apache.hadoop.hive.metastore.api.MetaException;
  public abstract void preInsertTable(org.apache.hadoop.hive.metastore.api.Table, boolean) throws org.apache.hadoop.hive.metastore.api.MetaException;
  public abstract void rollbackInsertTable(org.apache.hadoop.hive.metastore.api.Table, boolean) throws org.apache.hadoop.hive.metastore.api.MetaException;
}
```

On the other hand, we can see that the "hive-metastore-0.9.0.jar" JAR does not contain that class:

```
# /usr/jdk64/jdk1.8.0_112/bin/javap -cp /tmp/hive-metastore-0.9.0.jar org.apache.hadoop.hive.metastore.DefaultHiveMetaHook
Error: class not found: org.apache.hadoop.hive.metastore.DefaultHiveMetaHook
```

If your question is answered, please mark the reply as the accepted solution. If you find a reply useful, say thanks by clicking the thumbs-up button.
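The presence check that javap performs above can also be approximated without a JDK: a JAR is just a ZIP archive, so the class is present exactly when the matching `.class` entry exists. A minimal Python sketch (the demo JAR below is a throwaway stand-in built in a temp directory, not a real Hive artifact):

```python
import os
import tempfile
import zipfile

def jar_contains_class(jar_path, class_name):
    """Return True if the JAR (a ZIP archive) contains the given class entry."""
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()

# Demo on a throwaway JAR; the real check would point at e.g.
# hive-metastore-0.9.0.jar on your classpath.
demo_jar = os.path.join(tempfile.mkdtemp(), "demo.jar")
with zipfile.ZipFile(demo_jar, "w") as jar:
    jar.writestr("org/apache/hadoop/hive/metastore/DefaultHiveMetaHook.class", b"")

present = jar_contains_class(demo_jar, "org.apache.hadoop.hive.metastore.DefaultHiveMetaHook")
missing = jar_contains_class(demo_jar, "com.example.NoSuchClass")
print(present, missing)  # True False
```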
09-01-2019
08:48 PM
@yukti You seem to be using the old "hive-metastore-0.9.0.jar" JAR file together with "kafka-handler-3.1.0.3.1.0.0-78.jar". Do you see the mentioned class inside your JAR?

```
# javap -cp /PATH/TO/hive-metastore-0.9.0.jar org.apache.hadoop.hive.metastore.DefaultHiveMetaHook
```

As your Kafka JAR version is "kafka-handler-3.1.0.3.1.0.0-78.jar", it has a dependency on the hive-exec module of the same version, I guess. Just look at the "META-INF/maven/org.apache.hive/kafka-handler/pom.xml" file inside "kafka-handler-3.1.0.3.1.0.0-78.jar" and you will see that it needs the same version of hive-exec:

```
# grep 'hive-exec' -A2 META-INF/maven/org.apache.hive/kafka-handler/pom.xml
        <artifactId>hive-exec</artifactId>
        <scope>provided</scope>
        <version>${project.version}</version>
```

So I guess you should be using one of the following JARs (if you are using an HDP installation):

```
/usr/hdp/current/hive-metastore/lib/hive-standalone-metastore-3.1.0.3.1.0.0-78.jar
```

OR

```
/usr/hdp/current/hive-metastore/lib/hive-exec-3.1.0.3.1.0.0-78.jar
```
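The version lookup that the grep above performs can also be sketched programmatically: read the dependency section and expand Maven's `${project.version}` placeholder. The pom fragment below is an illustrative stub modeled on the grep output, not the actual kafka-handler pom:

```python
import xml.etree.ElementTree as ET

# Illustrative stub of the dependency section that `grep 'hive-exec' -A2`
# printed above; the real file lives inside the kafka-handler JAR at
# META-INF/maven/org.apache.hive/kafka-handler/pom.xml.
POM_XML = """
<project>
  <version>3.1.0.3.1.0.0-78</version>
  <dependencies>
    <dependency>
      <artifactId>hive-exec</artifactId>
      <scope>provided</scope>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
</project>
"""

def hive_exec_version(pom_text):
    """Resolve the hive-exec dependency version, expanding ${project.version}."""
    root = ET.fromstring(pom_text)
    project_version = root.findtext("version")
    for dep in root.iter("dependency"):
        if dep.findtext("artifactId") == "hive-exec":
            version = dep.findtext("version")
            if version == "${project.version}":
                version = project_version
            return version
    return None

print(hive_exec_version(POM_XML))  # 3.1.0.3.1.0.0-78
```

This is why mixing hive-metastore-0.9.0.jar with a 3.1.0.3.1.0.0-78 kafka-handler fails: the handler was compiled against the same-numbered hive-exec.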
08-28-2019
11:53 PM
1 Kudo
There is no out-of-the-box option to downgrade the HDP version. Also, we do not see any HDP 2.7 version released, and HDP 3.x and HDP 2.6 have major differences in terms of components. Any specific reason you are looking for a downgrade? If this is a freshly built cluster on HDP 3.x and you want to use HDP 2.6, then it is better to freshly install HDP 2.6, which will save a lot of time and effort compared to manually fixing and downgrading all the components and configs.
08-28-2019
06:14 AM
@sampathkumar_ma We see the error is caused by the following:

```
Caused by: java.lang.IllegalArgumentException
    at java.nio.Buffer.limit(Buffer.java:275)
    at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:365)
```

So can you please let us know which JDK you are using to run Spark?

```
# ps -ef | grep -i spark
# java -version
```

Since when have you been noticing this error? Were any recent changes made to the host/config?
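Note that the JDK on the PATH (what `java -version` reports) is not always the one the Spark process was launched with. On Linux, one way to confirm which binary a running process actually executes is to read its /proc/<pid>/exe symlink. A small sketch, demoed on the current process since PIDs from `ps` are host-specific:

```python
import os

def process_exe(pid):
    """Return the absolute path of the executable behind a PID (Linux /proc)."""
    return os.readlink("/proc/%d/exe" % pid)

# Demo on the current process; for the Spark case you would pass the PID
# reported by `ps -ef | grep -i spark` to see which java binary it runs.
exe = process_exe(os.getpid())
print(exe)
```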
08-28-2019
06:07 AM
1 Kudo
@Manoj690 Are you sure that the NameNode is running on "localhost" (where you are opening the mentioned URL in the browser)?

1. Can you specify the NameNode IP address/hostname in the URL instead of "localhost"?

2. Can you also check whether the NameNode is listening on port 50070, and whether that port is open and the firewall is disabled on the NameNode host?

```
# netstat -tnlpa | grep 50070
# service iptables stop
```

3. Please check whether you can telnet to the NameNode hostname and port from the machine where you are running the browser:

```
# telnet $NAMENODE_HOST 50070
(OR)
# nc -v $NAMENODE_HOST 50070
```

4. Check and share the NameNode log. Usually it can be found at "/var/log/hadoop/hdfs/hadoop-hdfs-namenode-xxxxxxxxxxxxxxxxxx.log".
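If telnet/nc are not installed on the client machine, the same reachability test can be sketched in Python. The local listener below is only there to make the demo self-contained; the real check would target the NameNode host and port 50070:

```python
import socket

def port_is_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo against a listener we open ourselves; the real check would be
# port_is_open("<namenode-host>", 50070) run from the browser machine.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))        # let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]

open_result = port_is_open("127.0.0.1", port)
listener.close()
closed_result = port_is_open("127.0.0.1", port)
print(open_result, closed_result)  # True False
```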
08-28-2019
05:51 AM
@Manoj690 Try this: first switch to the "root" user using "su -", then from the "root" user account run the "su - hdfs" command:

```
# su -
# su - hdfs
```
08-28-2019
05:44 AM
1 Kudo
@Manoj690 Looks like your previous error is resolved. For the new error, it would have been better to open a new thread to avoid confusing other readers. The new error is HDFS related (and completely unrelated to the MySQL error which you originally reported in this thread):

```
19/08/28 18:06:43 ERROR tool.ImportTool: Import failed: org.apache.hadoop.security.AccessControlException: Permission denied: user=gaian, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x
```

This HDFS error means that the user "gaian" who is running the sqoop job does not have write permission on the "/user" directory. You might be able to fix it by creating a home directory for that user:

```
# su - hdfs
# hdfs dfs -mkdir /user/gaian
# hdfs dfs -chown -R gaian:hadoop /user/gaian
# hdfs dfs -chmod -R 755 /user/gaian
```

If your question is answered, please mark the reply as the accepted solution. If you find a reply useful, say thanks by clicking the thumbs-up button.
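For context on why the permission string "drwxr-xr-x" blocks the write: HDFS reuses the POSIX rwx permission model, so a 755 directory is writable only by its owner. A local-filesystem sketch (plain Python on a temp directory, not HDFS):

```python
import os
import stat
import tempfile

# Local-filesystem analogy of the HDFS fix above: "drwxr-xr-x" (755)
# means only the owner has the write bit set.
d = tempfile.mkdtemp()
os.chmod(d, 0o755)

mode = os.stat(d).st_mode
owner_can_write = bool(mode & stat.S_IWUSR)
others_can_write = bool(mode & stat.S_IWOTH)

print(oct(stat.S_IMODE(mode)), owner_can_write, others_can_write)
# A user like "gaian" writing to a 755 directory owned by "hdfs" falls in
# the "others" class, hence the AccessControlException.
```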
08-28-2019
05:29 AM
1 Kudo
@Manoj690 As I mentioned in the other thread you opened for a similar error (https://community.cloudera.com/t5/Support-Questions/Sqoop-jdbc-error/m-p/269108#M206636), your latest error is:

```
ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Access denied for user 'ambari1'@'localhost' (using password: NO)
```

This means you will need to make sure of the following two things:

1. You are entering the correct password for the "ambari1" user in the sqoop command, using the "--password" option.

2. Add a grant as follows for the "ambari1" user in the MySQL DB so that it can connect to MySQL from the host where you are running the sqoop command. Example:

```
mysql> GRANT ALL PRIVILEGES ON *.* TO 'ambari1'@'localhost' IDENTIFIED BY 'XXXXXXXXXX' WITH GRANT OPTION;
mysql> FLUSH PRIVILEGES;
```

Please replace 'XXXXXXXXXX' in the above command with your ambari1 password. Then please check that the output of the following commands shows an entry for the 'ambari1' user with host 'localhost':

```
# mysql -u root -p
Enter Password: <YOUR_PASSWORD>
mysql> use mysql;
mysql> select user, host FROM user;
```