Member since: 03-14-2016
Posts: 4721
Kudos Received: 1111
Solutions: 874
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2442 | 04-27-2020 03:48 AM
 | 4876 | 04-26-2020 06:18 PM
 | 3975 | 04-26-2020 06:05 PM
 | 3216 | 04-13-2020 08:53 PM
 | 4920 | 03-31-2020 02:10 AM
09-02-2019
02:41 AM
@tyadav The HDP Sandbox is a learning environment: a single-node cluster with many services deployed on it. When you choose "Start All Services", Ambari attempts to start every service that is not in Maintenance Mode, which puts a heavy load on the VM. Ideally, stop the services you are not currently using, put them into Maintenance Mode, and then start only the services you actually need for testing. That way the load on the Sandbox host is much lower, operations execute much faster, and you will also see some free memory on the Sandbox host.
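The stop/maintenance advice above can also be driven through the Ambari REST API. The sketch below only builds and prints the two calls; the host, cluster, and service names are placeholders for your own sandbox values:

```shell
# Placeholders -- substitute your sandbox's hostname, cluster, and service.
AMBARI_HOST="sandbox-hdp.example.com"
CLUSTER="Sandbox"
SERVICE="FLUME"
BASE="http://${AMBARI_HOST}:8080/api/v1/clusters/${CLUSTER}/services/${SERVICE}"

# 1. Put the service into Maintenance Mode (suppresses alerts and bulk starts):
echo "curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT -d '{\"Body\":{\"ServiceInfo\":{\"maintenance_state\":\"ON\"}}}' ${BASE}"

# 2. Stop the service by setting its desired state to INSTALLED:
echo "curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT -d '{\"RequestInfo\":{\"context\":\"Stop Service\"},\"Body\":{\"ServiceInfo\":{\"state\":\"INSTALLED\"}}}' ${BASE}"
```

Run the printed commands against your sandbox (with your real admin credentials) once you have confirmed the endpoint.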
09-02-2019
01:45 AM
@CoPen Your ambari-agent.ini file is pointing to an incorrect Ambari server entry (it should not contain an IP address and hostnames together). You have the following entry:

[server]
hostname=192.168.56.101 master master.hadoop.com

Ideally it should be the following:

[server]
hostname=master.hadoop.com
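A quick way to spot this kind of malformed entry is to count the tokens on the hostname= line. This is a sketch; /tmp/ambari-agent.ini below is a stand-in for the real /etc/ambari-agent/conf/ambari-agent.ini:

```shell
# Write a sample config to illustrate; point the path at your real file instead.
cat > /tmp/ambari-agent.ini <<'EOF'
[server]
hostname=master.hadoop.com
EOF

# The hostname= line must carry exactly one value (an FQDN, not "IP host fqdn").
VALUE=$(awk -F'=' '/^hostname=/{print $2}' /tmp/ambari-agent.ini)
WORDS=$(echo "$VALUE" | wc -w)
echo "hostname='$VALUE' tokens=$WORDS"   # tokens must be 1
```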
09-02-2019
01:41 AM
@peter_svarc You can open the link https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.4/index.html and then, at the top left corner, switch to the desired version ("3.1.4", "3.1.0", etc.). If your question is answered, please mark the reply as the accepted solution. If you find a reply useful, say thanks by clicking the thumbs-up button.
09-02-2019
01:36 AM
@yvettew Yes, please try to run the same curl command from the Mac laptop where you are opening the browser. Also check your browser settings to see whether any network proxy is configured there.
09-01-2019
11:23 PM
@yvettew Can you please share the output of the following curl command from the host where you are running the browser? I want to see whether there is a network proxy in front of Ambari that is causing the 502 error:

# curl -iLv -u "admin:admin" -H "X-Requested-By: ambari" -X GET http://$AMBARI_HOSTNAME:8080/api/v1/check

Please replace admin:admin with your Ambari admin username and password, and $AMBARI_HOSTNAME with the actual FQDN of the Ambari server, then share the output of the curl command. Also check your browser settings to see whether any network proxy is configured there.
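Independent of the browser, curl on the same host also honours the standard proxy environment variables, so it is worth checking whether any of them are set. A small sketch:

```shell
# Print the proxy-related environment variables that curl consults.
# "<unset>" means curl will connect to Ambari directly, with no proxy in between.
PROXY_REPORT=$(
  for var in http_proxy https_proxy HTTP_PROXY HTTPS_PROXY no_proxy; do
    printf '%s=%s\n' "$var" "${!var:-<unset>}"
  done
)
echo "$PROXY_REPORT"
```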
09-01-2019
10:14 PM
1 Kudo
@yukti You seem to be using an incorrect version of the "hive-exec" / "hive-metastore" JAR. Can you tell me where you got the 3.1.0.3.1.0.0-78 version of the JARs? Since your Kafka handler JAR version is "kafka-handler-3.1.0.3.1.0.0-78.jar", it looks like you downloaded that JAR from an HDP 3.1 installation. Is that correct? If yes, please also take the hive-metastore JAR of the same version and try again. You can see the difference between the "hive-metastore-3.1.0.3.1.0.0-78.jar" and "hive-metastore-0.9.0.jar" JARs below, taken from my HDP 3.1 installation:

# ls -l /usr/hdp/3.1.0.0-78/hive/lib/hive-metastore.jar
lrwxrwxrwx. 1 root root 35 Feb 22 2019 /usr/hdp/3.1.0.0-78/hive/lib/hive-metastore.jar -> hive-metastore-3.1.0.3.1.0.0-78.jar

# /usr/jdk64/jdk1.8.0_112/bin/javap -cp /usr/hdp/current/hive-metastore/lib/hive-standalone-metastore-3.1.0.3.1.0.0-78.jar org.apache.hadoop.hive.metastore.DefaultHiveMetaHook
Compiled from "DefaultHiveMetaHook.java"
public abstract class org.apache.hadoop.hive.metastore.DefaultHiveMetaHook implements org.apache.hadoop.hive.metastore.HiveMetaHook {
  public org.apache.hadoop.hive.metastore.DefaultHiveMetaHook();
  public abstract void commitInsertTable(org.apache.hadoop.hive.metastore.api.Table, boolean) throws org.apache.hadoop.hive.metastore.api.MetaException;
  public abstract void preInsertTable(org.apache.hadoop.hive.metastore.api.Table, boolean) throws org.apache.hadoop.hive.metastore.api.MetaException;
  public abstract void rollbackInsertTable(org.apache.hadoop.hive.metastore.api.Table, boolean) throws org.apache.hadoop.hive.metastore.api.MetaException;
}

On the other hand, we can see that the "hive-metastore-0.9.0.jar" JAR does not contain that class:

# /usr/jdk64/jdk1.8.0_112/bin/javap -cp /tmp/hive-metastore-0.9.0.jar org.apache.hadoop.hive.metastore.DefaultHiveMetaHook
Error: class not found: org.apache.hadoop.hive.metastore.DefaultHiveMetaHook

If your question is answered, please mark the reply as the accepted solution. If you find a reply useful, say thanks by clicking the thumbs-up button.
09-01-2019
08:48 PM
@yukti You seem to be using the old "hive-metastore-0.9.0.jar" JAR file together with "kafka-handler-3.1.0.3.1.0.0-78.jar". Do you see the mentioned class inside your JAR?

# javap -cp /PATH/TO/hive-metastore-0.9.0.jar org.apache.hadoop.hive.metastore.DefaultHiveMetaHook

Since your Kafka handler JAR version is "kafka-handler-3.1.0.3.1.0.0-78.jar", I guess it has a dependency on the hive-exec module of the same version. Just look at the "META-INF/maven/org.apache.hive/kafka-handler/pom.xml" file inside the kafka-handler JAR and you will see that it needs the same version of hive-exec:

# grep 'hive-exec' -A2 META-INF/maven/org.apache.hive/kafka-handler/pom.xml
<artifactId>hive-exec</artifactId>
<scope>provided</scope>
<version>${project.version}</version>

So I guess you should be using the following JAR (if you are using an HDP installation):

/usr/hdp/current/hive-metastore/lib/hive-standalone-metastore-3.1.0.3.1.0.0-78.jar
OR
/usr/hdp/current/hive-metastore/lib/hive-exec-3.1.0.3.1.0.0-78.jar
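The pom.xml check above can be reproduced offline. In the sketch below, the fragment written to /tmp is a stand-in for what extracting META-INF/maven/org.apache.hive/kafka-handler/pom.xml from the kafka-handler JAR would give you:

```shell
# Stand-in for the dependency block extracted from the kafka-handler JAR's pom.xml.
cat > /tmp/kafka-handler-pom.xml <<'EOF'
<dependency>
  <artifactId>hive-exec</artifactId>
  <scope>provided</scope>
  <version>${project.version}</version>
</dependency>
EOF

# scope=provided means the handler expects hive-exec (of the same
# ${project.version}) to already be on the classpath at runtime --
# so you must supply the matching HDP JAR yourself.
grep -A2 'hive-exec' /tmp/kafka-handler-pom.xml
```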
08-28-2019
11:53 PM
1 Kudo
There is no out-of-the-box option to downgrade the HDP version. Also, we do not see any HDP 2.7 release, and HDP 3.x and HDP 2.6 have major differences in terms of components. Is there any specific reason you are looking to downgrade? If this is a freshly built cluster on HDP 3.x and you want to use HDP 2.6, it is better to freshly install HDP 2.6, which will save a lot of time and effort compared to manually fixing and downgrading all the components and configs.
08-28-2019
06:14 AM
@sampathkumar_ma We see the error is caused by the following:

Caused by: java.lang.IllegalArgumentException
at java.nio.Buffer.limit(Buffer.java:275)
at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:365)

So can you please let us know which JDK you are using to run Spark?

# ps -ef | grep -i spark
# java -version

Since when have you been noticing this error? Were any recent changes made to the host/config?
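Since the DER constructor in KerberosUtil is sensitive to what the JDK hands it, pinning down the exact JDK build matters. A small defensive sketch for capturing it (guarded in case java is not on the PATH):

```shell
# Report the JDK that 'java' resolves to on this host.
if command -v java >/dev/null 2>&1; then
  JDK_INFO=$(java -version 2>&1 | head -1)
else
  JDK_INFO="java not found on PATH"
fi
echo "$JDK_INFO"
```

Compare this against the JDK the Spark process actually runs under (the full java path in the `ps -ef` output), since they can differ.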
08-28-2019
06:07 AM
1 Kudo
@Manoj690 Are you sure that the NameNode is running on "localhost" (where you are opening the mentioned URL in the browser)?

1. Can you specify the NameNode IP address / hostname in the URL instead of "localhost"?

2. Can you also check whether the NameNode is listening on port 50070 (i.e., the port is open and the firewall is disabled on the NameNode host)?

# netstat -tnlpa | grep 50070
# service iptables stop

3. Please check whether you can reach the NameNode hostname and port from the machine where you are running the browser:

# telnet $NAMENODE_HOST 50070
(OR)
# nc -v $NAMENODE_HOST 50070

4. Check and share the NameNode log. Usually it can be found at "/var/log/hadoop/hdfs/hadoop-hdfs-namenode-xxxxxxxxxxxxxxxxxx.log".
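If neither telnet nor nc is installed, bash's built-in /dev/tcp pseudo-device can perform the same reachability check. A sketch, where localhost:50070 stands in for your NameNode host and port:

```shell
# Returns OPEN/CLOSED for a TCP endpoint using bash's /dev/tcp redirection.
check_port() {
  local host=$1 port=$2
  if timeout 3 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "OPEN ${host}:${port}"
  else
    echo "CLOSED ${host}:${port}"
  fi
}

RESULT=$(check_port localhost 50070)
echo "$RESULT"
```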