Member since: 03-14-2016
Posts: 4721
Kudos Received: 1111
Solutions: 874

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 2825 | 04-27-2020 03:48 AM |
|  | 5479 | 04-26-2020 06:18 PM |
|  | 4661 | 04-26-2020 06:05 PM |
|  | 3702 | 04-13-2020 08:53 PM |
|  | 5604 | 03-31-2020 02:10 AM |
09-05-2019
12:28 AM
Oh, thanks! I upgraded my Ambari version and replaced my hostname "master" with "master.hadoop.com" at the same time!
09-04-2019
01:27 AM
@sonalidive786 Good to know that this resolved your issue. If your question is answered, please mark the reply as the accepted solution. If you find a reply useful, say thanks by clicking the thumbs-up button.
09-03-2019
01:05 AM
@yukti Since you already have the 3.1.0.0-78 version of the JARs, for the missing class "org.apache.hadoop.hive.kafka.KafkaSerDe" can you try using the following JAR instead of "hive-serde-0.10.0.jar"? In my HDP 3 installation I can see it on the Hive host at "/usr/hdp/current/hive-server2/lib/kafka-handler-3.1.0.3.1.0.0-78.jar", and it does contain the class:
# /usr/jdk64/jdk1.8.0_112/bin/javap -cp /usr/hdp/current/hive-server2/lib/kafka-handler-3.1.0.3.1.0.0-78.jar org.apache.hadoop.hive.kafka.KafkaSerDe
Compiled from "KafkaSerDe.java"
public class org.apache.hadoop.hive.kafka.KafkaSerDe extends org.apache.hadoop.hive.serde2.AbstractSerDe {
public org.apache.hadoop.hive.kafka.KafkaSerDe();
public void initialize(org.apache.hadoop.conf.Configuration, java.util.Properties) throws org.apache.hadoop.hive.serde2.SerDeException;
public java.lang.Class<? extends org.apache.hadoop.io.Writable> getSerializedClass();
public org.apache.hadoop.io.Writable serialize(java.lang.Object, org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector) throws org.apache.hadoop.hive.serde2.SerDeException;
public org.apache.hadoop.hive.serde2.SerDeStats getSerDeStats();
public java.lang.Object deserialize(org.apache.hadoop.io.Writable) throws org.apache.hadoop.hive.serde2.SerDeException;
java.util.ArrayList<java.lang.Object> deserializeKWritable(org.apache.hadoop.hive.kafka.KafkaWritable) throws org.apache.hadoop.hive.serde2.SerDeException;
public org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector getObjectInspector();
static {};
}
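If the class resolves from that JAR, here is a minimal sketch (my addition, not from the original thread; the HiveServer2 hostname is an assumption for illustration) of attaching it to a Beeline session so a query that failed with the missing class can pick it up:
# beeline -u "jdbc:hive2://$HIVESERVER2_HOST:10000/default" \
    -e "ADD JAR /usr/hdp/current/hive-server2/lib/kafka-handler-3.1.0.3.1.0.0-78.jar; LIST JARS;"
Note that "ADD JAR" with a local path requires the file to be readable on the HiveServer2 host; for a permanent fix you would typically add the JAR to the Hive auxiliary JARs path instead.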
09-02-2019
02:51 AM
@tyadav It would also be great to start monitoring the free memory of the Sandbox host where all the components are running, so that we get a better idea of why the services are starting slowly, taking almost 25 minutes to start or failing to start successfully. If you still see failures when starting the selected services, please share the logs of the individual service components that fail to start, along with the output of the "free -m" command during the startup operation. A simple way to capture this is shown below. If your question is answered, please mark the reply as the accepted solution. If you find a reply useful, say thanks by clicking the thumbs-up button.
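Here is a minimal sketch of such monitoring (my addition; the log file location is an arbitrary choice), which appends a timestamped "free -m" snapshot every 10 seconds while the services are being started:
# while true; do echo "== $(date) ==" >> /tmp/free_mem.log; free -m >> /tmp/free_mem.log; sleep 10; done &
The resulting /tmp/free_mem.log can then be correlated with the slow or failing service startups.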
09-02-2019
01:49 AM
Oh, I see now. There seems to be a problem on the server side when rewriting the docs.hortonworks.com domain to docs.cloudera.com, so "/HDPDocuments" appears twice in the URL. Once I'm on the docs.cloudera.com domain, the links work again.
08-28-2019
11:53 PM
1 Kudo
There is no out-of-the-box option to downgrade the HDP version. Also, we do not see any HDP 2.7 release, and HDP 3.x and HDP 2.6 have major differences in terms of components. Is there a specific reason you are looking to downgrade? If this is a freshly built cluster on HDP 3.x and you want to use HDP 2.6, then it is better to freshly install HDP 2.6, which will save far more time and effort than manually downgrading and fixing all the components and configs.
08-28-2019
06:07 AM
1 Kudo
@Manoj690 Are you sure that the NameNode is running on "localhost" (the host where you are opening the mentioned URL in the browser)?

1. Can you specify the NameNode IP address/hostname in the URL instead of "localhost"?

2. Can you also check whether the NameNode is listening on port 50070, and whether that port is open (firewall disabled) on the NameNode host?
# netstat -tnlpa | grep 50070
# service iptables stop

3. Please check whether you can reach the NameNode hostname and port from the machine where you are running the browser:
# telnet $NAMENODE_HOST 50070
(OR)
# nc -v $NAMENODE_HOST 50070

4. Check and share the NameNode log. Usually it can be found at "/var/log/hadoop/hdfs/hadoop-hdfs-namenode-xxxxxxxxxxxxxxxxxx.log".
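As an additional check (my own suggestion, not part of the original reply), you can ask HDFS itself which address and port the NameNode web UI is actually bound to, in case it is not on 50070 at all:
# su - hdfs -c "hdfs getconf -confKey dfs.namenode.http-address"
If the returned host:port differs from what you are using in the browser, adjust the URL accordingly.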
08-28-2019
05:51 AM
@Manoj690 Try this: first switch to the "root" user using "su -", then from the "root" account run the "su - hdfs" command.
# su -
# su - hdfs
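A quick way to verify that the switch worked (my addition, not part of the original reply) is to check the current user afterwards:
$ whoami
hdfs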
08-27-2019
11:35 PM
After following your suggestion, the problem seems to have been solved.