Member since: 04-25-2016
Posts: 579
Kudos Received: 609
Solutions: 111

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1962 | 02-12-2020 03:17 PM
 | 1356 | 08-10-2017 09:42 AM
 | 10033 | 07-28-2017 03:57 AM
 | 2116 | 07-19-2017 02:43 AM
 | 1647 | 07-13-2017 11:42 AM
10-07-2016
04:20 PM
4 Kudos
Use the following command to list all the components installed on a node in your cluster:

```shell
curl -u admin:admin -H "X-Requested-By: ambari" -X GET http://AMBARI_SERVER_HOST:8080/api/v1/clusters/CLUSTERNAME/hosts/HOSTNAME
```

Then delete the client you want using the following command:

```shell
curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE http://AMBARI_SERVER_HOST:8080/api/v1/clusters/CLUSTERNAME/hosts/HOSTNAME/host_components/DATANODE
```
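Note that Ambari generally refuses to delete a component that is still running, so it can help to stop it first. A hedged sketch, using the same placeholder hostnames as above and the standard Ambari REST v1 convention of setting the desired state to INSTALLED:

```shell
# Stop the component before deleting it: set its desired state to INSTALLED
curl -u admin:admin -H "X-Requested-By: ambari" -X PUT \
  -d '{"HostRoles": {"state": "INSTALLED"}}' \
  http://AMBARI_SERVER_HOST:8080/api/v1/clusters/CLUSTERNAME/hosts/HOSTNAME/host_components/DATANODE
```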
10-01-2016
03:38 AM
2 Kudos
Please check that the ResourceManager queues are set up correctly; it looks like there is no default queue configured.
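As a minimal sketch, a CapacityScheduler setup with a single default queue looks like this in capacity-scheduler.xml (the property names are the standard CapacityScheduler ones; the single-queue layout is an assumption for illustration):

```xml
<property>
  <name>yarn.scheduler.capacity.root.queues</name>
  <value>default</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.default.capacity</name>
  <value>100</value>
</property>
```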
09-26-2016
05:57 PM
@Sunile Manjee I did not find it at some link; I generated the ER diagram from the Hive metastore schema using MySQL Workbench.
09-21-2016
02:39 PM
3 Kudos
Env: HDP-2.3.4.0-3485, Java 8

The attached code contains:
- pom.xml: manages all the dependencies
- HiveClientSecure.java: the Oozie Java action to be configured in workflow.xml
- jaas.conf: Oozie uses a JAAS configuration for Kerberos login
- log4j.properties: to capture logs

jaas.conf: modify the principal name and keytab location accordingly and place the file on each node in the cluster. I placed it at /tmp/jaas/jaas.conf for testing purposes.

```
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  useTicketCache=true
  principal="ambari-qa-hbase234@HWXBLR.COM"
  keyTab="/etc/security/keytabs/smokeuser.headless.keytab"
  debug="true"
  doNotPrompt=true;
};
```

workflow.xml:

```xml
<workflow-app xmlns="uri:oozie:workflow:0.2" name="java-main-wf">
  <start to="java-node"/>
  <action name="java-node">
    <java>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>mapred.job.queue.name</name>
          <value>${queueName}</value>
        </property>
      </configuration>
      <main-class>HiveJdbcClientSecure</main-class>
      <arg>jdbc:hive2://hb-n2.hwxblr.com:10000/;principal=hive/hb-n2.hwxblr.com@HWXBLR.COM</arg>
      <arg>ambari-qa-hbase234@HWXBLR.COM</arg>
      <arg>/etc/security/keytabs/smokeuser.headless.keytab</arg>
    </java>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Java failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

Sample application build and run instructions:
1. Extract the attached jar.
2. cd HiveServer2JDBCSample
3. mvn clean package (this creates a fat jar with all the dependencies in it).
4. Upload the jar to HDFS (in my case as the ambari-qa user, which maps to the principal defined in workflow.xml): hadoop fs -put target/HiveServer2JDBCTest-jar-with-dependencies.jar examples/apps/java-main/lib
5. Upload workflow.xml: hadoop fs -put /tmp/workflow.xml examples/apps/java-main/

Run through Oozie:

```shell
source /etc/oozie/conf/oozie-env.sh ; /usr/hdp/current/oozie-client/bin/oozie job -oozie http://hb-n2.hwxblr.com:11000/oozie -config /usr/hdp/current/oozie-client/doc/examples/apps/java-main/job.properties -run
```

hiveserver2oozieaction.tar.gz
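The run command above references job.properties; as a minimal sketch, it typically defines the variables the workflow interpolates (the hostnames, ports, and queue name here are assumptions chosen to match the workflow above, not values from the attached archive):

```
nameNode=hdfs://hb-n2.hwxblr.com:8020
jobTracker=hb-n2.hwxblr.com:8050
queueName=default
examplesRoot=examples
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/java-main
```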
09-21-2016
03:25 AM
You don't need any configuration params; you can submit a Spark job using curl commands.
09-21-2016
02:56 AM
2 Kudos
You can use the Spark Job Server, which allows you to submit jobs using a REST API. Please follow https://github.com/spark-jobserver/spark-jobserver
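As a hedged sketch of the spark-jobserver REST workflow (the endpoint paths follow the project's README; the host, port, app name, jar path, and job class are placeholder assumptions):

```shell
# Upload the application jar under an app name
curl --data-binary @target/my-job.jar http://localhost:8090/jars/myapp

# Submit a job from that jar by its main class, passing config in the body
curl -d 'input.string = a b c' \
  'http://localhost:8090/jobs?appName=myapp&classPath=spark.jobserver.WordCountExample'
```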
09-19-2016
08:57 PM
Could you please post the code snippet where you are trying to load/read the data? It looks like there is something in the file URI starting with C://.
09-19-2016
04:22 PM
@srinivasa rao You are seeing 9 mappers due to the TezSplitGrouper, which groups the original splits for better parallelism. This is a nice article explaining how initial task parallelism works: https://cwiki.apache.org/confluence/display/TEZ/How+initial+task+parallelism+works
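If you need to influence how aggressively Tez groups splits, it exposes bounds for the grouped split size. A hedged sketch (the property names are the standard Tez grouping ones; the byte values are illustrative assumptions, not recommendations):

```xml
<property>
  <name>tez.grouping.min-size</name>
  <value>16777216</value><!-- lower bound per grouped split, here 16 MB -->
</property>
<property>
  <name>tez.grouping.max-size</name>
  <value>1073741824</value><!-- upper bound per grouped split, here 1 GB -->
</property>
```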
09-18-2016
01:20 AM
2 Kudos
scala.io.Source.fromFile expects a file on the local filesystem. If you want to read from HDFS, use the HDFS API instead, like this (the URI and path are placeholders for your cluster):

```scala
import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// "hdfs://namenode:8020" and "/path/to/file" are placeholders
val fs = FileSystem.get(new URI("hdfs://namenode:8020"), new Configuration())
val in = fs.open(new Path("/path/to/file"))
val lines = scala.io.Source.fromInputStream(in).getLines()
```
09-18-2016
01:02 AM
3 Kudos
@Raja Storm 0.10.0.2.3.4.0-3485 is built against Kafka 0.8.2.2, where SimpleConsumer expects arguments (String, Int, Int, Int, String, String), while 0.9.3.2.2.4.0-2633 is built against Kafka 0.8.1.1, where SimpleConsumer expects (String, Int, Int, Int, String); that is why you are getting a NoSuchMethodError.