Member since: 09-21-2016
Posts: 27
Kudos Received: 8
Solutions: 2
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3713 | 03-23-2017 06:11 PM |
| | 2050 | 12-29-2016 03:00 PM |
07-19-2018
02:51 PM
Never mind, it works if I add an empty line between every package entry in the PACKAGES file.
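For reference, a minimal sketch of what such a PACKAGES index looks like, with a blank line separating each entry (the package names, versions, and field values here are illustrative, not taken from the original repository):

    Package: magrittr
    Version: 1.5
    License: MIT + file LICENSE
    NeedsCompilation: no

    Package: stringi
    Version: 1.1.5
    License: file LICENSE
    NeedsCompilation: yes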
07-19-2018
01:54 PM
I have followed the above steps but keep getting the following error:

    install.packages("magrittr", repos = "/tmp/r-packages/")
    Installing package into ‘/usr/lib64/R/library’
    (as ‘lib’ is unspecified)
    Warning message:
    package ‘magrittr’ is not available (for R version 3.4.1)
11-02-2017
02:46 PM
Could you share the NiFi template for this flow?
09-29-2017
07:00 PM
1 Kudo
Make sure you have the following dependencies, based on your HDP version:

    <dependencies>
      <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-exec</artifactId>
        <version>1.2.1000.2.5.3.0-37</version>
      </dependency>
      <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-jdbc</artifactId>
        <version>1.2.1000.2.5.3.0-37</version>
      </dependency>
      <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-metastore</artifactId>
        <version>1.2.1000.2.5.3.0-37</version>
      </dependency>
      <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-service</artifactId>
        <version>1.2.1000.2.5.3.0-37</version>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.3.2.5.3.0-37</version>
      </dependency>
    </dependencies>

The following is the example code:

    // Configure Kerberos authentication and log in from the keytab.
    org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
    conf.set("hadoop.security.authentication", "Kerberos");
    UserGroupInformation.setConfiguration(conf);
    UserGroupInformation.loginUserFromKeytab("hive/xx@HDP.COM", "/tmp/hive.service.keytab");

    // Load the Hive JDBC driver and open a connection over HTTP transport.
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    System.out.println("getting connection");
    Connection con = DriverManager.getConnection("jdbc:hive2://<host>:10001/;principal=hive/xx@HDP.COM;transportMode=http;httpPath=cliservice");
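Once the connection is open it behaves like any other JDBC connection. A minimal sketch of running a query with it (the "SHOW TABLES" statement is only an illustration and is not part of the original post; java.sql.Statement and java.sql.ResultSet must be imported):

    // Continues from the `con` created above; the query is illustrative.
    try (Statement stmt = con.createStatement();
         ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
    }
    con.close();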
09-29-2017
06:21 PM
3 Kudos
You can use the following command:

    oozie jobs -filter status=FAILED -len 1000 -oozie http://localhost:11000/oozie
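If the failed jobs are coordinators rather than workflows, the same listing can be narrowed by job type (assuming the standard Oozie CLI; the host and port are the same placeholders as above):

    oozie jobs -jobtype coordinator -filter status=FAILED -len 1000 -oozie http://localhost:11000/oozie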
08-28-2017
09:55 PM
In your /home/<user>/.storm/storm.yaml file, you need to specify the following property:

    supervisor.run.worker.as.user: true
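For context, a minimal sketch of what that client-side storm.yaml could look like (the nimbus.seeds value is illustrative and not part of the original post):

    # /home/<user>/.storm/storm.yaml
    nimbus.seeds: ["nimbus-host.example.com"]
    supervisor.run.worker.as.user: true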
03-23-2017
09:34 PM
I am not sure about a Unix command for that. What time duration do you see in the ResourceManager UI?
03-23-2017
09:12 PM
You can check for an empty partition using the following code:

    // Save the batch only when the RDD has at least one partition.
    lines.dstream().foreachRDD(rdd => {
      if (!rdd.partitions.isEmpty)
        rdd.saveAsTextFile(outputDir)
    })
03-23-2017
06:11 PM
You can view the job completion details if you open <resourcemanagerhost>:8080/cluster/apps/FINISHED and look for the MapReduce application associated with the Sqoop job.
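If a command-line check is preferred instead of the UI, a roughly equivalent way to list finished applications (assuming the YARN CLI is available on the client) is:

    yarn application -list -appStates FINISHED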