Member since: 10-10-2017
Posts: 15
Kudos Received: 0
Solutions: 0
11-22-2018
06:58 AM
Thanks for the info! Best regards, Gagan
11-21-2018
12:17 PM
Dear Team, I would like to know whether Hortonworks is already certified against industry security standards. Which security certifications does Hortonworks hold for its products, and is the list maintained somewhere? Thanks and best regards, Gagandeep Singh
Tags:
- certificate
- Security
09-19-2018
08:15 AM
Thanks for the detailed answer; it is very helpful! BR//Gagan
09-18-2018
10:35 AM
Dear Team, Could you kindly point me to the script that creates the AD accounts during the automated Kerberos setup via Ambari? (Our AD team wants to review it before giving us write access.) I looked at /var/lib/ambari-server/resources/scripts/kerberos_setup.sh but could not work out where the AD users are created and deleted. Thanks and best regards, Gagan
Labels:
- Apache Ambari
- Kerberos
03-20-2018
09:13 AM
Dear Saumil, The mentioned method works as a workaround rather than an exact solution, so I am looking for something better. Thanks, Gagan
03-05-2018
01:25 PM
Hello, I am facing an issue: my OS and HDP currently run in CET. When I submit an Oozie job with the parameters startTime=2018-03-05T14\:20Z endTime=2099-01-01T00\:01Z timezone=Europe/Berlin, the job starts at 2018-03-05 15:20Z, i.e. one hour after my system time. This is wrong, since my system time is in CET; it seems Oozie does not know which timezone it is working in. How can I tell Oozie that the local timezone is CET? Thanks and best regards, Gagan
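For reference: Oozie evaluates coordinator startTime/endTime in its "processing timezone", which defaults to UTC; the timezone parameter only controls daylight-saving adjustments of the frequency, not how the start and end times are interpreted. If the intent is to have Oozie itself work in CET, one hedged option (worth verifying against your Oozie version) is to change the processing timezone in oozie-site.xml:

```xml
<!-- oozie-site.xml: shift Oozie's processing timezone away from the
     default UTC. Oozie accepts only fixed GMT offsets here, not named
     zones such as Europe/Berlin; GMT+0100 corresponds to CET
     (winter time, no DST adjustment). -->
<property>
    <name>oozie.processing.timezone</name>
    <value>GMT+0100</value>
</property>
```

With this set, time literals in coordinator and job definitions are written with the matching offset suffix (e.g. 2018-03-05T14:20+0100) instead of the Z suffix.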
Labels:
- Apache Oozie
02-05-2018
04:34 PM
Dear Reda, I can see the jar in my Oozie share lib: /user/bdd01oozie/share/lib/lib_20170829184616/hcatalog/hive-hcatalog-core-1.2.1000.2.6.1.0-129.jar. So it should work, as I am using the properties oozie.use.system.libpath=true and oozie.libpath=${nameNode}/user/bdd01oozie/share/lib. Do I still need to add the jar to HADOOP_CLASSPATH? Thanks and best regards, Gagan
02-05-2018
01:46 PM
Dear All, I have the following Sqoop job, which imports into an HCatalog table stored as ORC. It runs fine from the console but fails when run from Oozie:

sqoop import -Dorg.apache.sqoop.splitter.allow_text_splitter=true \
-Dhadoop.security.credential.provider.path=jceks://hdfs/******.jceks \
--connect jdbc:sqlserver://abc.domain.com \
--username <username> \
--password-alias alias.password \
--table <tablename> \
--hcatalog-database <hive database> \
--hcatalog-table <hive table> \
--hcatalog-storage-stanza "stored as orcfile"

The workflow definition:

<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="OZ_proj_Load">
<global>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
</global>
<start to="sqoopTableLoad"/>
<action name="sqoopTableLoad">
<sqoop xmlns="uri:oozie:sqoop-action:0.4">
<arg>import</arg>
<arg>-Dorg.apache.sqoop.splitter.allow_text_splitter=true</arg>
<arg>-Dhadoop.security.credential.provider.path=jceks://hdfs//******.jceks</arg>
<arg>--connect</arg>
<arg>jdbc:sqlserver://abc.domain.com</arg>
<arg>--username</arg>
<arg>username</arg>
<arg>--password-alias</arg>
<arg>alias.password</arg>
<arg>--table</arg>
<arg>tablename</arg>
<arg>--hcatalog-database</arg>
<arg>hivedatabase</arg>
<arg>--hcatalog-table</arg>
<arg>hivetable</arg>
<arg>--hcatalog-storage-stanza</arg>
<arg>"stored as orcfile"</arg>
</sqoop>
<ok to="end"/>
<error to="killSqoopLoad"/>
</action>
<kill name="killSqoopLoad">
<message>The workflow failed at Sqoop, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>
I receive the following error; any help is appreciated.

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], main() threw exception, org/apache/hive/hcatalog/mapreduce/HCatOutputFormat
java.lang.NoClassDefFoundError: org/apache/hive/hcatalog/mapreduce/HCatOutputFormat
at org.apache.sqoop.mapreduce.DataDrivenImportJob.getOutputFormatClass(DataDrivenImportJob.java:199)
at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:263)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.manager.SQLServerManager.importTable(SQLServerManager.java:163)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:179)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:58)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:48)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:237)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.lang.ClassNotFoundException: org.apache.hive.hcatalog.mapreduce.HCatOutputFormat
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
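A NoClassDefFoundError for HCatOutputFormat in an Oozie Sqoop action usually means the action's classpath does not include the hive/hcatalog share libs, even though the jar exists in the sharelib directory. A commonly suggested fix (property values here are a sketch to adapt to your cluster's sharelib layout) is to declare the extra sharelibs for the action in job.properties:

```properties
# job.properties for the workflow above (values are illustrative)
oozie.use.system.libpath=true
# pull the hive and hcatalog sharelibs onto the Sqoop action's classpath
oozie.action.sharelib.for.sqoop=sqoop,hive,hcatalog
```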
Labels:
- Apache HCatalog
- Apache Oozie
- Apache Sqoop
01-10-2018
08:35 AM
This is very much the same as what my research showed too, so I will go with distcp for my use case.
01-09-2018
05:01 PM
Hello All, I have a requirement to copy files from one HDFS directory to another within the same cluster via Oozie. This can be done using either the Oozie distcp action or the Oozie shell action. Which is the better way to copy files using Oozie? I guess this is similar to asking hdfs dfs -cp vs distcp? Thanks and best regards, Gagan
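For illustration, a minimal sketch of the distcp variant (the action name and HDFS paths are hypothetical placeholders):

```xml
<!-- Oozie distcp action copying between two HDFS directories in the
     same cluster; source and target paths are placeholders. -->
<action name="copyFiles">
    <distcp xmlns="uri:oozie:distcp-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <arg>${nameNode}/user/gagan/source</arg>
        <arg>${nameNode}/user/gagan/target</arg>
    </distcp>
    <ok to="end"/>
    <error to="kill"/>
</action>
```

For small intra-cluster copies, a shell action running hdfs dfs -cp avoids launching a MapReduce job, while distcp parallelizes large copies; the trade-off is essentially the same as hdfs dfs -cp vs distcp outside Oozie.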
Labels:
- Apache Hadoop
- Apache Oozie
10-25-2017
08:45 AM
Dear All, I want to transfer data from Kafka to HDFS. I searched online and found that this can be done with Camus or Gobblin. I would like to know whether there is a default HDFS connector shipped with HDP 2.6 that is ready to use with minimal coding/configuration, and a useful link on how to use it. Thanks in advance. Best regards, Gagan
10-11-2017
03:31 PM
Thanks! This helped.
10-11-2017
12:54 PM
Dear Team, How do I get the list of SparkR libraries currently available in my cluster? (I don't have much knowledge of SparkR or R.) Is there a command to list the available packages? I am looking for a list like the one below:

Library -> version
bit64 -> x.x.x
bmp -> x.x.x
forecast -> x.x.x
GGally -> x.x.x
etc.

Thanks in advance. Best regards, Gagan