Member since: 12-23-2018
Posts: 5
Kudos Received: 0
Solutions: 0
07-21-2019
03:25 PM
Hi @Sindhu, the following command works well and lists the databases:

sqoop list-databases --connect 'jdbc:sqlserver://192.168.40.21:1433;useNTLMv2=true;databasename=DataDB' --connection-manager org.apache.sqoop.manager.SQLServerManager --username 'DataApp' --password 'BD@t@'

But when I run this command:

sqoop export --connect "jdbc:sqlserver://192.168.41.210:1433;useNTLMv2=true;domain=192.168.40.21;databaseName=DataDB" --table "TblBatch" --hcatalog-database default --hcatalog-table TblBatch --connection-manager org.apache.sqoop.manager.SQLServerManager --username DataApp --password 'BD@t@' --update-mode allowinsert --verbose

I get this error:

19/07/21 12:15:01 ERROR orm.CompilationManager: It seems as though you are running sqoop with a JRE.
19/07/21 12:15:01 ERROR orm.CompilationManager: Sqoop requires a JDK that can compile Java code.
19/07/21 12:15:01 ERROR orm.CompilationManager: Please install a JDK and set $JAVA_HOME to use it.
19/07/21 12:15:01 ERROR tool.ExportTool: Encountered IOException running export job: java.io.IOException: Could not start Java compiler.
    at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:196)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
    at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:77)
    at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:113)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:150)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:186)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:240)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:249)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:258)
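The three CompilationManager lines name the cause: Sqoop's codegen step needs javac, which only a full JDK provides. A minimal sketch of the check (the install path below is a hypothetical example; adjust it for your host):

```shell
# A JDK ships the compiler at $JAVA_HOME/bin/javac; a JRE does not.
is_jdk() {
  [ -x "$1/bin/javac" ]
}

CANDIDATE="/usr/lib/jvm/java-8-openjdk"   # hypothetical JDK install path
if is_jdk "$CANDIDATE"; then
  export JAVA_HOME="$CANDIDATE"
  echo "JAVA_HOME set to JDK at $CANDIDATE"
else
  echo "no javac under $CANDIDATE - install a full JDK first" >&2
fi
```

If the check fails everywhere, install a JDK package and point $JAVA_HOME at it before rerunning the export.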
02-18-2019
11:25 AM
I configured the Hive hook for Atlas in hive-site.xml as follows (I use Ambari for configuration):

atlas.hook.hive.synchronous = true
hive.exec.post.hooks = org.apache.hadoop.hive.ql.hooks.ATSHook,org.apache.atlas.hive.hook.HiveHook
atlas.cluster.name = dpf

Two Kafka topics were created: ATLAS_ENTITIES and ATLAS_HOOK. When I create a table in Hive, for example:

create table br(full_name string, ssn string, location string);

and then run this command:

/usr/hdp/2.6.5.0-292/kafka/bin/kafka-console-consumer.sh --zookeeper dlm01.sic:2181,dlm02.sic:2181,dlm03.sic:2181 --topic ATLAS_HOOK --from-beginning

I see the JSON message for the "create table br" event. This means Hive sends data to the ATLAS_HOOK topic and that part works correctly. But when I run this command:

/usr/hdp/2.6.5.0-292/kafka/bin/kafka-console-consumer.sh --zookeeper dlm01.sic:2181,dlm02.sic:2181,dlm03.sic:2181 --topic ATLAS_ENTITIES --from-beginning

I don't see anything about the br table. Why does the table not appear among the Hive tables in Atlas? What is my mistake? Note: when I manually import the Hive metadata into Atlas with import-hive.sh, I see the information about the br table in both Kafka topics (ATLAS_ENTITIES, ATLAS_HOOK) and the br table appears in Atlas. It just does not work automatically.
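Before digging into Kafka, one quick sanity check (a sketch, assuming the standard HDP client config path) is to confirm that the HiveHook class actually made it into the effective hive-site.xml on the node where the table was created:

```shell
# Succeeds if the Atlas HiveHook is listed among the post-execution hooks.
hook_registered() {
  grep -q 'org.apache.atlas.hive.hook.HiveHook' "$1"
}

HIVE_SITE="/etc/hive/conf/hive-site.xml"   # typical path on an HDP node
if hook_registered "$HIVE_SITE"; then
  echo "HiveHook is configured"
else
  echo "HiveHook missing from $HIVE_SITE" >&2
fi
```

If the hook is present and messages reach ATLAS_HOOK but never ATLAS_ENTITIES, the suspect is the Atlas server side (its Kafka consumer), not the Hive side.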
Labels:
- Apache Atlas
- Apache Hive
- Apache Kafka
12-30-2018
03:42 PM
I solved my problem as follows:

1- Add --hcatalog-home /usr/hdp/current/hive-webhcat to the command tag in workflow.xml:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<workflow-app xmlns="uri:oozie:workflow:0.5" name="loadtosql">
<start to="sqoop_export"/>
<action name="sqoop_export">
<sqoop xmlns="uri:oozie:sqoop-action:0.4">
<job-tracker>${resourceManager}</job-tracker>
<name-node>${nameNode}</name-node>
<command>export --connect jdbc:sqlserver://x.x.x.x:1433;useNTLMv2=true;databasename=BigDataDB --connection-manager org.apache.sqoop.manager.SQLServerManager --username DataApp --password D@t@User --table tr1 --hcatalog-home /usr/hdp/current/hive-webhcat --hcatalog-database temporary --hcatalog-table daily_tr </command>
<file>/user/ambari-qa/test/lib/hive-site.xml</file>
<file>/user/ambari-qa/test/lib/tez-site.xml</file>
</sqoop>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>${wf:errorMessage(wf:lastErrorNode())}</message>
</kill>
<end name="end"/>
</workflow-app>
2- On HDFS, create a lib folder beside workflow.xml and put hive-site.xml and tez-site.xml into it (upload hive-site.xml from /etc/hive/2.6.5.0-292/0/ and tez-site.xml from /etc/tez/2.6.5.0-292/0/ to the lib folder on HDFS). Accordingly, the workflow above declares the two files:

<file>/user/ambari-qa/test/lib/hive-site.xml</file>
<file>/user/ambari-qa/test/lib/tez-site.xml</file>

3- Define the following property in the job.properties file:

oozie.action.sharelib.for.sqoop=sqoop,hive,hcatalog

4- Make sure oozie-site.xml under /etc/oozie/conf has the following property specified:

<property>
<name>oozie.credentials.credentialclasses</name>
<value>hcat=org.apache.oozie.action.hadoop.HCatCredentials</value>
</property>
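The layout the steps above produce can be staged locally before pushing it to HDFS (the paths below are local stand-ins for the /user/ambari-qa/test application directory):

```shell
# Stage the Oozie application layout locally, then push it with `hdfs dfs -put`.
APP_DIR="$(mktemp -d)/test"               # stand-in for /user/ambari-qa/test
mkdir -p "$APP_DIR/lib"
touch "$APP_DIR/workflow.xml"             # the workflow shown above
touch "$APP_DIR/lib/hive-site.xml" "$APP_DIR/lib/tez-site.xml"

# job.properties carrying the sharelib override from step 3
cat > "$APP_DIR/job.properties" <<'EOF'
oozie.action.sharelib.for.sqoop=sqoop,hive,hcatalog
EOF

ls "$APP_DIR" "$APP_DIR/lib"
```

The key point of the layout is that the lib folder sits beside workflow.xml, so the <file> elements resolve relative to the application directory.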
12-25-2018
02:22 PM
@Ramil Akhmadeev @Sara Alizadeh @Jay Kumar SenSharma @Roberto Sancho @cnormile I have a problem using a Sqoop action with HCatalog in the Ambari Views workflow manager. The following command runs correctly in the shell and works fine:

sqoop export --connect 'jdbc:sqlserver://x.x.x.x:1433;useNTLMv2=true;databasename=BigDataDB' --connection-manager org.apache.sqoop.manager.SQLServerManager --username 'DataApp' --password 'D@t@User' --table tr1 --hcatalog-database temporary --hcatalog-table 'daily_tr'

[image of shell]

But when I define a Sqoop action with this command in the Ambari Views workflow manager, I get the following error:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], main() threw exception, org/apache/hive/hcatalog/mapreduce/HCatOutputFormat
java.lang.NoClassDefFoundError: org/apache/hive/hcatalog/mapreduce/HCatOutputFormat
at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:432)
at org.apache.sqoop.manager.SQLServerManager.exportTable(SQLServerManager.java:192)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:81)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:171)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:153)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:75)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:50)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:231)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.lang.ClassNotFoundException: org.apache.hive.hcatalog.mapreduce.HCatOutputFormat
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 27 more
[image of Ambari Views workflow manager]

To work around this error I did the following: under the folder containing workflow.xml, I created a lib folder and put all the Hive jar files from the sharelib directory (/user/oozie/share/lib/lib_201806281525405/hive) there. My goal was to make the components find the HCatalog jars on the classpath, but I'm not sure about this approach; maybe I shouldn't have done it and there is a different solution for this error. Anyway, after doing that the error changed to the following:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], main() threw exception, org.apache.hadoop.hive.shims.HadoopShims.getUGIForConf(Lorg/apache/hadoop/conf/Configuration;)Lorg/apache/hadoop/security/UserGroupInformation;
java.lang.NoSuchMethodError: org.apache.hadoop.hive.shims.HadoopShims.getUGIForConf(Lorg/apache/hadoop/conf/Configuration;)Lorg/apache/hadoop/security/UserGroupInformation;
at org.apache.hive.hcatalog.common.HiveClientCache$HiveClientCacheKey.<init>(HiveClientCache.java:201)
at org.apache.hive.hcatalog.common.HiveClientCache$HiveClientCacheKey.fromHiveConf(HiveClientCache.java:207)
at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:138)
at org.apache.hive.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:564)
at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:85)
at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:63)
at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:349)
at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:433)
at org.apache.sqoop.manager.SQLServerManager.exportTable(SQLServerManager.java:192)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:81)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:171)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:153)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:75)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:50)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:231)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
[image of HDP version]

Please help me solve these errors. Why does the sqoop command work correctly in the shell but fail in the Ambari Views workflow manager?
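The NoClassDefFoundError for HCatOutputFormat typically means the hcatalog sharelib never reached the action's classpath. On a live cluster one would list the installed sharelib with `oozie admin -shareliblist` and check that hcatalog appears; the helper below is only a sketch that performs that check against a saved listing (the file path and its contents are hypothetical):

```shell
# $1 = file holding the output of `oozie admin -shareliblist`, $2 = lib name.
has_sharelib() {
  grep -qw "$2" "$1"
}

# Hypothetical usage against a saved listing:
LISTING="$(mktemp)"
printf '%s\n' 'hive' 'hcatalog' 'sqoop' > "$LISTING"
if has_sharelib "$LISTING" hcatalog; then
  echo "hcatalog sharelib is available"
fi
```

If hcatalog is missing from the real listing, copying jars beside workflow.xml tends to cause version-mismatch errors like the NoSuchMethodError above; pulling in the proper sharelib via oozie.action.sharelib.for.sqoop is the cleaner route.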
12-23-2018
05:08 PM
Please explain about hive-site.xml. Where is hive-site.xml located, and did you copy this file to "/user/root/test/shell/"? Please send an email to me: ahajitrorb@gmail.com