Member since: 05-16-2017
Posts: 10
Kudos Received: 1
Solutions: 0
08-03-2017
01:59 AM
Hi guys, could you please help me with this issue? @jsensharma @mclark @tspann
08-02-2017
04:14 PM
2017-08-02 23:56:47.0059 DEBUG [main] (Cluster.java:90) [org.apache.hadoop.mapreduce.Cluster] - Trying ClientProtocolProvider : org.apache.hadoop.mapred.LocalClientProtocolProvider
2017-08-02 23:56:47.0060 INFO [main] (JvmMetrics.java:71) [org.apache.hadoop.metrics.jvm.JvmMetrics] - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-08-02 23:56:47.0060 DEBUG [main] (Cluster.java:103) [org.apache.hadoop.mapreduce.Cluster] - Picked org.apache.hadoop.mapred.LocalClientProtocolProvider as the ClientProtocolProvider cause:org.apache.hive.hcatalog.common.HCatException : 2004 : HCatOutputFormat not initialized, setOutput has to be called
2017-08-02 23:56:47.0061 DEBUG [main] (ClassLoaderStack.java:45) [org.apache.sqoop.util.ClassLoaderStack] - Restoring classloader: sun.misc.Launcher$AppClassLoader@70dea4e
2017-08-02 23:56:47.0062 ERROR [main] (ImportTool.java:613) [org.apache.sqoop.tool.ImportTool] - Encountered IOException running import job: org.apache.hive.hcatalog.common.HCatException : 2004 : HCatOutputFormat not initialized, setOutput has to be called
at org.apache.hive.hcatalog.mapreduce.HCatBaseOutputFormat.getJobInfo(HCatBaseOutputFormat.java:102)
at org.apache.hive.hcatalog.mapreduce.HCatBaseOutputFormat.getOutputFormat(HCatBaseOutputFormat.java:76)
at org.apache.hive.hcatalog.mapreduce.HCatBaseOutputFormat.checkOutputSpecs(HCatBaseOutputFormat.java:65)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
at org.apache.sqoop.manager.SqlManager.importQuery(SqlManager.java:729)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:499)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at master.invoker.sqoopLoader$.executeSqoopJob(sqoopLoader.scala:344)
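The 2004 error above is raised by HCatBaseOutputFormat when the job configuration carries no HCatalog output information, i.e. nothing ever called HCatOutputFormat.setOutput. When Sqoop is driven programmatically (as sqoopLoader.scala above suggests), that typically means the HCatalog target never reached the argument list. Below is a minimal sketch of an in-process import that names the HCatalog target explicitly, using Sqoop's Sqoop.runTool entry point; the connection string, query, and database/table names are placeholders, not values from this job:

// A hedged sketch, not the original job: all connection values, the query, and the
// HCatalog database/table names below are placeholders.
import org.apache.sqoop.Sqoop

object HCatImportSketch {
  def main(argv: Array[String]): Unit = {
    val args = Array(
      "import",                                                // tool name, as on the CLI
      "--connect", "jdbc:oracle:thin:@//<IP>:<PORT>/<SID>",    // placeholder
      "--username", "xxx",
      "--password", "xxx",
      "--query", "SELECT * FROM <table> WHERE $CONDITIONS",    // placeholder free-form query
      "--hcatalog-database", "<hive_db>",                      // HCatalog target database
      "--hcatalog-table", "<hive_table>",                      // HCatalog target table
      "--num-mappers", "1")
    // With the --hcatalog-* options set, Sqoop's HCatalog integration configures
    // HCatOutputFormat itself; the Hive/HCatalog client jars and hive-site.xml
    // must be visible to this JVM for that to work.
    val exitCode = Sqoop.runTool(args)
    println("Sqoop import finished with exit code " + exitCode)
  }
}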
Labels:
- Apache Hadoop
- Apache Sqoop
- Apache YARN
07-31-2017
02:07 AM
When running a hadoop distcp command from a source cluster to a target cluster, is it possible to check the resource utilization on both the source and target clusters?
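One way to watch this (an assumption on my part, not something stated in the thread) is to poll the YARN ResourceManager REST metrics endpoint on both clusters while the distcp job runs; /ws/v1/cluster/metrics reports cluster-wide figures such as allocated and available memory and vcores. A minimal sketch, where the ResourceManager hostnames are placeholders and 8088 is assumed to be the RM web port:

import scala.io.Source

object ClusterMetricsSketch {
  // Fetch the cluster-wide metrics JSON from a ResourceManager's REST API.
  def fetchMetrics(rmHost: String): String =
    Source.fromURL(s"http://$rmHost:8088/ws/v1/cluster/metrics").mkString

  def main(args: Array[String]): Unit = {
    val sourceRm = "<source-rm-host>"   // placeholder hostname
    val targetRm = "<target-rm-host>"   // placeholder hostname
    println("source cluster: " + fetchMetrics(sourceRm))
    println("target cluster: " + fetchMetrics(targetRm))
  }
}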
Labels:
- Apache Hadoop
07-14-2017
03:02 AM
I didn't. I just passed the URL, username, password and the table name:
val args: Array[String] = Array("--connect", url, "--username", user, "--password", password, "--table", "<tablename>")
07-13-2017
09:02 AM
I am using HDP 2.4 and invoke Sqoop through its Java API from Scala code. When I run the sqoop command from the command line it works fine and the data is loaded into HDFS, but whenever I use the Java API I get this error:
Exception in thread "main" java.lang.RuntimeException: Could not load db driver class: oracle.jdbc.OracleDriver
at org.apache.sqoop.manager.OracleManager.makeConnection(OracleManager.java:286)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:744)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:767)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:270)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1845)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at master.invoker.tester$.main(tester.scala:158)
at master.invoker.tester.main(tester.scala)
It seems that when I run it through the API, it cannot find the Sqoop libraries. This is the Scala code used to run Sqoop via the Java API:
var url = "jdbc:oracle:thin:@//<IP>:<PORT>/<SID>"
var user = "xxx"
var password = "xxx"
val tool: SqoopTool = new ImportTool()
val args: Array[String] = Array("--connect", url, "--username", user, "--password", password, "--table", "<tablename>")
var options: SqoopOptions = new SqoopOptions()
try {
  options = tool.parseArguments(args, null, options, false)
  tool.validateOptions(options)
} catch {
  case e: Exception => System.err.println(e.getMessage())
}
tool.run(options)
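The sqoop command-line script picks the JDBC driver jars up from its own lib directory, but an embedded call like the one above only sees the classpath of the JVM it runs in, so oracle.jdbc.OracleDriver has to be on that classpath. A small preflight check (a hedged sketch, not part of the original code; the jar name in the comment is only an example) makes the failure explicit before Sqoop runs:

object DriverPreflight {
  def main(args: Array[String]): Unit = {
    try {
      // Throws ClassNotFoundException if the Oracle JDBC jar is missing from this JVM's classpath
      Class.forName("oracle.jdbc.OracleDriver")
      println("Oracle JDBC driver found on the classpath")
    } catch {
      case _: ClassNotFoundException =>
        // Add the driver jar (e.g. ojdbc6.jar -- the name is only an example) to the
        // classpath of the JVM that embeds Sqoop, e.g. via java -cp or the build
        // tool's runtime dependencies, depending on how the job is launched.
        System.err.println("oracle.jdbc.OracleDriver is not on the classpath")
    }
  }
}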
Labels:
- Apache Hadoop
- Apache Sqoop
05-16-2017
02:45 PM
FAILED: SemanticException Cannot find class 'org.apache.hadoop.hive.hbase.HbaseStorageHandler'
Labels:
- Apache HBase
- Apache Hive
05-16-2017
02:37 PM
1 Kudo
FAILED: SemanticException Cannot find class 'org.apache.hadoop.hive.hbase.HbaseStorageHandler'
Labels:
- Apache HBase
- Apache Hive