Member since
05-16-2016
785
Posts
114
Kudos Received
39
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 2327 | 06-12-2019 09:27 AM |
|  | 3574 | 05-27-2019 08:29 AM |
|  | 5724 | 05-27-2018 08:49 AM |
|  | 5239 | 05-05-2018 10:47 PM |
|  | 3113 | 05-05-2018 07:32 AM |
05-15-2018
10:53 AM
    [cloudera@quickstart ~]$ sqoop import --connect jdbc:mysql://192.168.186.128:3306/myfirsttutorial --username cloudera --password cloudera --table mytable --target-dir=/user/cloudera/myfirst-m
    Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
    Please set $ACCUMULO_HOME to the root of your Accumulo installation.
    18/05/15 10:47:00 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0
    18/05/15 10:47:00 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
    18/05/15 10:47:00 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
    18/05/15 10:47:00 INFO tool.CodeGenTool: Beginning code generation
    18/05/15 10:47:01 ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Access denied for user 'cloudera'@'quickstart.cloudera' (using password: YES)
    java.sql.SQLException: Access denied for user 'cloudera'@'quickstart.cloudera' (using password: YES)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:996)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:870)
        at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4332)
        at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1258)
        at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2234)
        at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2265)
        at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2064)
        at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:790)
        at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
        at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:395)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:325)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:215)
        at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)
        at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
        at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
        at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
        at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
        at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
        at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
        at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)
        at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1858)
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1657)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
    18/05/15 10:47:01 ERROR tool.ImportTool: Import failed: java.io.IOException: No columns to generate for ClassWriter
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1663)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

I have just started with Cloudera and I am stuck here. Can anybody help?
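For anyone hitting the same error: the "Access denied for user 'cloudera'@'quickstart.cloudera'" line usually means the MySQL account lacks privileges for the host Sqoop connects from. A minimal sketch of one common fix, assuming the database and credentials from the command above (verify against your own setup):

```bash
# Grant the 'cloudera' user access to the tutorial database from the
# quickstart host (MySQL 5.x GRANT ... IDENTIFIED BY syntax, as shipped
# with the quickstart VM):
mysql -u root -p <<'EOF'
GRANT ALL PRIVILEGES ON myfirsttutorial.* TO 'cloudera'@'quickstart.cloudera'
  IDENTIFIED BY 'cloudera';
FLUSH PRIVILEGES;
EOF

# Re-run the import, letting Sqoop prompt for the password (-P), as the
# warning in the log itself suggests:
sqoop import \
  --connect jdbc:mysql://192.168.186.128:3306/myfirsttutorial \
  --username cloudera -P \
  --table mytable \
  --target-dir /user/cloudera/myfirst-m
```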
05-09-2018
07:55 AM
Could you please mark it as answered so the community will benefit?
05-08-2018
01:13 AM
Do not specify a driver in the Sqoop arguments. Using the --driver parameter always forces Sqoop to use the Generic JDBC Connector, regardless of whether a more specialized connector is available. For example, even though the MySQL specialized connector would normally be used because the URL starts with jdbc:mysql://, specifying --driver forces Sqoop to fall back to the generic connector. As a result, in most cases you should not need the --driver option at all. Thanks, Ankit Gaurav Sinha
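A minimal sketch of the difference (connection details are illustrative, not from the original post):

```bash
# With a jdbc:mysql:// URL, Sqoop automatically picks the specialized
# MySQL connector:
sqoop import \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --username myuser -P \
  --table mytable

# Adding --driver forces the Generic JDBC Connector even though the MySQL
# connector is available, which is usually slower and unnecessary:
sqoop import \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --driver com.mysql.jdbc.Driver \
  --username myuser -P \
  --table mytable
```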
05-05-2018
11:46 PM
Cloudera Manager and Cloudera Director are proprietary Cloudera products and do not fall under the Apache License. You can check this yourself. For CM: CM > Support > About shows "Copyright © 2011-2018 Cloudera, Inc. All rights reserved. Hadoop and the Hadoop elephant logo are trademarks of the Apache Software Foundation." For Director: Director > Help (? icon) > About shows "Copyright © 2014-2018 Cloudera, Inc. All rights reserved."
05-05-2018
07:32 AM
2 Kudos
Not all connectors are supported, and it is mostly a two-step process to transfer data from an RDBMS to Hive/HBase. In my PoC testing, Sqoop 2 performance was not up to the level of Sqoop 1. So I would prefer Sqoop 1, all the more so since Sqoop 2 is deprecated.
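For contrast, Sqoop 1 can do the RDBMS-to-Hive transfer in a single step via --hive-import; all connection and table names below are illustrative:

```bash
# One-step import from MySQL straight into a Hive table; Sqoop creates the
# Hive table and loads the data in the same job:
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl -P \
  --table orders \
  --hive-import \
  --hive-table default.orders
```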
05-03-2018
01:33 AM
Thank you for your support
05-02-2018
09:28 AM
Are you using the Beeline client tool? Did you try increasing the client heap via the HADOOP_CLIENT_OPTS property (see the sketch below)? Just curious about the following: what file format are you using? Is there any compression? Are table stats being collected? Is the table partitioned or bucketed?
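A minimal sketch of raising the client heap via HADOOP_CLIENT_OPTS; the 2 GB value is only an example, size it to your workload:

```bash
# HADOOP_CLIENT_OPTS is picked up by the Hadoop/Hive client launcher scripts,
# so exporting it before starting Beeline raises the client-side JVM heap:
export HADOOP_CLIENT_OPTS="-Xmx2g $HADOOP_CLIENT_OPTS"
beeline -u jdbc:hive2://localhost:10000
```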
05-01-2018
10:04 AM
Yes @csguna, I think that's the issue. @Annkaa, you can verify that by running a small query in your Hive shell and your Beeline shell separately. In your case the query should work in Hive, while Beeline should prompt "No current connection". As a matter of fact, every external connection to the Hive metastore goes through Beeline. That must be the issue, since you are unable to accomplish it through Hue.
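A quick sketch of that verification; the JDBC URL and user are illustrative, so point them at your own HiveServer2:

```bash
# Run the same trivial query in both shells. If it works in the Hive CLI but
# Beeline complains about the connection, the problem is on the Beeline/JDBC
# side rather than in Hive itself:
hive -e 'SELECT 1;'
beeline -u jdbc:hive2://quickstart.cloudera:10000 -n cloudera -e 'SELECT 1;'
```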
04-30-2018
04:10 AM
@bhaveshsharma03 In fact, there is no standard answer to this question, as it is purely based on your business model, cluster size, Sqoop export/import frequency, data volume, hardware capacity, etc. I can give a few points based on my experience; hope they help (a small tuning sketch follows the list).

1. 75% of the Sqoop scripts (non-priority) will use the default mappers, for various reasons: we don't want to use all the available resources for Sqoop alone.
2. We also don't want to apply every possible performance-tuning method to those non-priority jobs, as it may disturb the RDBMS (source/target) too.
3. Get in touch with the RDBMS owner to learn their non-busy hours, identify the priority Sqoop scripts (based on your business model), and apply the performance-tuning methods to the priority scripts based on data volume (not only rows; hundreds of columns also matter). Repeat this if you have more than one database.
4. Regarding who is responsible: in most cases, if you have a small cluster used by very few teams, then developers and admins can work together; but if you have a very large cluster used by many teams, then it is out of the admin's scope... again, it depends.
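As a hedged illustration of points 1 and 3 (all connection details are made up for the example):

```bash
# Non-priority job: keep Sqoop's default parallelism (4 mappers):
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl -P \
  --table orders

# Priority job, scheduled for the RDBMS's non-busy hours: more mappers plus
# an explicit, evenly distributed split column:
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl -P \
  --table orders \
  --num-mappers 16 \
  --split-by order_id
```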
04-24-2018
05:41 AM
Hi, 1- Yes, you can do it, as I mentioned: create an external text table in Impala directly, then create a Parquet table and select from the text one (the conversion is done automatically; sketch below). 2- I think you can; try searching for parquet-tools. Good luck.
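A minimal sketch of the two steps via impala-shell (schema, paths, and table names are illustrative):

```bash
# Step 1: external text table over the existing delimited files:
impala-shell -q "CREATE EXTERNAL TABLE staging_txt (id INT, name STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  LOCATION '/user/cloudera/staging_txt'"

# Step 2: Parquet table populated from the text table; Impala converts the
# rows to Parquet automatically during the CTAS:
impala-shell -q "CREATE TABLE final_parquet STORED AS PARQUET
  AS SELECT * FROM staging_txt"

# For point 2, parquet-tools can inspect the result, e.g.:
#   parquet-tools schema <file>.parquet
```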