Member since: 10-01-2015
Posts: 3933
Kudos Received: 1150
Solutions: 374
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3655 | 05-03-2017 05:13 PM |
| | 3014 | 05-02-2017 08:38 AM |
| | 3275 | 05-02-2017 08:13 AM |
| | 3220 | 04-10-2017 10:51 PM |
| | 1684 | 03-28-2017 02:27 AM |
02-19-2016
01:25 PM
Question is re: NiFi, not Oozie.
02-19-2016
01:24 PM
@Shishir Saxena you can wrap the sqoop command in a shell script and invoke it with ExecuteProcess if you'd like. I'd love to see it as an article on HCC when you get it done.
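A minimal sketch of such a wrapper, assuming a MySQL source; the connection string, credentials path, table, and target directory are all placeholders:

```bash
#!/usr/bin/env bash
# Wrapper script for NiFi's ExecuteProcess processor.
# Every connection detail below is a placeholder; adjust for your environment.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --username myuser \
  --password-file /user/myuser/.sqoop_password \
  --table my_table \
  --target-dir /data/my_table
```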
02-19-2016
01:21 PM
@rajdip chaudhuri give it the right permissions; you can also specify the driver explicitly. You can use Sqoop with any other JDBC-compliant database. First, download the appropriate JDBC driver for the type of database you want to import, and install the .jar file in the $SQOOP_HOME/lib directory on your client machine. (This will be /usr/lib/sqoop/lib if you installed from an RPM or Debian package.) Each driver .jar file also has a specific driver class which defines the entry point to the driver. For example, MySQL's Connector/J library has a driver class of com.mysql.jdbc.Driver. Refer to your database vendor-specific documentation to determine the main driver class. This class must be provided as an argument to Sqoop with --driver.
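For example, a sketch of an import with an explicit driver class; the host, database, and table names are placeholders:

```bash
# --driver forces Sqoop to use the named JDBC driver class
sqoop import \
  --driver com.mysql.jdbc.Driver \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --username myuser -P \
  --table my_table
```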
02-19-2016
01:19 PM
@sasmita panigrahi take a look in /var/log/hive on the server that hosts your Hive Server installation. The logs you sent are not valid.
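For instance, assuming a default HDP layout (the exact log file names vary by install, so treat these as examples):

```bash
# Tail the HiveServer2 and metastore logs for recent errors
tail -n 200 /var/log/hive/hiveserver2.log
tail -n 200 /var/log/hive/hivemetastore.log
```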
02-19-2016
01:09 PM
1 Kudo
Place the mysql-connector-java.jar into /usr/hdp/current/sqoop-client/lib. @rajdip chaudhuri review the docs: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_dataintegration/content/ch_using-sqoop.html
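Something like the following, assuming the standard HDP symlink layout; the source path of the jar is a placeholder:

```bash
# Copy the MySQL JDBC driver where the Sqoop client can find it
cp /path/to/mysql-connector-java.jar /usr/hdp/current/sqoop-client/lib/
```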
02-19-2016
11:42 AM
@Lubin Lemarchand please paste the XML snippet where you added the requested code. You probably didn't close a tag, or the code is outside the tags.
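If you have xmllint available, a quick well-formedness check will point at the first unclosed tag; the file name here is a placeholder:

```bash
# Prints nothing on success, or the line number of the first XML error
xmllint --noout your-config.xml
```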
02-19-2016
11:35 AM
@Steven Cardella try using the avro-tools utility to generate the schema file. It has a lot of helpful utilities. I remember fiddling with the schema quite a bit before it was accepted, but the avro-tools jar is essential.
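For example, avro-tools can pull the schema out of an existing Avro data file; the jar version and input file name below are assumptions, so use whichever you have:

```bash
# Extract the embedded schema from an Avro file into a .avsc schema file
java -jar avro-tools-1.8.2.jar getschema part-00000.avro > schema.avsc
```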
02-19-2016
11:29 AM
1 Kudo
@Shishir Saxena ExecuteProcess is intended for Linux commands; NiFi is not a scheduling tool, so you still need to use Oozie, especially if you want to maintain the order of your transactions. Oozie is still the solution for that, with a sqoop job maintaining the last record processed in the Sqoop metastore.
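A sketch of such a saved incremental job; the connection string, table, and check column are placeholders, but the options are standard Sqoop:

```bash
# Create a saved job; Sqoop records --last-value in its metastore
sqoop job --create daily_orders_import -- import \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --username myuser -P \
  --table orders \
  --incremental append \
  --check-column order_id \
  --last-value 0

# Each execution picks up where the previous run left off
sqoop job --exec daily_orders_import
```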
02-19-2016
11:24 AM
@R M is the Hive table also an HCatalog table? Run DESCRIBE on test_table before running the dump; usually that will give you the schema, but you're most likely not loading it correctly.
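For example (test_table is the name from the thread):

```bash
# Show the table's columns and types as Hive sees them
hive -e "DESCRIBE test_table;"
```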
02-19-2016
11:17 AM
@Prakash Punj you're most likely encountering this problem; please see https://community.hortonworks.com/questions/8518/hcatloader-error.html. There's a known bug in the sandbox.