Member since: 07-01-2016
Posts: 38
Kudos Received: 11
Solutions: 5
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1369 | 09-21-2016 12:23 AM
 | 1557 | 09-16-2016 01:10 PM
 | 1560 | 09-04-2016 05:47 PM
 | 2467 | 08-08-2016 01:44 AM
 | 1202 | 07-18-2016 12:09 AM
09-21-2016
12:23 AM
Hi, I researched further and found that the Kafka plugin in Ranger was not enabled. I enabled the Kafka plugin and restarted the services. After the restart, the Sqoop import job worked fine, including the Atlas hooks. Thanks, Ram
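As a hedged sketch (this is not from the original post): on HDP the Ranger Kafka plugin toggle typically lives in the `ranger-env` config type as `ranger-kafka-plugin-enabled`, and Ambari Server ships a `configs.sh` helper for reading and writing config types. The host, cluster name, and credentials below are placeholders:

```
# Check whether the Ranger Kafka plugin is enabled (assumed config key
# ranger-kafka-plugin-enabled in the ranger-env config type).
/var/lib/ambari-server/resources/scripts/configs.sh -u admin -p admin \
  get localhost MyCluster ranger-env | grep ranger-kafka-plugin-enabled

# Enable it (the same toggle the Ranger service page flips in the UI),
# then restart Kafka and Ranger from Ambari for it to take effect.
/var/lib/ambari-server/resources/scripts/configs.sh -u admin -p admin \
  set localhost MyCluster ranger-env ranger-kafka-plugin-enabled Yes
```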
03-14-2017
03:55 AM
I found a solution to this, provided by another user here: https://community.hortonworks.com/questions/20719/sqoop-to-sql-server-with-integrated-security.html Basically, you switch to the jTDS driver, which you can download here: http://jtds.sourceforge.net/

Per Rajendra Manjunath:

"Sqoop import from SQL Server to HDFS worked by passing the Windows credentials explicitly as driver parameters, because integrated security is not supported by the SQL Server JDBC driver as of now, due to Kerberos authentication (delegated tokens would have to be distributed over the cluster while running the MR job). So we need to pass the Windows username and password, with integrated security disabled, to import the data. As the normal SQL Server driver does not support this, I used jtds.jar and a different driver class to pull the data into the Hadoop lake. A sample command I tried on the server:

```
sqoop import --table Table1 \
  --connect "jdbc:jtds:sqlserver://<Hostname>:<Port>;useNTLMv2=true;domain=<WindowsDomainName>;databaseName=XXXXXXXXXXXXX" \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username XXXXX --password 'XXXXXXX' \
  --verbose --target-dir /tmp/33 -m 1 -- --schema dbo
```
"

Here are some examples that worked for me:

```
# List databases
sqoop list-databases \
  --connect "jdbc:jtds:myactivedirectorydomain.com" \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username XXXXX -P

# List tables
sqoop list-tables \
  --connect "jdbc:jtds:myactivedirectorydomain.com;databaseName=DATABASENAMEHERE" \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username jmiller.admin -P

# Pull data
sqoop import --table TABLENAMEHERE \
  --connect "jdbc:jtds:myactivedirectorydomain.com;databaseName=DATABASENAMEHERE" \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username XXXXX -P \
  --fields-terminated-by '\001' \
  --target-dir /user/XXXXX/20170313 -m 1 -- --schema dbo
```

Note: in the examples above, change the username to your own, and change the database name in the list-tables or import command to the one you need (the AD account you use will require access to the data).
09-04-2016
05:47 PM
I was able to identify my mistake. I had missed one step:

ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar

Once I executed that command, it started working. The OS is CentOS 7.2. Thank you for your help. Thanks, Ram
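For context, here is a minimal sketch of the full sequence on CentOS 7, assuming the connector jar comes from the OS package (if you downloaded the jar yourself, adjust the path accordingly):

```
# Install the MySQL JDBC driver; the package places the jar at
# /usr/share/java/mysql-connector-java.jar
sudo yum install -y mysql-connector-java

# Register the driver with Ambari Server so it can reach its MySQL database
sudo ambari-server setup --jdbc-db=mysql \
  --jdbc-driver=/usr/share/java/mysql-connector-java.jar

# Restart Ambari Server to pick up the change
sudo ambari-server restart
```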
01-25-2017
05:28 PM
I didn't have any data in the cluster, so it was easy for me to remove all the bits from the nodes and do a fresh install with 2.4. But if you have data in the cluster, it may be better to proceed with the cluster upgrade steps and verify them, since you have already upgraded Ambari to 2.4. Thanks, Ram
08-08-2016
01:44 AM
Hi all, I would like to post the solution that worked for me. I deleted the data from the following tables in the Ambari database:

a) request
b) stage
c) host_role_command
d) execution_command
e) requestoperationlevel
f) requestresourcefilter

Thank you for your help. Thanks, Ram
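As a hedged sketch only: the table names come from the post above, but the database name, the `ambari` user, and the delete order (child tables before `request`, to satisfy foreign-key constraints) are assumptions to verify against your Ambari version's schema. Stop Ambari Server and take a backup before running anything like this:

```
# Stop Ambari Server and back up its database first (assumes a MySQL
# database and user both named "ambari"; adjust to your setup).
sudo ambari-server stop
mysqldump -u ambari -p ambari > ambari-backup.sql

# Clear the stale request/command tables, child tables first so that
# foreign-key constraints on "request" are satisfied (assumed order).
mysql -u ambari -p ambari <<'SQL'
DELETE FROM execution_command;
DELETE FROM host_role_command;
DELETE FROM stage;
DELETE FROM requestresourcefilter;
DELETE FROM requestoperationlevel;
DELETE FROM request;
SQL

sudo ambari-server start
```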
07-24-2017
12:06 PM
I suggest you read this topic. It might be helpful.
07-18-2016
12:09 AM
Hi all, I researched this issue further and found an alternative solution. If you define the function as follows:

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.udf

val parsePatientfun = udf { (thestruct: Row) =>
  thestruct.getAs[String]("City")
}
```

you can get at the fields of the StructType. Thanks, Ram
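A minimal, self-contained usage sketch (the SparkSession setup, DataFrame, and column names are illustrative assumptions; only `parsePatientfun` and the "City" field come from the post, and the pattern matches Spark 2.x, close to the era of the post):

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.functions.{struct, udf}

val spark = SparkSession.builder().appName("struct-udf-demo").master("local[*]").getOrCreate()
import spark.implicits._

// Hypothetical data: build a struct column named "address" whose fields
// include "City", the field the UDF reads.
val df = Seq(("p1", "Dallas", "TX"), ("p2", "Austin", "TX"))
  .toDF("id", "City", "State")
  .select($"id", struct($"City", $"State").as("address"))

// A struct column arrives in a Scala UDF as a Row, so its fields can be
// read by name with getAs.
val parsePatientfun = udf { (thestruct: Row) =>
  thestruct.getAs[String]("City")
}

df.withColumn("city", parsePatientfun($"address")).show()
```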