Member since: 11-14-2017
Posts: 6
Kudos Received: 0
Solutions: 0
01-03-2018
04:02 PM
Hi @Benjamin Hopp Thanks for your response. Yes, the license JAR is in the same directory. Also, to let you know, in another environment (Stage) where the security mechanism is not enabled on the DB2 server, it works fine without passing the parameter. But on the Dev DB2 server the security mechanism is enabled, and it does not work when I try the securityMechanism=13 option. I'm not sure whether I'm passing it correctly or whether that is the correct way: ENCRYPTED_USER_PASSWORD_AND_DATA_SECURITY (13)
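For reference, a hedged note: with the IBM JCC driver, each property appended after the database name in the URL is expected to end with a semicolon, so the connect string would look like this (host and database are the placeholders from this thread):
jdbc:db2://xxxxxxxxm11.xxxx.com:50501/xxxmd11:securityMechanism=13;
Without the trailing semicolon the driver's URL tokenizer fails, which matches the ERRORCODE=-4461 error in the original post below.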
01-02-2018
09:49 PM
Hi, I have a DB2 database on which the security mechanism is enabled, and we are using server authentication. I'm using Sqoop to establish a JDBC connection with the command below and getting an error. The error says "Invalid database URL syntax", so I'm not sure how to pass the parameter. Can someone please suggest how to correctly pass securityMechanism on the Sqoop CLI?
sqoop list-tables --connect jdbc:db2://xxxxxxxxm11.xxxx.com:50501/xxxmd11:securityMechanism=13 --username xxxxxx --password xxxxxxxxxxxx
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.2.0-205/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.2.0-205/accumulo/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/01/02 21:43:22 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.2.0-205
18/01/02 21:43:22 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/01/02 21:43:22 INFO manager.SqlManager: Using default fetchSize of 1000
18/01/02 21:43:23 ERROR manager.SqlManager: Error reading database metadata: com.ibm.db2.jcc.am.SqlSyntaxErrorException: [jcc][10165][10051][4.19.72] Invalid database URL syntax:jdbc:db2://xxxxxxxxm11.xxxx.com:50501/xxxmd11:securityMechanism=13. ERRORCODE=-4461, SQLSTATE=42815
com.ibm.db2.jcc.am.SqlSyntaxErrorException: [jcc][10165][10051][4.19.72] Invalid database URL syntax: jdbc:db2://xxxxxxxxm11.xxxx.com:50501/xxxmd11:securityMechanism=13. ERRORCODE=-4461, SQLSTATE=42815
at com.ibm.db2.jcc.am.b4.a(b4.java:747)
at com.ibm.db2.jcc.am.b4.a(b4.java:66)
at com.ibm.db2.jcc.am.b4.a(b4.java:93)
at com.ibm.db2.jcc.DB2Driver.tokenizeURLProperties(DB2Driver.java:948)
at com.ibm.db2.jcc.DB2Driver.connect(DB2Driver.java:413)
at com.ibm.db2.jcc.DB2Driver.connect(DB2Driver.java:112)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.listTables(SqlManager.java:539)
at org.apache.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
Caused by: java.util.NoSuchElementException
at java.util.StringTokenizer.nextToken(StringTokenizer.java:349)
at java.util.StringTokenizer.nextToken(StringTokenizer.java:377)
at com.ibm.db2.jcc.DB2Driver.tokenizeURLProperties(DB2Driver.java:936)
... 14 more
Could not retrieve tables list from server
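A minimal sketch of the likely fix, assuming the IBM JCC URL property syntax in which each property after the database name must be terminated by a semicolon; host, database, and user are the placeholders from the command above:
sqoop list-tables \
  --connect "jdbc:db2://xxxxxxxxm11.xxxx.com:50501/xxxmd11:securityMechanism=13;" \
  --username xxxxxx -P
Quoting the URL keeps the shell from treating the semicolon as a command separator, and -P prompts for the password instead of exposing it on the command line, as the Sqoop warning above recommends. The Caused by: java.util.NoSuchElementException thrown from DB2Driver.tokenizeURLProperties is consistent with the driver running off the end of the property string when that terminating semicolon is missing.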
Labels:
- Apache NiFi
- Apache Sqoop
11-17-2017
03:35 PM
Hi @slachterman Just to let you know, with my current HDP setup I am able to connect to Azure Data Lake using the hdfs command. But from NiFi it is not working; as you suggested, I'm using the GenerateFlowFile and PutHDFS processors, and I'm getting the error below. The error points to a permission issue, but I have granted full rights to everyone on the ADLS store I'm using. Can you please suggest how to resolve this issue?
Failed to write to HDFS due to org.apache.hadoop.security.AccessControlException Permission denied: user=root, access=WRITE, inode="/hdp-lake":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
Thanks in advance for your help!
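A note on the likely cause, with a hedged sketch of a fix: the inode "/hdp-lake":hdfs:hdfs in the error shows the write is landing on the local HDP cluster's HDFS, not on ADLS, and the NiFi service user (root here) has no write access to a directory owned by hdfs. Assuming /hdp-lake is the intended target directory (taken from the error above), one common workaround is to open it up from the HDFS side, run as the hdfs superuser:
hdfs dfs -chown root:hdfs /hdp-lake
# or, more permissively:
hdfs dfs -chmod 777 /hdp-lake
Granting access on the ADLS side does not help here, because the AccessControlException is raised by the HDP cluster's own FSPermissionChecker, not by Azure.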
11-14-2017
04:53 PM
Hi Team, We have installed HDP on our Azure VM and are looking to integrate Azure Data Lake Store. I know this can be done using Cloudbreak, with the help of the Hortonworks documents below.
http://hortonworks.github.io/cloudbreak-docs/latest/onprem/
http://hortonworks.github.io/cloudbreak-docs/latest/azure/
However, we took the approach of manually installing the Cloudbreak deployer, and I need the exact Azure platform-specific single binaries to deploy Cloudbreak. I don't have that information and have not been able to find them. Can someone help with the details? Thanks in advance. Regards, Mahesh.
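For what it's worth, a minimal sketch of the manual single-binary flow the on-prem doc above describes. The download URL is deliberately left as a placeholder, since the exact binary location should come from that doc, and the cbd subcommands below are assumptions based on the Cloudbreak deployer CLI:
# download and unpack the platform-specific single binary named in the on-prem doc
curl -Ls <cloudbreak-deployer-tgz-url-from-onprem-doc> | tar -xz -C /bin cbd
# create a deployment directory with a Profile holding your settings, then:
mkdir cloudbreak-deployment && cd cloudbreak-deployment
cbd generate   # generates docker-compose.yml and certificates
cbd start      # pulls images and starts Cloudbreak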
Labels:
- Hortonworks Cloudbreak
11-14-2017
04:03 PM
Hi @slachterman, Can you please confirm whether this scenario works when NiFi is local to an HDI cluster that is configured with Data Lake? More details on the architecture: I tried the steps below and it didn't work.
1) We have HDP and NiFi installed on an Azure VM.
2) From that NiFi I am trying to connect to Azure HDInsight (another VM), which can talk to Data Lake.
3) All VMs are on the same Azure VNET and subscription.
4) When I followed the instructions and tested, files are written to the HDFS of the HDP cluster, but not to the Data Lake of the HDI cluster.
Can you please let me know exactly how to write files to Azure Data Lake? Thanks in advance. Regards, Mahesh.
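A possible explanation, with a hedged sketch: PutHDFS writes to whatever filesystem the core-site.xml in its Hadoop Configuration Resources property points at, so if that file comes from the local HDP cluster, flow files land on local HDFS. To target Data Lake directly, the default filesystem (or the Directory property) must be an adl:// URI and the configuration must carry the ADLS OAuth credentials. The property names below follow the HDP-era hadoop-azure-datalake connector; the account name, client ID, secret, and tenant ID are placeholders:
<property><name>fs.defaultFS</name><value>adl://youraccount.azuredatalakestore.net</value></property>
<property><name>dfs.adls.oauth2.access.token.provider.type</name><value>ClientCredential</value></property>
<property><name>dfs.adls.oauth2.client.id</name><value>YOUR_CLIENT_ID</value></property>
<property><name>dfs.adls.oauth2.credential</name><value>YOUR_CLIENT_SECRET</value></property>
<property><name>dfs.adls.oauth2.refresh.url</name><value>https://login.microsoftonline.com/YOUR_TENANT_ID/oauth2/token</value></property>
NiFi's PutHDFS also needs the ADLS client jars (for example azure-data-lake-store-sdk and the hadoop-azure-datalake jar) available via its Additional Classpath Resources property.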