Member since: 07-21-2016
Posts: 101
Kudos Received: 10
Solutions: 4
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3804 | 02-15-2020 05:19 PM
 | 69099 | 10-02-2017 08:22 PM
 | 1493 | 09-28-2017 01:55 PM
 | 1708 | 07-25-2016 04:09 PM
08-13-2018 07:30 PM
Hi, I need to access HiveServer2 from my laptop through the Beeline client. How do I install the Beeline client? I'm on macOS. I tried "brew install beeline", but it looks like that's not right. Let me know if someone has the same setup. Thanks, Kumar
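A minimal sketch of one way to get Beeline on macOS, assuming Homebrew is in use: Beeline ships as part of Apache Hive (there is no standalone beeline formula), so installing the hive formula provides the beeline binary. The host, port, and user below are placeholders.

```
# Beeline is bundled with Apache Hive; install the hive formula instead:
brew install hive

# Then connect to the remote HiveServer2 (host/port/user are placeholders):
beeline -u "jdbc:hive2://hiveserver2.example.com:10000/default" -n myuser
```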
Labels:
- Apache Hive
07-13-2018 06:47 PM
@Vinicus Higa Murakami ...it worked! I just had to do one extra step: in my case I converted my *.crt file into a *.pem file, and then used that *.pem file to generate the JKS. Thanks, Kumar
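For anyone who lands here later, a hedged sketch of that extra step, assuming the .crt is DER-encoded (file names, alias, and store password below are placeholders; drop -inform DER if the file is already PEM):

```
# Convert the DER-encoded .crt to PEM:
openssl x509 -inform DER -in sqlserver.crt -out sqlserver.pem -outform PEM

# Import the PEM certificate into a new JKS truststore:
keytool -importcert \
  -file sqlserver.pem \
  -alias sqlserver-ca \
  -keystore truststore.jks \
  -storepass changeit -noprompt
```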
07-09-2018 05:03 PM
The goal here is to import data from a MS SQL Server database into HDFS. The connectivity between the Hadoop cluster and MS SQL Server works fine; I confirmed this by telnetting to port 1433. I am also able to run --list-tables:

[root@api1.dev ~]# sudo -u XXXXXXX /usr/hdp/current/sqoop-client/bin/sqoop list-tables --connect "jdbc:sqlserver://XX.XX.XXX.XXX:1433;database=XXXXXXXXXX;username=XXXXXXXX;password=XXXXXXXX"
Warning: /usr/hdp/2.6.4.0-91/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/hdp/2.6.4.0-91/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/07/09 16:44:43 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.4.0-91
18/07/09 16:44:43 INFO manager.SqlManager: Using default fetchSize of 1000
XXXXXXXXXXXXXX<the table name >
The DBAs have enabled SSL encryption on the database side and have shared the SSL cert, asking us to use it when we pull data out of the database. I went through the JDBC documentation at https://docs.microsoft.com/en-us/sql/connect/jdbc/connecting-with-ssl-encryption?view=sql-server-2017, and here is the command I arrived at:

[root@api1.dev ~]# sudo -u XXXXXXXX /usr/hdp/current/sqoop-client/bin/sqoop import --connect "jdbc:sqlserver://XX.XXX.XXXX.XXX:1433;database=XXXXXXXXXX;username=XXXXXXXXXX;password=XXXXXXXX;encrypt=true;trustServerCertificate=false;trustStore=/etc/pki/CA/certs/XXXXXXXXXXXXX.crt" --table XXXXXXXXXX --fields-terminated-by , --escaped-by \\ --enclosed-by '"' --compress -m 1 --target-dir /user/XXXXXXXXXXXX/ --append --hive-drop-import-delims -- --schema dbo --table-hints NOLOCK

Here is the exception I get:

INFO: java.security path: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.161-0.b14.el7_4.x86_64/jre/lib/security
Security providers: [SUN version 1.8, SunRsaSign version 1.8, SunEC version 1.8, SunJSSE version 1.8, SunJCE version 1.8, SunJGSS version 1.8, SunSASL version 1.8, XMLDSig version 1.8, SunPCSC version 1.8]
KeyStore provider info: SUN (DSA key/parameter generation; DSA signing; SHA-1, MD5 digests; SecureRandom; X.509 certificates; JKS & DKS keystores; PKIX CertPathValidator; PKIX CertPathBuilder; LDAP, Collection CertStores, JavaPolicy Policy; JavaLoginConfig Configuration)
java.ext.dirs: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.161-0.b14.el7_4.x86_64/jre/lib/ext:/usr/java/packages/lib/ext
18/07/09 16:50:26 ERROR manager.SqlManager: Error executing statement: com.microsoft.sqlserver.jdbc.SQLServerException: The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "Invalid keystore format". ClientConnectionId:daf3f972-6029-4629-8817-7bb8ac260c5c
com.microsoft.sqlserver.jdbc.SQLServerException: The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "Invalid keystore format". ClientConnectionId:daf3f972-6029-4629-8817-7bb8ac260c5c
at com.microsoft.sqlserver.jdbc.SQLServerConnection.terminate(SQLServerConnection.java:1667)
at com.microsoft.sqlserver.jdbc.TDSChannel.enableSSL(IOBuffer.java:1668)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:1323)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:991)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:827)
at com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:1012)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:270)
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:902)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:328)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1853)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1653)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
Caused by: java.io.IOException: Invalid keystore format
at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:658)
at sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:56)
at sun.security.provider.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:224)
at sun.security.provider.JavaKeyStore$DualFormatJKS.engineLoad(JavaKeyStore.java:70)
at java.security.KeyStore.load(KeyStore.java:1445)
at com.microsoft.sqlserver.jdbc.TDSChannel.enableSSL(IOBuffer.java:1525)
... 25 more
After some more reading, it appears the certificate needs to be converted into JKS format, since the trustStore connection property expects a Java keystore rather than a raw .crt file. Has anybody been in this situation?
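A hedged sketch of that conversion, assuming the shared .crt is a standard X.509 certificate (the alias, JKS path, and password below are placeholders):

```
# Import the DBA-supplied certificate into a new JKS truststore:
keytool -importcert \
  -file /etc/pki/CA/certs/XXXXXXXXXXXXX.crt \
  -alias mssql-ca \
  -keystore /etc/pki/CA/certs/mssql-truststore.jks \
  -storepass changeit -noprompt
```

The JDBC URL would then reference the JKS file instead of the raw .crt, e.g. ...;encrypt=true;trustServerCertificate=false;trustStore=/etc/pki/CA/certs/mssql-truststore.jks;trustStorePassword=changeit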
Labels:
- Apache Sqoop
06-26-2018 09:51 PM
I am running a simple Sqoop import command, which in turn runs a MapReduce job. I am testing this after standing up the cluster, and I see this error in the logs:

[2018-06-26 21:02:59,672] {bash_operator.py:76} INFO - main : requested yarn user is ods_archive
[2018-06-26 21:02:59,673] {bash_operator.py:76} INFO - Path /disk1/hadoop/yarn/local/usercache/ods_archive/appcache/application_1530031055103_0009 has permission 700 but needs permission 750.
[2018-06-26 21:02:59,673] {bash_operator.py:76} INFO - Path /disk2/hadoop/yarn/local/usercache/ods_archive/appcache/application_1530031055103_0009 has permission 700 but needs permission 750.
[2018-06-26 21:02:59,673] {bash_operator.py:76} INFO - Path /disk3/hadoop/yarn/local/usercache/ods_archive/appcache/application_1530031055103_0009 has permission 700 but needs permission 750.
[2018-06-26 21:02:59,674] {bash_operator.py:76} INFO - Path /disk4/hadoop/yarn/local/usercache/ods_archive/appcache/application_1530031055103_0009 has permission 700 but needs permission 750.
[2018-06-26 21:02:59,674] {bash_operator.py:76} INFO - Path /disk5/hadoop/yarn/local/usercache/ods_archive/appcache/application_1530031055103_0009 has permission 700 but needs permission 750.
[2018-06-26 21:02:59,674] {bash_operator.py:76} INFO - Path /disk6/hadoop/yarn/local/usercache/ods_archive/appcache/application_1530031055103_0009 has permission 700 but needs permission 750.
After checking the file system, I see the directories are missing the group read/execute permissions:
[root@node3.dev appcache]# pwd
/disk1/hadoop/yarn/local/usercache/ods_archive/appcache
drwx--S--- 4 ods_archive hadoop 4096 Jun 26 19:20 application_1530031055103_0005
drwx--S--- 4 ods_archive hadoop 4096 Jun 26 19:26 application_1530031055103_0006
drwx--S--- 4 ods_archive hadoop 4096 Jun 26 19:38 application_1530031055103_0008
drwx--S--- 4 ods_archive hadoop 4096 Jun 26 21:04 application_1530031055103_0009
drwx--S--- 4 ods_archive hadoop 4096 Jun 26 21:10 application_1530031055103_0010
drwx--S--- 4 ods_archive hadoop 4096 Jun 26 21:16 application_1530031055103_0011
drwx--S--- 4 ods_archive hadoop 4096 Jun 26 21:23 application_1530031055103_0012
Am I missing a setting somewhere for these permissions?
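One hedged guess, assuming the 700 mode comes from a restrictive umask on the YARN user: a umask of 0077 produces 700 directories where the container executor expects 750, while 0027 would yield 750. A sketch of checking and correcting it; the remediation path assumes the usual HDP/Ambari layout, so verify on your cluster:

```
# Roughly check the yarn user's default umask (0027 would yield 750 dirs):
sudo -u yarn sh -c 'umask'

# If it prints 0077, set "umask 0027" in yarn-env via Ambari
# (Services > YARN > Configs > Advanced yarn-env) and restart the NodeManagers.

# Existing appcache directories can be fixed in place:
chmod -R g+rx /disk1/hadoop/yarn/local/usercache/ods_archive/appcache
```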
Labels:
- Apache YARN
06-13-2018 06:29 PM
Team, I built a new cluster and we have jobs that pull data out of MS SQL Server. MS SQL Server listens on port 1433, and our network security team has declined to open the firewall between our Hadoop cluster and MS SQL Server, saying that port 1433 is a non-secure port. The MS SQL DBAs said they cannot enable SSL on the DB side because other (legacy) applications would then be unable to connect to MS SQL Server. So from the Hadoop side we need to ensure our connections are secure. Has anybody faced this situation? Thanks, Kumar
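One possible direction, offered only as a hedged sketch (not confirmed for this setup): tunnel port 1433 over SSH through a host the firewall team will allow, so the traffic crossing the network boundary is encrypted while the database still sees a plain local 1433 connection. The bastion host and local port below are placeholders:

```
# Forward local port 11433 to SQL Server via an SSH-reachable bastion host:
ssh -N -L 11433:XX.XX.XXX.XXX:1433 user@bastion.example.com &

# Point Sqoop at the local end of the encrypted tunnel:
sqoop list-tables --connect "jdbc:sqlserver://localhost:11433;database=XXXXXXXXXX;username=XXXXXXXX;password=XXXXXXXX"
```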
Labels:
- Apache Hadoop
- Apache Sqoop
04-24-2018 03:30 PM
@Jay Kumar Sensharma Thanks for your reply. It looks like there were stale alerts; all the alerts went away after I restarted the Ambari agents.
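For reference, the restart itself is just the standard Ambari agent CLI, run on each node that reported stale alerts:

```
# On every affected node:
ambari-agent restart
```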
04-24-2018 02:09 PM
The nodes in my cluster do not have direct access to the Internet, so we were given proxies. I am referring to this document: https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.0.0/bk_ambari-administration/content/ch_setting_up_an_internet_proxy_server_for_ambari.html. I am trying to add no_proxy entries for a few nodes, but as soon as I add more than one node, pipe-delimited, the Ambari server no longer restarts. Has anybody faced this situation?
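A hedged guess at the failure mode: the -Dhttp.nonProxyHosts value in /var/lib/ambari-server/ambari-env.sh is parsed by the shell, so unquoted | characters are treated as pipelines and break the startup script. A sketch with placeholder hostnames, keeping the whole option inside the existing quoted AMBARI_JVM_ARGS string:

```
# In /var/lib/ambari-server/ambari-env.sh -- the pipes are safe because the
# entire value stays inside double quotes (hosts and proxy are placeholders):
export AMBARI_JVM_ARGS="$AMBARI_JVM_ARGS -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=3128 -Dhttp.nonProxyHosts=node1.example.com|node2.example.com|localhost"
```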
Labels:
- Apache Ambari
- Apache Hadoop
04-24-2018 01:40 PM
I have installed a brand-new cluster and all the services are up and running, but Ambari shows alerts for all the Web UIs. Here are a few examples:

DataNode Web UI: Connection failed to http://node1.dev.XXXXXXXX.io:50075 (timed out)
Resource Manager Web UI: Connection failed to http://api1.dev.XXXXXXX.io:8088 (timed out)

Does anybody have an idea which logs I need to check to get more details?
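One quick sanity check (a sketch reusing the hostnames from the alert text): confirm from the Ambari server host whether the ports answer at all, since a timeout usually points at connectivity or the agent rather than the service itself. The agent that runs these alert checks typically logs to /var/log/ambari-agent/ambari-agent.log.

```
# Run from the Ambari server host; a connect timeout suggests network/agent issues:
curl -sv --max-time 10 -o /dev/null http://node1.dev.XXXXXXXX.io:50075
curl -sv --max-time 10 -o /dev/null http://api1.dev.XXXXXXX.io:8088
```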
Labels:
- Apache Hadoop
04-17-2018 05:06 PM
In my PROD environment, the infrastructure team is going to patch the top-of-rack switches, and I learned that we do not have HA enabled on the switch side. My understanding is that my cluster would not function with the switch down. Am I correct? Also, what services do I need to stop? Thanks, Kumar
Labels:
- Apache Hadoop