Member since: 12-28-2018
Posts: 10
Kudos Received: 0
Solutions: 0
02-07-2019
06:39 PM
Hi, I found the solution; no duplicate data appears any more. Thank you
sqoop import --connect 'jdbc:sqlserver://XX.XXX;database=XXXX' --username XX --password XXXX --table ActionItems --split-by ActionId --target-dir /user/hive/warehouse/project_tracking.db/actionitems --incremental append --check-column Lastupdateddate --last-value "2019-02-06 19:52:55.873"
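If this import has to run on a schedule, a saved Sqoop job can track the last-value automatically instead of passing it by hand. A minimal sketch, assuming the same connection details as above; the job name and password file path are hypothetical:

# Create a saved job; the Sqoop metastore stores and updates --last-value after each run.
sqoop job --create actionitems_append -- import \
  --connect 'jdbc:sqlserver://XX.XXX;database=XXXX' \
  --username XX --password-file /user/XX/sqlserver.password \
  --table ActionItems --split-by ActionId \
  --target-dir /user/hive/warehouse/project_tracking.db/actionitems \
  --incremental append --check-column Lastupdateddate

# Run the job; each execution imports only rows newer than the stored last-value.
sqoop job --exec actionitems_append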
02-04-2019
02:25 PM
Hi all, when I try to do an incremental load from SQL Server to Hive, I get duplicate data. I have only 2 rows in SQL Server; if I run this command again, I get another 2 rows in Hive, i.e. duplicates appear. Please provide a solution.
sqoop import --connect 'jdbc:sqlserver://XXX.XXX;database=PTWTarget' --username un --password Admin --table ActionItems --incremental lastmodified --check-column Lastupdateddate --merge-key ActionId --num-mappers 1 --hive-import --hive-table project_tracking.actionitems2
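For reference, the merge path that deduplicates in lastmodified mode writes to a plain HDFS target directory rather than going through --hive-import; Sqoop then merges new rows into the existing files by the --merge-key. A hedged sketch along those lines, with placeholder connection details and a hypothetical --last-value, assuming the Hive table is defined over that directory in a matching text format:

# lastmodified + merge-key against the HDFS directory backing the Hive table.
# Rows whose ActionId already exists are overwritten instead of appended.
sqoop import \
  --connect 'jdbc:sqlserver://XXX.XXX;database=PTWTarget' \
  --username un -P \
  --table ActionItems \
  --incremental lastmodified --check-column Lastupdateddate \
  --merge-key ActionId \
  --target-dir /user/hive/warehouse/project_tracking.db/actionitems2 \
  --last-value "2019-02-04 00:00:00.000" \
  --num-mappers 1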
Labels:
- Apache Hadoop
- Apache Hive
- Apache Sqoop
01-31-2019
07:02 PM
sqoop import-all-tables --driver=com.microsoft.sqlserver.jdbc.SQLServerDriver --connect jdbc:sqlserver://XX.XX.XXX:1433; database=PTWTarget --username ML_Testing --password Admin --hive-database PTWTarget --hive-import
I tried it like this, but I'm getting an error. Please validate my command and provide a solution.
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host x.x.x.x, port 1433 has failed. Error: "connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".
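Since the exception is a connect timeout rather than a Sqoop problem, a quick check from the Sqoop client host can confirm whether the SQL Server port is reachable at all; single-quoting the whole JDBC URL also keeps the shell from splitting it at the semicolon. A small sketch with the same placeholder host details:

# Is port 1433 reachable from this node? (host is a placeholder)
nc -vz XX.XX.XXX 1433

# Once the port answers, a cheap end-to-end test of the JDBC settings:
sqoop list-tables \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  --connect 'jdbc:sqlserver://XX.XX.XXX:1433;database=PTWTarget' \
  --username ML_Testing -P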
01-31-2019
12:19 PM
Hi, I have a query like this:
sqoop import --driver=com.microsoft.sqlserver.jdbc.SQLServerDriver --connect "jdbc:sqlserver://XX.XX.XXX:1433; database=PTWTarget" --username sa --P --table ActionItems --create-hive-tables --target-dir /sqltohive/actionitems --hive-import -m 1
When I execute it, I get a "TCP/IP connection refused" error.
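Connection refused usually means nothing is listening on that host/port (TCP/IP disabled in SQL Server Configuration Manager, wrong port, or a firewall), so the command itself may be fine once connectivity is sorted out. For completeness, a hedged rewrite of the same import with the JDBC URL single-quoted, -P for the password prompt, and the flag spelled --create-hive-table (singular); host and paths are the same placeholders as in the question:

sqoop import \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  --connect 'jdbc:sqlserver://XX.XX.XXX:1433;database=PTWTarget' \
  --username sa -P \
  --table ActionItems \
  --hive-import --create-hive-table \
  --target-dir /sqltohive/actionitems \
  -m 1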
01-30-2019
04:16 PM
Hi, please provide a solution for importing all tables from SQL Server to Hive.
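The usual tool for this is sqoop import-all-tables; it needs every table to have a primary key unless the mapper count is reset to one. A minimal sketch with placeholder connection details; the Hive database name below is hypothetical:

# Imports every table in the source database into Hive, one table at a time.
sqoop import-all-tables \
  --connect 'jdbc:sqlserver://XX.XX.XXX:1433;database=PTWTarget' \
  --username XX -P \
  --hive-import --hive-database project_tracking \
  --warehouse-dir /user/hive/warehouse \
  --autoreset-to-one-mapper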
Labels:
- Apache Hive
- Apache Sqoop
01-28-2019
09:40 PM
Hi, thank you for your reply. I tried this method to insert CSV data into an HBase table and that works fine. My question is: I have a list of flat files (Word, Excel, images) in my HDFS directory, and I want to store all of these files in one HBase table as objects. I still haven't found a solution for this; please provide any suggestions. Thank you
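One common pattern for mixed binary files (Word, Excel, images) is to leave the bytes in HDFS and keep only a pointer plus metadata per file in HBase, since very large cells are awkward to store directly. A minimal sketch in the HBase shell; the table name 'documents', column family 'f', and paths are all hypothetical:

# Run inside `hbase shell`; row key = file name, cells = HDFS path and type.
create 'documents', 'f'
put 'documents', 'report1.docx', 'f:hdfs_path', '/user/data/docs/report1.docx'
put 'documents', 'report1.docx', 'f:content_type', 'docx'
get 'documents', 'report1.docx'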
01-28-2019
10:47 AM
I have this kind of unstructured data stored in HDFS. Please provide a suggestion for loading this unstructured data into an HBase table so that the data can be viewed.
01-25-2019
05:18 PM
Hi, I have a list of document files in HDFS; it contains .csv, Excel, image, PDF files, etc. I want to load these documents into an HBase table. Please provide suggestions.
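For the structured part (the .csv files), one approach is HBase's bundled ImportTsv MapReduce job; the binary formats (Word, images, PDF) are usually left in HDFS with only metadata rows in HBase, as in the sketch under the post above. A hedged sketch, assuming a pre-created table 'actionitems_csv' with column family 'cf'; column names and paths are placeholders:

# Table must exist first, e.g. in the HBase shell: create 'actionitems_csv', 'cf'
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  -Dimporttsv.separator=',' \
  -Dimporttsv.columns=HBASE_ROW_KEY,cf:col1,cf:col2 \
  actionitems_csv /user/data/docs/actionitems.csv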
Labels:
- Apache Hadoop
- Apache HBase