Member since: 02-01-2016
Posts: 71
Kudos Received: 36
Solutions: 5
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3445 | 06-27-2019 10:09 AM |
| | 1494 | 01-27-2017 05:22 AM |
| | 1703 | 01-06-2017 05:05 AM |
| | 2499 | 11-17-2016 05:37 AM |
| | 2936 | 03-03-2016 12:28 PM |
02-10-2016
09:19 AM
1 Kudo
Unable to query Hive tables: I was able to retrieve the schema of the tables, but I couldn't get the data. What could be causing this?
Labels:
- Apache HCatalog
- Apache Hive
02-10-2016
06:44 AM
1 Kudo
Sqoop free-form query import to HBase: when using multiple mappers, the same rows are imported multiple times.

sqoop import --connect "jdbc:sqlserver://;database=;username=;password=" --query 'select top 100000 * from where $CONDITIONS' --split-by ID --hbase-table --column-family info --hbase-create-table -m 4

This import loads 400,000 rows instead of the expected 100,000.
Labels:
- Apache Hadoop
- Apache HBase
- Apache Sqoop
02-09-2016
12:26 PM
1 Kudo
@Predrag Minovic seems fine, but we are currently using the ORC format for our Hive tables. Is the same format possible for external tables as well?
02-09-2016
11:43 AM
1 Kudo
@Geoffrey Shelton Okot I am having an issue while importing to a Hive table; importing to HDFS works perfectly fine.
02-09-2016
11:42 AM
1 Kudo
@Geoffrey Shelton Okot
sqoop import --connect "jdbc:sqlserver://xxxxx:1433;database=xxxxxx;username=xxxxxx;password=xxxxx" --table <table> --hive-import --hive-database <DBname> --incremental append --check-column <name> --last-value 3 -m 1
02-09-2016
10:18 AM
1 Kudo
I have been trying to do an incremental import to a Hive table using Sqoop, but unfortunately it fails with: "Append mode for hive imports is not yet supported. Please remove the parameter --append-mode".
Labels:
- Apache Hive
- Apache Sqoop
02-01-2016
12:37 PM
1 Kudo
@Neeraj Sabharwal The source system is MS SQL Server.
02-01-2016
12:33 PM
1 Kudo
Hi Neeraj, the source system is MS SQL Server.
02-01-2016
12:32 PM
1 Kudo
Hi Benjamin, thank you for the inputs. We are looking at a lambda architecture in which we pull the data from the RDBMS into Kafka; from there we would use Spark for batch processing and Storm for streaming.

Currently we use Sqoop to import data from the RDBMS into Hive/HBase, but in the future we want Kafka to work as the data ingestion tool. I have been going through a lot of forums about Kafka lately, but I have never read about ingestion from a database. Also, can we integrate Sqoop and Kafka to work together: implement incremental import from the RDBMS to Kafka using Sqoop, provide the same data to Spark for batch processing, and update the Hive tables from there?
02-01-2016
09:19 AM
2 Kudos
Hi, we are currently implementing a POC in which we need to import data from an RDBMS. Previously we used Sqoop for this and it worked fine. Now we need to pull the data using Kafka for real-time processing. How can we implement this?
Labels:
- Apache Kafka