Member since: 10-20-2018
Posts: 9
Kudos Received: 1
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 27601 | 02-22-2019 04:37 PM
02-22-2019
04:37 PM
1 Kudo
Yes, try the Oracle TO_TIMESTAMP() format if needed.
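A minimal sketch of that suggestion, assuming an Oracle source and a free-form Sqoop query; the connection string, column names (col1..col4), table name (src_tbl), and format mask are placeholders, not details from the original thread:

```shell
# Hypothetical sketch: apply Oracle's TO_TIMESTAMP() inside the Sqoop
# free-form query so the problem column is imported as a timestamp.
# All identifiers and paths below are illustrative placeholders.
sqoop import \
  --connect jdbc:oracle:thin:@//db-host:1521/ORCL \
  --username "$DB_USER" --password-file /user/etl/.db_pass \
  --query "SELECT col1, col2, col3,
                  TO_TIMESTAMP(col4, 'YYYY-MM-DD HH24:MI:SS.FF') AS col4
           FROM src_tbl WHERE \$CONDITIONS" \
  --target-dir /data/dev1/staging/src_tbl \
  --split-by col1
```

The `WHERE \$CONDITIONS` token is required by Sqoop for free-form queries so it can partition the import across mappers.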
02-22-2019
02:10 PM
Can you try doing cast(col4 as timestamp) in your Sqoop query? I'm not sure whether it will work, but give it a try.
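A hedged sketch of the cast idea above, wrapped in a Sqoop free-form query; the JDBC URL, column names, and table name (src_tbl) are placeholders:

```shell
# Hypothetical sketch: wrap the column in an ANSI CAST inside the
# free-form --query. Identifiers below are placeholders, not values
# confirmed by the thread.
sqoop import \
  --connect "$JDBC_URL" \
  --username "$DB_USER" --password-file /user/etl/.db_pass \
  --query "SELECT col1, col2, col3, CAST(col4 AS TIMESTAMP) AS col4
           FROM src_tbl WHERE \$CONDITIONS" \
  --target-dir /data/dev1/staging/src_tbl \
  --split-by col1
```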
02-22-2019
02:07 PM
The requirement is to create a Parquet file and send it downstream. The issue is that the Parquet file is always created empty, even though select * from prov_cust returns data.
02-03-2019
05:22 PM
Hi Team, I am trying to create a Parquet file with the statement below:
INSERT OVERWRITE DIRECTORY '/data/dev1/consumption/crm' STORED AS PARQUET SELECT * FROM PROV_CUST;
However, the output file is always empty even when there is data in the table. Could someone please assist? Thanks
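A troubleshooting sketch for the empty-output symptom: confirm in the same session that the source table actually returns rows, re-run the export, then inspect the target directory. The table name and target path come from the post; everything else is an assumption.

```shell
# Step 1: verify the source table returns rows from this Hive session.
hive -e "SELECT COUNT(*) FROM prov_cust;"

# Step 2: re-run the export exactly as in the post.
hive -e "INSERT OVERWRITE DIRECTORY '/data/dev1/consumption/crm'
         STORED AS PARQUET
         SELECT * FROM prov_cust;"

# Step 3: list the target directory; zero-byte files here confirm
# the writer produced no rows, while non-empty files would suggest
# the downstream reader is the problem instead.
hdfs dfs -ls /data/dev1/consumption/crm
```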
Labels:
- Apache Spark
- HDFS
01-03-2019
03:36 PM
Thanks, that worked like a charm.
01-02-2019
12:20 PM
Sqoop import fails on empty table
We have a batch job that looks at a Teradata table and offloads delta records to Hive based on a timestamp column.
We haven't used Sqoop incremental mode; instead, we have set up views on top of the Teradata tables and joined them to a control table to get the delta records.
Once the offload completes, we update the control table with the MAX timestamp.
However, we are stuck with the following error: Import failed: com.teradata.connector.common.exception.ConnectorException: Input source table is empty
Could someone assist with this?
Regards, SV
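One way to avoid this failure is to skip the import when the delta view returns no rows. A minimal sketch, assuming a wrapper script around the batch job; the view name (delta_view), connection details, and output parsing are placeholders and the `sqoop eval` output handling in particular is fragile and would need adapting:

```shell
# Hypothetical guard: count the delta rows first, and only run the
# Sqoop import when there is something to offload, so the connector
# never sees an empty input source. All names here are placeholders.
row_count=$(sqoop eval \
  --connect jdbc:teradata://td-host/DATABASE=mydb \
  --username "$TD_USER" --password-file /user/etl/.td_pass \
  --query "SELECT COUNT(*) FROM delta_view" \
  | grep -o '[0-9]\+' | head -n 1)

if [ "${row_count:-0}" -eq 0 ]; then
  echo "No delta rows in delta_view; skipping Sqoop import."
else
  sqoop import \
    --connect jdbc:teradata://td-host/DATABASE=mydb \
    --username "$TD_USER" --password-file /user/etl/.td_pass \
    --table delta_view \
    --target-dir /data/dev1/staging/delta
fi
```

The control-table update described in the post would then also be skipped on empty runs, so the MAX timestamp watermark is only advanced when records were actually offloaded.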
Labels:
- Apache Hive
- Apache Sqoop