Member since: 05-02-2017
Posts: 360
Kudos Received: 65
Solutions: 22
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 13488 | 02-20-2018 12:33 PM |
| | 1531 | 02-19-2018 05:12 AM |
| | 1889 | 12-28-2017 06:13 AM |
| | 7187 | 09-28-2017 09:25 AM |
| | 12241 | 09-25-2017 11:19 AM |
05-06-2017
08:54 AM
1 Kudo
Hi @Raj B Are you able to view the base file of the Hive ORC table in HDFS? If so, create a new table with the same structure but with TEXTFILE as the format, then copy the HDFS file from the old table to the new table. Copy the base file directly, not via the Hive table. It should work well for you!
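A minimal sketch of that approach; the table names (`orders_orc`, `orders_text`) and HDFS paths are placeholders, not from the original thread:

```sql
-- Recreate the table structure, then switch the new table to TEXTFILE.
-- orders_orc / orders_text are placeholder names for illustration.
CREATE TABLE orders_text LIKE orders_orc;
ALTER TABLE orders_text SET FILEFORMAT TEXTFILE;

-- Copy the base file directly in HDFS from the Hive CLI
-- (both warehouse paths are placeholders).
dfs -cp /apps/hive/warehouse/orders_orc/base_0000001 /apps/hive/warehouse/orders_text/;
```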
05-05-2017
07:06 AM
@Vinay R Glad it helped you. If it solves your problem, please accept the answer.
05-05-2017
06:19 AM
3 Kudos
Hi @Vinay R What is the format of table1 in Hive? Is it stored in any format other than TEXTFILE? If so, the data might be compressed if you have enabled compression in your table properties. Execute this and then try exporting again: set hive.exec.compress.output=false;
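A short sketch of the export with compression turned off; the output directory and delimiter are placeholders, not from the thread:

```sql
-- Disable compressed query output before exporting.
SET hive.exec.compress.output=false;

-- Write table1 out as plain delimited text
-- (/tmp/table1_export is a placeholder path).
INSERT OVERWRITE DIRECTORY '/tmp/table1_export'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT * FROM table1;
```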
05-04-2017
07:33 AM
Hi @Atul Goel Could you please share the table DDL?
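For reference, a quick way to pull the DDL in Hive (the table name is a placeholder):

```sql
-- Print the full CREATE TABLE statement for the table in question.
SHOW CREATE TABLE your_table;
```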
05-02-2017
08:05 PM
1 Kudo
Hi @Ishvari Dhimmar As stated by Ervits, after flattening the Pig output to remove any bags, use that file to load into a Hive table. For the load, use "LOAD DATA INPATH '<pig output file>' INTO TABLE <hive table name>". Ensure that the format of the Pig output file is compatible with Hive's native formats. If you do so, you will achieve what you are looking for.
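A minimal sketch of that load; the HDFS path and table name are placeholders, not from the thread:

```sql
-- Move the flattened Pig output from its HDFS location into the Hive table.
-- Note that LOAD DATA INPATH moves (not copies) the source file.
LOAD DATA INPATH '/user/pig/output/part-r-00000' INTO TABLE pig_results;
```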
05-02-2017
07:55 PM
Hi @Raphaël MARY If you are using Greenplum, there is an existing protocol that will take care of your use case: the gphdfs protocol. It is simple and easy, but it supports only TEXT and CSV as of now. Check this link for details on the gphdfs protocol: https://discuss.pivotal.io/hc/en-us/articles/202635496-How-to-access-HDFS-data-via-GPDB-external-table-with-gphdfs-protocol
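A hedged sketch of such an external table; the NameNode host/port, file path, and column list are placeholders:

```sql
-- Greenplum external table reading a CSV file from HDFS via gphdfs.
CREATE EXTERNAL TABLE ext_hdfs_data (
    id   int,
    name text
)
LOCATION ('gphdfs://namenode-host:8020/data/input.csv')
FORMAT 'CSV';
```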
05-02-2017
07:51 PM
Hi @rajdip chaudhuri UPDATE is supported only for Hive tables with the ACID property enabled. Also, looking at your query, I'm afraid Hive will not support such a use case as of now; your existing query will fail. However, once the MERGE statement is added you will be able to update using a join. Check this Jira ticket: https://issues.apache.org/jira/browse/HIVE-10924 Alternatively, load the result of the join into a temp table and then update the target based on that temp table.
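A rough sketch of that temp-table workaround, with all table and column names as placeholders and assuming the target table can be rewritten in full:

```sql
-- Materialize the join result into a temporary table.
CREATE TEMPORARY TABLE joined AS
SELECT t.id,
       COALESCE(s.val, t.val) AS val  -- take the new value where one exists
FROM target t
LEFT JOIN source s ON t.id = s.id;

-- Rewrite the target from the temp table, since UPDATE ... JOIN
-- is not supported before the MERGE statement lands.
INSERT OVERWRITE TABLE target
SELECT id, val FROM joined;
```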
04-28-2017
09:20 PM
@Wynner Thanks. It seems the problem was that I was not running NiFi as administrator; fixing that solved my issue, and I am able to start NiFi through cmd. But now one more problem has come up. I have attached a screenshot; I appreciate your help.
04-28-2017
10:26 AM
@Wynner When I run run-nifi from the C drive, it works fine. But if I copy NiFi and its related files to the D drive and run run-nifi from that directory, it throws an error.