Member since: 12-21-2016
Posts: 83
Kudos Received: 5
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 42064 | 02-08-2017 05:56 AM
 | 5902 | 01-02-2017 11:05 PM
05-17-2017
06:22 AM
Can you elaborate in more detail? I am facing the same issue, and when I checked I see java-json.jar in the Oozie shared lib path; however, I don't see it in the sqoop-client/lib path on the gateway.
04-27-2017
06:24 PM
Replication protects against DataNode failure. When a human deletes the data, it is lost wherever it resides, no matter how many nodes hold replicas. The deleted data is moved into the trash, and if needed we can get it back within a certain interval of time.
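The trash interval mentioned above is controlled by the `fs.trash.interval` property in `core-site.xml` (value in minutes); a minimal sketch, assuming a 24-hour retention window is desired:

```xml
<!-- core-site.xml: keep deleted files in .Trash for 24 hours (1440 minutes) -->
<property>
  <name>fs.trash.interval</name>
  <value>1440</value>
</property>
```

With trash enabled, a file removed with `hdfs dfs -rm` lands under the user's `.Trash` directory and can be moved back until the interval expires.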
04-25-2017
06:07 PM
Thanks, and yes, I can re-write it; however, I am looking for options to get it back. When I drop the table, the commit to the metastore occurs immediately, which might be why the Hive table schema cannot be recovered. Any other alternative options?
04-25-2017
05:31 PM
I have a Hive external table, and unfortunately the schema of the table got dropped; I want to get the schema back. Is there any way to recover it? I do understand that HDFS is a file system; however, I am trying to see if there are any possibilities.
Labels:
- Apache Hadoop
- Apache Hive
02-16-2017
03:15 AM
I found a solution to export this kind of data to any RDBMS in UTF-8 (or any other character set) by specifying the character set after the database/host name.
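For Teradata specifically, the character set can be appended to the JDBC connection URL after the host and database name; a sketch of what the connection string might look like (host and database names are placeholders):

```text
jdbc:teradata://td-host/DATABASE=mydb,CHARSET=UTF8
```

This URL would then be passed to Sqoop via `--connect`, so the driver transfers the data using the declared character set.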
02-15-2017
11:40 PM
Yes, it displays the special characters in a readable format after adding the serialization encoding property. However, while exporting the data to Teradata with a Sqoop statement using a connection manager, I get non-readable characters in Teradata. Attached is the screenshot (teradat.png). I suspect Sqoop is not recognizing the special characters correctly, or do I need to use any specific Teradata jars while exporting the data? I have attached the ingested data (after-ingestion-data-into-hadoop.png) and the data shown in Hive after adding the encoding property (after-adding-encoding-to-hive-table.png), whereas the same data does not look the same in Teradata. I would like to see the same characters in Teradata as well. Any help appreciated.
02-14-2017
07:47 PM
I have a requirement to handle files which contain special characters (like trademarks, non-UTF-8 characters, and so on).
Labels:
- Apache Hive
02-11-2017
12:21 AM
Could you let me know how to handle it if the data is not in quotes? Below is an example:

column1|column2
first|second|last

In the example above, first|second is actually one column. Could you let me know how to handle it if the data is not in quotes and the delimiter is part of the data? Any suggestion or help is appreciated.
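One common workaround when the data is not quoted is to escape embedded delimiters on write and configure the reader with the same escape character. A minimal Python sketch (the sample values and the backslash escape character are illustrative, not from the original data):

```python
import csv
import io

# Simulate a pipe-delimited record whose first field contains a pipe.
row = ["first|second", "last"]

# Write without quoting, escaping embedded pipes with a backslash.
buf = io.StringIO()
writer = csv.writer(buf, delimiter="|", quoting=csv.QUOTE_NONE, escapechar="\\")
writer.writerow(row)
print(buf.getvalue().strip())  # first\|second|last

# Read it back with the same escape character to recover both fields.
reader = csv.reader(io.StringIO(buf.getvalue()), delimiter="|",
                    quoting=csv.QUOTE_NONE, escapechar="\\")
print(next(reader))  # ['first|second', 'last']
```

The key point is that both sides must agree on the escape character; without quoting or escaping, a pipe inside a field is indistinguishable from a field separator.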
02-09-2017
03:03 AM
Thanks, and yes, the OpenCSV SerDe will do it. However, I am looking for any other alternatives.
02-08-2017
06:35 AM
In Hive, one of my columns contains a pipe ('|') as part of the data. However, while exporting data from this table, we need to use the pipe ('|') as the delimiter between fields. How do I handle the delimiter being part of the data while creating the flat file from the Hive table?
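One way to handle this outside Hive is to quote only the fields that contain the delimiter when producing the flat file, which is essentially what the OpenCSV SerDe does. A minimal Python sketch (the column values are illustrative):

```python
import csv
import io

rows = [
    ["a|b", "second"],   # first field contains the pipe delimiter
    ["plain", "value"],
]

buf = io.StringIO()
# QUOTE_MINIMAL quotes only the fields that contain the delimiter.
writer = csv.writer(buf, delimiter="|", quotechar='"', quoting=csv.QUOTE_MINIMAL)
writer.writerows(rows)
print(buf.getvalue())
# "a|b"|second
# plain|value

# A reader with the same delimiter and quote character recovers the fields.
reader = csv.reader(io.StringIO(buf.getvalue()), delimiter="|", quotechar='"')
print(list(reader))  # [['a|b', 'second'], ['plain', 'value']]
```

The trade-off versus escaping is that quoted output requires a quote-aware reader on the consuming side, so this only helps if the downstream system can parse quoted fields.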
Labels:
- Apache Hive