08-19-2018 04:00 PM
Hi, I need to write a Hive query result to a file on HDFS using "\u001c" (octal "\034") as the field delimiter, via Beeline. I can do this from the Hive CLI, but when I run the statement below through Beeline, the output file is full of junk characters. Is there a correct way to achieve this?

From Beeline (not working):

beeline -u "jdbc:hive2://master:10000/;principal=hive/master@DOMAIN.NET" -e "INSERT OVERWRITE DIRECTORY '<HDFS_Path>/data.dat' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\034' select * from emp;"

From Hive CLI (working fine):

hive> INSERT OVERWRITE DIRECTORY '<HDFS_Path>/data.dat' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\034' select * from emp;
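For anyone troubleshooting the same thing: \034 (the ASCII FS control character) is non-printable, so it can look like junk in an editor even when it was written correctly. A quick way to check which delimiter byte actually landed in the HDFS file is to dump the raw bytes with od, which prints FS as \034. A minimal sketch, using the placeholder path from the post above (the part file name under the output directory may differ):

# Dump the first bytes of the exported file; od -c renders the FS
# control character as \034, so you can confirm whether Beeline
# wrote the intended delimiter or something else.
hadoop fs -cat '<HDFS_Path>/data.dat/000000_0' | od -c | head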
08-13-2018 07:58 AM
Hi, I am trying to use the Unicode value \u001c as the delimiter for a Hive/Beeline output file, but I cannot get that special character written to the file as the delimiter. In the output file, only '\' is used as the delimiter, i.e. Beeline takes just the first character of '\u001c'. The command I am using is below. What exactly is the issue here, and is there a workaround?

Command:

beeline -u "jdbc:hive2://master:10000/;principal=hive/master@DOMAIN.NET" --silent=true --showHeader=false --outputformat=dsv --delimiterForDSV='\u001c' -e "select * from emp;" | hadoop fs -appendToFile - /<HDFS_Path>/data.dat

Note: a single-character delimiter (e.g. ~) works fine, but a multi-character value like '\u001c' does not work as expected.
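In case it helps: --delimiterForDSV expects a single character, so the literal six-character string '\u001c' is not interpreted as an escape sequence, which would explain why only the '\' is used. One possible workaround, sketched below under the assumption of a bash shell, is to let the shell expand the escape with ANSI-C quoting so that Beeline receives the actual FS byte rather than the literal text:

# bash's $'\x1c' expands to the single FS (0x1c) control character,
# so Beeline sees a one-character DSV delimiter instead of the
# literal text "\u001c". Connection string and paths are taken
# from the post above.
beeline -u "jdbc:hive2://master:10000/;principal=hive/master@DOMAIN.NET" \
        --silent=true --showHeader=false \
        --outputformat=dsv \
        --delimiterForDSV=$'\x1c' \
        -e "select * from emp;" \
  | hadoop fs -appendToFile - /<HDFS_Path>/data.dat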
Labels:
Apache Hive