Member since: 03-23-2015
Posts: 1288
Kudos Received: 114
Solutions: 98

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 3339 | 06-11-2020 02:45 PM |
|  | 5042 | 05-01-2020 12:23 AM |
|  | 2841 | 04-21-2020 03:38 PM |
|  | 3556 | 04-14-2020 12:26 AM |
|  | 2337 | 02-27-2020 05:51 PM |
10-17-2017
03:41 AM
Can you try to kinit at the beginning of your script? In R, e.g.: system("kinit -kt username.keytab username@REALM.COM") Did you generate the keytab yourself, or did you get it from an admin?
10-10-2017
11:13 AM
You should specify the field delimiter and line terminator when creating the table: ROW FORMAT DELIMITED FIELDS TERMINATED BY '\u0001' LINES TERMINATED BY '\n'
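A minimal sketch of how that clause fits into a full CREATE TABLE statement; the table and column names are hypothetical, and '\001' is the octal spelling of the same Ctrl-A delimiter:

```sql
-- Hypothetical table; the delimiters match the clause above.
CREATE TABLE events (
  event_id   BIGINT,
  event_name STRING
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\001'  -- Ctrl-A, Hive's default field separator
  LINES TERMINATED BY '\n'
STORED AS TEXTFILE;
```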
10-08-2017
02:52 AM
Hi, the full Hive documentation on table creation can be found here: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL I think that should be enough to get you started.
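For reference, a minimal starter DDL, just a sketch with hypothetical names:

```sql
-- Hypothetical example: a partitioned, comma-delimited text table.
CREATE TABLE page_views (
  user_id BIGINT,
  url     STRING
)
PARTITIONED BY (view_date STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;
```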
10-07-2017
09:08 AM
Thank you, everybody! I got it working with the command: oozie job -oozie http://oozieHost:11000/oozie -config ~/.job.properties -run
10-06-2017
07:58 PM
Sqoop eval and Sqoop list-tables need a connection to the SQL server only from the node where the command runs. A Sqoop import, however, launches map tasks across the cluster, so every node needs access to the remote SQL server. In our case, telnet to the SQL server failed from the datanodes because of network settings and firewall configuration. Once every node could reach the remote SQL server, the Sqoop import succeeded.
09-20-2017
04:53 AM
1 Kudo
Impala can't write gzip-compressed text files. Please refer to the documentation below: https://www.cloudera.com/documentation/enterprise/latest/topics/impala_file_formats.html It says: "For text format, if LZO compression is used, you must create the table and load data in Hive. If other kinds of compression are used, you must load data through LOAD DATA, Hive, or manually in HDFS." So the short answer is that you can't do it in Impala.
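A sketch of the Hive-side workaround; the table names (logs_gz, logs_raw) are hypothetical, and the Hive statements must run in Hive, not Impala:

```sql
-- In Hive: enable gzip output compression for this session.
SET hive.exec.compress.output=true;
SET mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.GzipCodec;

-- Hypothetical tables: write gzip-compressed text files.
CREATE TABLE logs_gz (line STRING) STORED AS TEXTFILE;
INSERT OVERWRITE TABLE logs_gz SELECT line FROM logs_raw;

-- Then in Impala: pick up the new table and query it read-only.
INVALIDATE METADATA logs_gz;
SELECT COUNT(*) FROM logs_gz;
```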
09-19-2017
04:34 AM
Thanks for the info. I will check out CDSW as well.
09-18-2017
04:03 AM
1 Kudo
Nah, nothing is stupid; these are all questions that lots of people will face one day. Glad I could be helpful here :).
09-18-2017
02:35 AM
I am not aware that Impala can do that; you would need to write custom code to convert the result into JSON.
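If a pure-SQL hack is acceptable instead of custom code, here is a crude sketch that builds JSON strings directly in the query (hypothetical users table; it does not escape quotes or handle NULLs):

```sql
-- Crude JSON-by-concatenation; fine for quick exports only.
SELECT concat('{"id":', CAST(id AS STRING), ',"name":"', name, '"}') AS json_row
FROM users;
```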
09-17-2017
11:52 PM
A Hive table consists of the following:
1. Metadata (the table and column definitions plus the HDFS location)
2. The actual data stored in HDFS

If you drop a managed table, both 1 and 2 are deleted. However, if you drop an external table, only 1 is deleted: the table reference is removed from Hive's backend database (SHOW TABLES will no longer return it and you can't query it any more), but the underlying files remain untouched on their HDFS path. To confirm this, check where the backend database is stored. If it is MySQL, simply log in and look the table up in TBLS to see whether it is still registered (this is a MySQL query against the metastore, not a Hive query): SELECT * FROM TBLS WHERE TBL_NAME = "{your_table_name}"; Hope the above helps.
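A small sketch of that behavior, with a hypothetical table name and HDFS path:

```sql
-- Hypothetical external table pointing at existing HDFS data.
CREATE EXTERNAL TABLE demo_ext (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/demo_ext';

-- Dropping it removes only the metastore entry;
-- the files under /data/demo_ext stay where they are.
DROP TABLE demo_ext;
```

After the DROP, hdfs dfs -ls /data/demo_ext should still list the files, and recreating the table with the same LOCATION makes the data queryable again.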