Member since: 05-30-2015
Posts: 58
Kudos Received: 2
Solutions: 6
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 4484 | 07-20-2018 02:00 AM
 | 6848 | 05-23-2017 06:35 PM
 | 1569 | 05-08-2017 09:08 PM
 | 7316 | 05-07-2017 12:08 AM
 | 2787 | 01-28-2016 10:53 PM
05-08-2017
09:07 PM
Thanks for the information, will upgrade as soon as possible. In the meantime I am still exploring how to set up the external DB for Hive, Oozie, and the monitoring databases.
05-07-2017
10:57 PM
Hi there, I would like to remove my cluster completely, but there is no delete option under the drop-down menu. Is there any way to force-delete the whole Cloudera installation? Thanks
Labels:
- Cloudera Manager
05-07-2017
12:08 AM
I think I found the folder on Ubuntu: it is under /usr/share/cmf/common_jars, and mysql-connector-java-5.1.15.jar is already there.
05-06-2017
06:37 PM
I have found the MySQL connector jar under /usr/share/java, but /usr/lib/hive/lib doesn't seem to exist on the system.
05-06-2017
05:41 AM
Dear all, I installed Cloudera 5.11 using the Path A installation with Hive and Spark. I am trying to set up an external database by following the guide for the metastore external database setup, but I couldn't find the path where the MySQL connector jar file should go. Has anyone faced the same issue before? https://www.cloudera.com/documentation/enterprise/5-8-x/topics/cdh_ig_hive_metastore_configure.html
Labels:
- Apache Hive
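For anyone landing on this thread later, the fix amounts to copying the connector jar into a directory on Hive's classpath. A minimal sketch, assuming the paths mentioned in this thread; the helper name and both default paths are hypothetical, not a Cloudera tool:

```shell
# hypothetical helper: search a root for a MySQL connector jar and copy it
# into Hive's lib directory (created if missing); both example paths are
# assumptions from this thread -- adjust for your own layout.
find_and_copy_connector() {
    local search_root="$1"   # e.g. /usr/share/java or /usr/share/cmf/common_jars
    local hive_lib="$2"      # e.g. /usr/lib/hive/lib
    local jar
    jar=$(find "$search_root" -name 'mysql-connector-java*.jar' 2>/dev/null | head -n1)
    [ -n "$jar" ] || { echo "no connector jar under $search_root" >&2; return 1; }
    mkdir -p "$hive_lib" && cp "$jar" "$hive_lib"/
}
# usage: find_and_copy_connector /usr/share/java /usr/lib/hive/lib
```

After copying, the Hive services need a restart so the metastore picks up the driver.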
04-04-2016
01:32 AM
Thanks bodivijay! I have used another way to import into Hive; I will try your tools when I have time to revisit the same issue!
02-28-2016
05:53 PM
Thanks Harsh J, will try that out!
02-28-2016
05:11 AM
Hi Harsh J, when I use the command

```
# head -n1 file.csv | od -c
```

it shows the following:

```
0000000 377 376 1 \0 \0 \0 \0 \0 \0 \0
0000020 \0 \0 \0 \0 \0 1 \0 6 \0 1 \0
0000040 8 \0 9 \0 7 \0 \0 \0 \0 \0 \0
0000060 \0 I \0 L \0 L \0 E \0 G \0 A \0 L \0
0000100 \0 D \0 U \0 M \0 P \0 \0 S \0 I \0
0000120 T \0 E \0 \0 ( \0 M \0 P \0 S \0 P \0
0000140 \0 W \0 A \0 T \0 C \0 H \0 \0 N \0
0000160 O \0 . \0 R \0 U \0 J \0 U \0 K \0 A \0
0000200 N \0 \0 1 \0 4 \0 / \0 0 \0 5 \0 / \0
0000220 1 \0 3 \0 3 \0 9 \0 ) \0 \0 A \0 d \0
0000240 a \0 l \0 a \0 h \0 \0 d \0 i \0 m \0
0000260 a \0 k \0 l \0 u \0 m \0 k \0 a \0 n \0
0000300 \0 b \0 a \0 h \0 a \0 w \0 a \0 \0
0000320 s \0 i \0 a \0 s \0 a \0 t \0 a \0 n \0
0000340 \0 M \0 P \0 S \0 P \0 \0 m \0 e \0
0000360 n \0 d \0 a \0 p \0 a \0 t \0 i \0 \0
0000400 k \0 a \0 w \0 a \0 s \0 a \0 n \0 \0
0000420 t \0 e \0 r \0 s \0 e \0 b \0 u \0 t \0
0000440 \0 m \0 e \0 r \0 u \0 p \0 a \0 k \0
0000460 a \0 n \0 \0 t \0 a \0 n \0 a \0 h \0
0000500 \0 h \0 a \0 k \0 \0 m \0 i \0 l \0
0000520 i \0 k \0 \0 p \0 e \0 r \0 s \0 e \0
0000540 n \0 d \0 i \0 r \0 i \0 a \0 n \0 . \0
0000560 \0 W \0 a \0 l \0 a \0 u \0 \0 b \0
0000600 a \0 g \0 a \0 i \0 m \0 a \0 n \0 a \0
0000620 p \0 u \0 n \0 , \0 \0 t \0 i \0 n \0
0000640 d \0 a \0 k \0 a \0 n \0 \0 t \0 e \0
0000660 l \0 a \0 h \0 \0 d \0 i \0 a \0 m \0
0000700 b \0 i \0 l \0 \0 d \0 e \0 n \0 g \0
0000720 a \0 n \0 \0 m \0 e \0 n \0 g \0 e \0
0000740 l \0 u \0 a \0 r \0 k \0 a \0 n \0 \0
0000760 s \0 a \0 t \0 u \0 \0 ( \0 1 \0 ) \0
0001000 \0 N \0 o \0 t \0 i \0 s \0 \0 d \0
0001020 i \0 \0 b \0 a \0 w \0 a \0 h \0 \0
0001040 U \0 n \0 d \0 a \0 n \0 g \0 - \0 U \0
0001060 n \0 \0 C \0 \0 \0 \0 \0 \0
0001100 \0 \0 \0 \0 \0 \0 H \0 A \0
0001120 F \0 I \0 Z \0 A \0 \0 \0 \0 \0
0001140 \0 \0 \0 \0 \0 \0 \0 \0 *
0001420 \0 \0 \0 2 \0 0 \0 1 \0 4 \0 - \0
0001440 0 \0 6 \0 - \0 0 \0 3 \0 \0 1 \0 6 \0
0001460 : \0 2 \0 5 \0 : \0 0 \0 0 \0 . \0 0 \0
0001500 0 \0 0 \0 \0 H \0 A \0 F \0 I \0 Z \0
0001520 A \0 \0 \0 \0 \0 \0 \0 \0
0001540 \0 \0 \0 \0 \0 \0 \0 \0 *
0002020 2 \0 0 \0 1 \0 4 \0 - \0 0 \0 6 \0 - \0
0002040 0 \0 5 \0 \0 1 \0 4 \0 : \0 2 \0 0 \0
0002060 : \0 4 \0 4 \0 . \0 0 \0 0 \0 0 \0 \r \0
0002100 \n
0002101
```

Thanks!
02-27-2016
10:36 PM
Hi Harsh J, I use the following commands:

```
create table if not exists test(a string, b string, c string, d string, e string) row format delimited fields terminated by '\t' escaped by '\\';
load data local inpath '/path/file.csv' into table testdraft --hive-drop-import-delims;
```

I have tried '\r', '\n', and '\001' in the "row format delimited fields terminated by" clause, but all give me the same result. When I open the CSV file in Notepad, the columns are separated by tabs; gedit recognises the tab as a space. When I select * from test in Hive, I see a lot of NULLs between the columns. I also tried replacing the field delimiter with '\N':

```
create table if not exists test(a string, b string, c string, d string, e string) row format delimited fields terminated by '\N';
```

Then the columns are split at those words with an "n" in them. I am pretty new to CSV, Hive, and databases, please shed some light. Thanks!
02-26-2016
04:44 AM
Hi all, I have created a table with the required columns in Hive, stored as textfile. I use load data local inpath to load the CSV file into the Hive table created above, but all the data goes into the first column and the rest of the columns are empty. I opened the CSV file in Notepad and Excel and everything seems to be in order. I have used --hive-drop-import-delims and --fields-terminated-by '\t', but the result is still the same. Has anyone tried this before?
Labels:
- Apache Hive
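A quick sanity check worth running before LOAD DATA is to count the fields per line using the same delimiter the table declares; if every line reports 1, Hive will put everything into the first column, which is exactly the symptom described above. A sketch with a stand-in file (the real CSV path is elided in the post):

```shell
# sketch: count tab-separated fields per line; a table declared with
# fields terminated by '\t' expects the same count as the column count.
printf 'a\tb\tc\td\te\n' > sample.csv          # stand-in for the real CSV
awk -F'\t' '{print NF}' sample.csv | sort -n | uniq -c
```

Note that --hive-drop-import-delims and --fields-terminated-by are Sqoop import flags; they have no effect inside a plain Hive LOAD DATA statement, which copies the file as-is.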