Member since: 02-12-2016
Posts: 37
Kudos Received: 6
Solutions: 3

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 28246 | 08-08-2016 08:35 AM
 | 5234 | 04-06-2016 03:29 PM
 | 31219 | 02-25-2016 07:25 AM
01-04-2019 10:25 AM
Thanks @cjervis for the information. I get the same thing with Ubuntu 18. https://www.cloudera.com/documentation/enterprise/5-14-x/topics/cdh_ig_cdh5_install.html
08-08-2016 08:35 AM
1 Kudo
It works!! I found the workaround. After rereading the log, the Python error seemed odd, so I looked into it again. The zookeeper user had uid 167 but gid 153. I changed the gid to 167 in /etc/passwd and /etc/group, and the DB test now passes, so I could continue the setup. I don't know why the original setup was wrong; maybe a bug in the Red Hat package for the 5.7.1 version. In any case, it works: I finished my setup and my cluster is now live! Next I'm going to tune and optimize the configuration. Maybe this post can help somebody. Regards,
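For reference, here is a small sketch of the check I did, plus a cleaner way to fix the mismatch with groupmod/usermod instead of hand-editing /etc/passwd and /etc/group. The uid/gid values 167/153 are from my host, and the helper function name is my own:

```shell
#!/bin/sh
# Print the uid and gid of an account and warn when they differ.
check_uid_gid() {
    entry=$(getent passwd "$1") || { echo "no such user: $1"; return 1; }
    uid=$(printf '%s' "$entry" | cut -d: -f3)
    gid=$(printf '%s' "$entry" | cut -d: -f4)
    echo "uid=$uid gid=$gid"
    [ "$uid" = "$gid" ] || echo "WARNING: uid/gid mismatch for $1"
}

# On my broken host this printed "uid=167 gid=153" for zookeeper.
check_uid_gid zookeeper || true

# The fix, done with the proper tools (needs root; equivalent to my
# manual /etc/passwd and /etc/group edit):
#   groupmod -g 167 zookeeper
#   usermod  -g 167 zookeeper
```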
04-06-2016 03:29 PM
Hi tseader, sorry I wasn't available! Update: it works. The problem was the "Dynamic Resource Pool". I created a resource pool for my username, and now the job starts and runs. It works differently from our Cloudera 4 setup. The job now runs the Sqoop and Hive steps and terminates successfully. Great news! But it is very slow for a small table import; I think something must be tuned in the Dynamic Resource Pool or the YARN settings to use more resources, because during the job the CPU/memory usage of my 2 DataNodes was very low. Maybe you can give me some information on how to calculate the maximum number of containers possible? To answer your questions:
- Yes, Sqoop was working on its own.
- Yes, our analysts use <args>, because in CDH4 with <command> there were errors with some special characters.
- Yes, sqoop/oozie/hive all work now. We will try Impala next.
- No, we haven't tried creating a workflow from Hue. I will check with our devs about that.
- No, I didn't try with another database.
As you suspected, the problem came not from the workflow but from the configuration. I'm new to Cloudera/Hadoop, so I'm learning, and I'm discovering the configuration over time. Now I have to find the best configuration to make better use of our DataNodes. Thanks again tseader!
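On the container question: I don't have an official formula, but the usual back-of-envelope rule is containers per node = min(vcores available to YARN, node memory available to YARN divided by the container size). A sketch of that arithmetic, where every number below is an assumed example value, not something read from this cluster:

```shell
#!/bin/sh
# Rough max-container estimate. All values here are assumptions to
# illustrate the arithmetic; read the real ones from your YARN config
# (yarn.nodemanager.resource.memory-mb, yarn.nodemanager.resource.cpu-vcores,
#  yarn.scheduler.minimum-allocation-mb).
node_mem_mb=16384       # memory YARN may use per DataNode
node_vcores=8           # vcores YARN may use per DataNode
container_mem_mb=2048   # memory allocated per container
datanodes=2

by_mem=$((node_mem_mb / container_mem_mb))
per_node=$(( by_mem < node_vcores ? by_mem : node_vcores ))
total=$((per_node * datanodes))
echo "roughly $total containers across the cluster"
```

If the job still leaves the DataNodes idle with a sane container count, the limit is usually the pool's share in the Dynamic Resource Pool settings rather than the node capacity.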
02-25-2016 07:25 AM
5 Kudos
It works! As we can see in the output log, the HADOOP_CLASSPATH variable contains no path to the libs in the Hive directory. I first tried adding the Hive folder itself to HADOOP_CLASSPATH, but that didn't work. The solution is to add the folder with /* so that all the jars are picked up. So I added this line to .bash_profile: export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/lib/hive/lib/* Then: source ~/.bash_profile And now it works: data was imported into Hive! Now we can continue our labs with Cloudera 5! Thanks!
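The same fix as a small copy/paste-able script. The path is from my CDH 5 layout, so adjust /usr/lib/hive/lib if yours differs. One detail worth noting: quote the /* in scripts, because the wildcard should reach the JVM unexpanded (the JVM expands classpath wildcards itself at startup), not be globbed by the shell:

```shell
#!/bin/sh
# Append all Hive jars to HADOOP_CLASSPATH. The quotes keep the shell
# from globbing /usr/lib/hive/lib/* -- the JVM expands it at startup.
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:/usr/lib/hive/lib/*"

# Quick check that the entry made it in:
echo "$HADOOP_CLASSPATH" | tr ':' '\n' | grep 'hive/lib'
```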