Member since: 02-15-2016
Posts: 33
Kudos Received: 6
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 4722 | 01-18-2018 01:39 PM
 | 3911 | 07-06-2017 09:57 AM
 | 5364 | 05-24-2017 01:31 PM
07-23-2020
07:37 AM
We hit the same issue and resolved it as follows. The query failed with:

Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

Checking the task logs in more detail showed the underlying error: "Unexpected end of input stream".

First, get the HDFS LOCATION for the table by running the following in Hue or the Hive shell:

show create table <table-name>;

Then check that location for zero-byte files and remove them from HDFS:

hdfs dfs -rm -skipTrash $(hdfs dfs -ls -R <hdfs_location> | grep -v "^d" | awk '{if ($5 == 0) print $8}')

After that, the query ran successfully.
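The awk filter above keys off field 5 (the file size) of the recursive listing and prints field 8 (the path). A safe way to sanity-check it before wrapping it in `hdfs dfs -rm` is to run it against a sample of `hdfs dfs -ls -R`-style output; the listing and paths below are made up for illustration:

```shell
# Hypothetical sample of `hdfs dfs -ls -R` output; field 5 is the
# file size, field 8 is the path.
sample_listing='-rw-r--r--   3 hive hive          0 2020-07-01 10:00 /warehouse/t/part-00000
-rw-r--r--   3 hive hive       1024 2020-07-01 10:00 /warehouse/t/part-00001'

# grep -v "^d" drops directory entries; awk keeps only zero-byte files.
printf '%s\n' "$sample_listing" | grep -v '^d' | awk '{if ($5 == 0) print $8}'
# → /warehouse/t/part-00000
```

Running only the `ls | grep | awk` pipeline first also doubles as a dry run on a real cluster: it lists exactly what the subsequent `hdfs dfs -rm -skipTrash` would delete.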
07-03-2020
11:09 PM
This is an OS-related issue; Cloudera does not support all operating systems. Here is the list of OSes supported by Cloudera:

Red Hat Enterprise Linux (6 and 7)
Oracle Enterprise Linux (6 and 7)
CentOS (6 and 7)
SUSE Linux Enterprise Server 12
Ubuntu (16.04 LTS and 18.04 LTS)

Check your OS version with:

cat /etc/os-release
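If you need to script this check (for example, when auditing several hosts), the relevant fields of /etc/os-release can be pulled out with sed. A sample of the file's content is inlined here so the snippet is self-contained; on a real host you would read the file directly:

```shell
# Sample /etc/os-release content (inlined for illustration);
# on a real host use: os_release=$(cat /etc/os-release)
os_release='NAME="CentOS Linux"
VERSION_ID="7"'

# Extract the distro name and major version from the key="value" lines.
name=$(printf '%s\n' "$os_release" | sed -n 's/^NAME="\(.*\)"$/\1/p')
version=$(printf '%s\n' "$os_release" | sed -n 's/^VERSION_ID="\(.*\)"$/\1/p')
echo "$name $version"
# → CentOS Linux 7
```

The extracted values can then be compared against the supported list above.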
04-10-2019
02:34 AM
Though I don't know exactly how it works under the hood, I can confirm that it operates on the source DB side. (It will definitely NOT simply pull everything from the DB and then chop it up before writing to Hadoop.) If you are looking for the optimum value, you will likely need some trial and error. As a starting point, I understand the default is 1000, and you may want to try 10000 as a first step towards better performance.
05-06-2018
12:02 AM
When a Cloudera Enterprise license expires, the following occurs:

Cloudera Enterprise Data Hub Edition Trial - Enterprise features are no longer available.
Cloudera Enterprise - The Cloudera Manager Admin Console displays a banner indicating license expiration. Contact Cloudera Support to receive an updated license; in the meantime, all enterprise features remain available.

This has been documented here.
01-25-2018
11:50 PM
Yep, we used it for estimating the required memory. For now our block capacity is OK. Thank you for your help.
07-20-2017
10:21 AM
Thanks, Tristan! I had found that mistake and corrected it. I appreciate your response. Regards, MG
07-06-2017
09:57 AM
1 Kudo
Hi Everyone, not sure if anyone else has faced this issue, but after much research I was able to connect to Kerberized Hive successfully. I appended "-Djavax.security.auth.useSubjectCredsOnly=false" to the .jinit() call:

.jinit(classpath=cp, parameters="-Djavax.security.auth.useSubjectCredsOnly=false")

Setting this property to false removes the requirement that the GSS mechanism obtain its credentials only from an existing Subject; it is then allowed to acquire them from other sources (such as the Kerberos ticket cache), which is what lets the Kerberos authentication succeed here.
05-25-2017
12:39 PM
Hi Tristan, you are right that configuring it globally would have been much easier, but we have tenant-specific queues and want to keep them contained within their pools, which is why we needed an engine/project-specific setting. Anyway, thanks for your response. Regards, MG