Member since: 10-10-2018
Posts: 121
Kudos Received: 4
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 766 | 02-17-2022 07:22 AM
02-17-2022 07:22 AM
Hi Yang, I would suggest opening a support ticket for this one. Regards, Ariel Q.
01-13-2022 05:03 PM
Thanks, this works for me on HDP 3.1.4 with Presto 340.
12-15-2020 10:34 AM
Hello,

We have exactly the same issue. First, be aware that INSERT OVERWRITE does not create the folder of your Hive table. Your folder was created by the "CREATE TABLE ... LOCATION 'hdfs://titan/dev/10112/app/TC30/dataiku/CONFIG_ANOTHER_TEST/output'" statement. You are using an old version of DSS (Data Science Studio) from Dataiku; we are on the 5.0.3 DSS version, while at this time 8.0.4 is the latest release.

As you said, "the vendor application first delete the folder then do the insert overwrite on the external table". That is true. But you have to know that Hive commands such as INSERT OVERWRITE or ANALYZE TABLE ... COMPUTE STATISTICS create a temporary staging folder. Look at this line of your log:

Loading data to table dev_tc30_dataiku.config_another_test_output from hdfs://titan/tmp/.hive-staging_hive_2018-11-21_10-45-41_452_43360044430205414-24417/-ext-10000

A .hive-staging... temporary folder was created under the /tmp directory. Go into Ambari and look at the Hive parameter hive.exec.stagingdir. I am sure its current value is /tmp/.hive-staging. Please change it back to the default value, .hive-staging, and test again using DSS. Hive will then create the temporary folder under hdfs://titan/dev/10112/app/TC30/dataiku/CONFIG_ANOTHER_TEST/output/.hive-staging... Of course, this .hive-staging... directory is temporary, but the important point is that the output folder is recreated, and so the INSERT OVERWRITE succeeds.

We hit the same problem because we upgraded our HDP cluster from release 2.6.1 to 2.6.5, and we had to roll the hive.exec.stagingdir parameter back from /tmp/.hive-staging to .hive-staging.

Your ticket was created on 11-21-2018, and at that date the latest release of DSS was 5.0.3. Please note that Dataiku has changed this behaviour in later releases, and I don't think the Hive folder is always deleted now.

Regards,
Gilles
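For reference, a minimal sketch of what that setting looks like in hive-site.xml (the value shown is the stock default; on an Ambari-managed cluster, change it through Ambari rather than editing the file by hand):

```xml
<!-- With the default relative value, Hive creates its staging
     directory under the table's own LOCATION, so INSERT OVERWRITE
     recreates the output folder even if an application deleted it. -->
<property>
  <name>hive.exec.stagingdir</name>
  <value>.hive-staging</value>
</property>
```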
01-05-2020 06:57 AM
Hi, Have you tried disabling SPNEGO authentication in the configuration properties and restarting the service? Thanks, AKR
10-12-2018 02:44 PM
This is complex. I believe your problem is that you need to forward the traffic to/from the KDC to your Mac. You can do this by SSH tunnelling. That alone is not enough, though, since SSH port forwarding is only fit for TCP traffic, while KDC traffic is UDP by default.
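As a sketch of that workaround (the hostnames, realm, and local port below are placeholders, not values from the original thread): open a TCP tunnel to the KDC with something like `ssh -N -L 8888:kdc.internal.example.com:88 user@gateway.example.com`, then force the Kerberos client onto TCP and point it at the local end of the tunnel in /etc/krb5.conf:

```
[libdefaults]
    # Requests larger than this limit use TCP; setting it to 1
    # effectively forces TCP, which SSH port forwarding can carry.
    udp_preference_limit = 1

[realms]
    EXAMPLE.COM = {
        # Local end of the SSH tunnel to the real KDC on port 88.
        kdc = localhost:8888
    }
```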
11-13-2018 06:33 PM
I think you need to delete those files as well:

[root@centos10 krb5kdc]# ll
total 28
-rw------- 1 root root 29 Nov 13 09:36 kadm5.acl
-rw------- 1 root root 29 Nov 13 09:24 kadm5.acl.rpmsave
-rw------- 1 root root 29 Nov 13 09:36 kadm5.acly
-rw------- 1 root root 448 Nov 13 09:35 kdc.conf
-rw------- 1 root root 448 Nov 13 09:24 kdc.conf.rpmsave
-rw------- 1 root root 8192 Nov 13 09:27 principal <<<<<<<<<<<<<<<<<
-rw------- 1 root root 0 Nov 13 09:37 principal.ok <<<<<<<<<<<<<<<<<

Then it works:

[root@centos10 ~]# /usr/sbin/kdb5_util create -r BEER.LOC -s
Loading random data
Initializing database '/var/kerberos/krb5kdc/principal' for realm 'BEER.LOC',
master key name 'K/M@BEER.LOC'
You will be prompted for the database Master Password.
It is important that you NOT FORGET this password.
Enter KDC database master key:
Re-enter KDC database master key to verify:
[root@centos10 ~]#