Member since: 09-10-2016
Posts: 82
Kudos Received: 6
Solutions: 9
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 6170 | 08-28-2019 11:07 AM |
 | 5772 | 12-21-2018 05:59 PM |
 | 2863 | 12-10-2018 05:16 PM |
 | 2424 | 12-10-2018 01:03 PM |
 | 1610 | 12-07-2018 08:04 AM |
04-07-2021
03:02 AM
Huge thanks. It works for me.
06-29-2020
10:21 PM
@AnjaliRocks, I have sent you a PM for further details.
06-15-2020
09:37 AM
From the JAAS file I see that debug=true was added; on the other hand, the debug output does not show up in the producer output, which means the JAAS file provided is not being picked up properly. If you check kafka-console-producer.sh you'll notice the lines below:

# check if kafka_jaas.conf in config, only enable client_kerberos_params in secure mode.
KAFKA_HOME="$(dirname $(cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd ))"
KAFKA_JAAS_CONF=$KAFKA_HOME/config/kafka_jaas.conf
if [ -f $KAFKA_JAAS_CONF ]; then
export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=$KAFKA_HOME/config/kafka_client_jaas.conf"
fi

Try editing kafka_client_jaas.conf, or you can also try exporting KAFKA_CLIENT_KERBEROS_PARAMS and see if that helps.

Regards,
Manuel.
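For illustration, a minimal sketch of the export approach; the JAAS file path, broker address, and topic below are placeholders, not values from this thread:

# Placeholder JAAS path; point this at the client JAAS file you edited.
export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/etc/kafka/conf/kafka_client_jaas.conf"

# With debug=true in the JAAS entry, Kerberos debug output should now show
# up in the producer output if the file is actually being picked up.
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh \
  --broker-list broker1.example.com:6667 \
  --topic test \
  --producer-property security.protocol=SASL_PLAINTEXT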
02-04-2020
06:05 AM
@P_ As this is an older post, you would have a better chance of receiving a resolution by starting a new thread. A new thread will also give you the opportunity to include details specific to your environment, which could help others provide a more accurate answer to your question.
01-05-2020
04:15 AM
Hi,
This seems to be a Kerberos authentication problem. Does this issue happen only for the SHS2 UI, or does it happen for other URLs too? You could try the following:
1. kinit username
2. kinit -kt service.keytab
3. send the keytab file along with the submit command
Thanks,
AKR
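For illustration, a rough sketch of those three steps on the command line; the principal names, keytab path, and application details are placeholders, not values from this thread:

# 1. Password-based ticket for a (placeholder) user principal
kinit username@EXAMPLE.COM

# 2. Keytab-based ticket for a (placeholder) service principal
kinit -kt /etc/security/keytabs/service.keytab service/host.example.com@EXAMPLE.COM
klist   # confirm a valid ticket was obtained

# 3. Pass the keytab and principal along with the submit command
spark-submit \
  --principal service/host.example.com@EXAMPLE.COM \
  --keytab /etc/security/keytabs/service.keytab \
  --class com.example.MyApp \
  myapp.jar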
12-17-2019
08:56 AM
Hello. I know this is a pretty old question, but, as I had the same problem, I wanted to share. You can use PURGE instead of this non-existent property; indeed, I cannot find that property in the official Hive documentation. You can use PURGE when you want to delete a table or a partition. More information there. Best regards.
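For example, a minimal sketch from the command line; the HiveServer2 URL, database, table, and partition names are hypothetical:

# Drop a whole table, bypassing the trash directory:
beeline -u "jdbc:hive2://hs2-host.example.com:10000/default" \
  -e "DROP TABLE IF EXISTS mydb.old_table PURGE;"

# Drop a single partition, also bypassing the trash:
beeline -u "jdbc:hive2://hs2-host.example.com:10000/default" \
  -e "ALTER TABLE mydb.events DROP IF EXISTS PARTITION (dt='2019-12-01') PURGE;"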
12-09-2019
11:07 PM
Hi,
Hope this document clarifies your doubts; it is a tuning guide for Apache Spark jobs: https://blog.cloudera.com/how-to-tune-your-apache-spark-jobs-part-2/
Thanks,
AK
11-08-2019
10:23 PM
@sampathkumar_ma - In HDP, Hive's execution engine only supports MapReduce and Tez. Running Hive on Spark is not currently supported in HDP.
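For instance, a quick way to switch between the supported engines from beeline; the connection URL is hypothetical, and only mr and tez are accepted values in HDP:

# Switch the current session to Tez (the other supported value is mr).
beeline -u "jdbc:hive2://hs2-host.example.com:10000/default" \
  -e "set hive.execution.engine=tez;"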
10-12-2019
10:10 PM
3 Kudos
Go to Ambari -> Hive -> Configs -> Advanced -> Custom hive-site, click Add Property, and insert the following into the window that opens:
hive.security.authorization.sqlstd.confwhitelist.append=mapred.compress.map.output
After saving, restart the Hive services. Then connect to beeline and set your parameter. I experienced a similar problem with the mapreduce.job.reduces parameter and this approach worked.
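As a sketch of the last step, with a hypothetical HiveServer2 URL; once the property is whitelisted and Hive is restarted, setting the parameter in a session should no longer be rejected:

# Set the now-whitelisted parameter for the current session.
beeline -u "jdbc:hive2://hs2-host.example.com:10000/default" \
  -e "set mapred.compress.map.output=true;"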
07-12-2019
06:38 PM
Hi @Shu - Thanks for your response. Is there any way we can enable DFS commands when Ranger authorization is enabled? As of now, dfs commands work in the Hive shell but not in beeline. Thank you.