Member since: 08-30-2017
Posts: 83
Kudos Received: 0
Solutions: 0
12-06-2017 01:52 PM
@Muhammed Yetginbal Hello, I'm facing the same issue after following the same tutorial mentioned in: https://hortonworks.com/tutorial/hadoop-tutorial-getting-started-with-hdp/section/4/. Did you find a solution to it? I'd be grateful if you could help me.
12-06-2017 01:25 PM
@Artem Ervits @grajagopal @Geoffrey Shelton Okot Hello, I'm facing the same issue, but by following the tutorial at: https://hortonworks.com/tutorial/hadoop-tutorial-getting-started-with-hdp/section/4/. When I execute my Pig script, it stays stuck in RUNNING status, as shown in status.png. In the RM UI, my application is also stuck in RUNNING status (rm-application.png), and I've attached the launched MapReduce job in mr-job.png. From the Pig View log, I got hive-log.png. How can I resolve this? I'd be really grateful if you could help me.
12-06-2017 11:56 AM
@Mike Vogt @Lester Martin @Rafael Coss Hello, I'm facing the same issue, but by following the tutorial at: https://hortonworks.com/tutorial/hadoop-tutorial-getting-started-with-hdp/section/4/. When I execute my Pig script, it stays stuck in RUNNING status, as shown in status.png. In the RM UI, my application is also stuck in RUNNING status (rm-application.png), and I've attached the launched MapReduce job in mr-job.png. From the Pig View log, I got hive-log.png. How can I resolve this? I'd be really grateful if you could help me.
12-05-2017 03:56 PM
@Mike Bit Thank you for your reply. I've tried many of the solutions suggested on the forums, but without any positive result. I'll keep you informed of any updates.
12-05-2017 03:46 PM
@Mike Bit
Did you find a solution to this problem? I'm facing the same issue.
12-05-2017 02:51 PM
@Mike Bit Hello, I'm facing the same issue and I've described it in this question: https://community.hortonworks.com/questions/148295/how-to-resolve-pig-error-file-not-found-exception.html. How did you resolve yours? I really need your help; I'm stuck and haven't found a working solution yet. I'd be grateful if you could help me.
11-30-2017 09:30 AM
@Edgar Orendain I've set "yarn.resourcemanager.max-completed-applications" to 10 and restarted my ambari-server, but I still see the same total of YARN applications in the RM UI. Is there a step I've missed? I'd be grateful if you could keep helping me fix this issue.
11-29-2017 09:23 AM
@Constantin Stanca @Dinesh Chitlangia Hello, I'm facing the same issue and I've described it in this question: https://community.hortonworks.com/questions/148642/how-can-i-remove-yarn-pplications-from-ambari-serv.html. I've set "yarn.resourcemanager.max-completed-applications" to 10 and restarted my ambari-server, but I still see the same total of YARN applications in the RM UI. Is there a step I've missed? I'd be grateful if you could help me fix this issue.
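For reference, the property being discussed belongs in yarn-site; a sketch of the change (the value 10 simply mirrors the posts above, not a recommendation). It caps how many completed applications the ResourceManager keeps in memory and shows in its UI:

```xml
<!-- yarn-site.xml: limits how many COMPLETED applications the
     ResourceManager retains (and therefore lists in the RM UI).
     The value 10 is just an example. -->
<property>
  <name>yarn.resourcemanager.max-completed-applications</name>
  <value>10</value>
</property>
```

One thing worth noting: since this is a ResourceManager setting, the ResourceManager itself has to be restarted after the change; restarting only ambari-server leaves the old value in effect, which could explain an unchanged application count in the RM UI.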
11-28-2017 04:50 PM
@Edgar Orendain Thank you for your reply. I'll check the links above and I'll give you my feedback.
11-28-2017 02:34 PM
Hello, I need to remove some YARN applications from the ResourceManager UI. As shown in rm-ui.png, I have 101 applications. How can I keep the recent ones and delete the rest? I'd be really grateful if someone could help me resolve this issue.
11-23-2017 09:36 AM
Hello, I'm trying to run a script through Pig View. I've attached my script details in script.png. When I run the script, the job fails to start, as mentioned in script.png. The error from Pig View is detailed in error.png. I've also attached the error from /var/log/ambari-server/pig-view/pig-view.log in pig-log.png and the error from /var/log/hadoop-yarn/yarn/yarn-yarn-resourcemanager-ambari.log in rm-log.png. PS: my cluster is kerberized, my Ambari version is 2.5.1.0, and my HDP version is 2.6.1.0-129. How can I resolve this issue? I'd be really grateful if someone could help me.
11-22-2017 03:15 PM
Hello, I'm facing the same issue. @Amit Sharma @Stefan Schuster How did you go about resolving it? I've explained my issue in detail here: https://community.hortonworks.com/questions/148295/how-to-resolve-pig-error-file-not-found-exception.html. When I run the script, no job is launched in /user/admin/pig/jobs on HDFS. What could be the cause? Please help me resolve this; I'm really stuck.
11-22-2017 02:27 PM
@Jay Kumar SenSharma I checked the link and verified that cache is set to false in app.js. So what could be the cause of my issue?
11-22-2017 01:51 PM
@Jay Kumar SenSharma Thank you for your quick reply. My Ambari version is 2.5.1.0. I'll check the link above and give you my feedback.
11-22-2017 01:32 PM
Hello, I'm trying to run a script through Pig View. I've attached my script details in script.png. When I run this script, I get the error shown in error.png, taken from /var/log/ambari-server/pig-view/pig-view.log. The directory /user/admin/pig/jobs/ exists on HDFS and I've set permissions 777 on it, as shown in hdfs.png. I log in to the Ambari UI as admin. How can I resolve this issue? I'd be really grateful if someone could help me.
11-22-2017 10:30 AM
@Jay Kumar SenSharma Thank you very much. Your suggestion worked for me.
11-22-2017 08:39 AM
@Jay Kumar SenSharma Thank you very much for your reply.
- My Ambari version is 2.5.1.0.
- My Ambari server is running as the root user, so I re-secured my cluster with the principal "root@ROSAFI.COM" and made all the necessary configurations by following this link: https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.1.0/bk_ambari-security/content/set_up_kerberos_for_ambari_server.html. The contents of /etc/ambari-server/conf/krb5JAASLogin.conf are:

    com.sun.security.jgss.krb5.initiate {
        com.sun.security.auth.module.Krb5LoginModule required
        renewTGT=false
        doNotPrompt=true
        useKeyTab=true
        keyTab="/etc/security/keytabs/root.server.keytab"
        principal="root@ROSAFI.COM"
        storeKey=true
        useTicketCache=false;
    };

- I've also made the necessary configuration in Pig View, as shown in pig-auth.png.
- I've set "hadoop.proxyuser" in HDFS, as shown in proxyuser.png.
I'm really stuck. Please help me understand the cause of this issue.
11-21-2017 04:44 PM
Hello, I need to test a script through Pig View on my kerberized cluster. I've attached my script details in script.png. When I run this script, I get the error shown in error.png, taken from /var/log/ambari-server/pig-view/pig-view.log. I log in to the Ambari UI as admin. How can I resolve this issue? I'd be really grateful if someone could help me.
11-20-2017 09:40 AM
@Robert Levas Thank you very much for your reply.
- I'm working on a local domain, so I don't have an FQDN. My cluster is composed of two hosts whose hostnames are ambari and ambari-slave1. All Ambari principals were created automatically when I enabled Kerberos on my cluster, as shown in principals.png.
- For Hue, I created a hue principal manually with this method, where $FQDN is the host name of the Hue server and EXAMPLE.COM is the Hadoop realm:

    # kadmin.local
    kadmin.local: addprinc -randkey hue/$FQDN@EXAMPLE.COM

My Hue server is running as the hue user. In my case, $FQDN is ambari (the Hue server's host name) and ROSAFI.COM is the Hadoop realm.
My question is: is an FQDN necessary for enabling Kerberos? Is that the major cause of my issue?
11-17-2017 01:37 PM
@Robert Levas Hello, I'm facing a similar issue with Hue. I've explained my question in detail here: https://community.hortonworks.com/questions/147826/failed-to-access-filesystem-root-through-hue-ui.html. The output of klist as the hue user is shown in klist.png. Could you please help me resolve this issue? I'd be really grateful.
11-17-2017 08:38 AM
Hello @Vandana K R I'm facing the same issue. I can't access any web interface (NameNode UI, Oozie UI, ResourceManager UI, ...) after enabling Kerberos on my cluster. I've explained my problem in detail here: https://community.hortonworks.com/questions/147990/my-namenode-ui-is-not-accessible-on-my-kerbereized.html. I didn't understand what was meant by "the computer who runs the browser needs to be in a trusted kerberos realm". What configuration is necessary to satisfy that requirement? How did you resolve this issue? I'd be really grateful if you could help me.
11-16-2017 04:35 PM
Hello, I've successfully enabled Kerberos on my cluster and all the services are running correctly. But when I try to access my NameNode UI by typing http://ambari:50070 in my browser, I get the error shown in error.png.
- My Ambari server is running as the root user, so I re-secured my cluster with root@MYREALM and made all the necessary configurations by following this link: https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.1.0/bk_ambari-security/content/set_up_kerberos_for_ambari_server.html.
- I've also enabled SPNEGO authentication for Hadoop by following these links: https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.1.1/bk_Ambari_Security_Guide/content/_configuring_http_authentication_for_HDFS_YARN_MapReduce2_HBase_Oozie_Falcon_and_Storm.html and https://developer.ibm.com/hadoop/2016/09/28/securing-hadoop-user-interfaces-kerberos-enable/.
- I'm working on a local domain, so I don't have an FQDN. My cluster is composed of two hosts whose hostnames are ambari and ambari-slave1. Hence, I didn't set hadoop.http.authentication.cookie.domain in Custom core-site.
- I've enabled browser access to the SPNEGO-enabled web UIs by following this link: https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.0.0/bk_ambari-security/content/enabling_browser_access_spnego_web_ui.html. At this step, I skipped sub-steps 1, 2, and 3 because my Kerberos is already installed and automatically configured. Sub-step 4 is done, as shown in step.png.
My questions are:
- Which steps might I have missed?
- How can I resolve this issue?
I'm really stuck at this final step of securing my cluster. I'd be grateful if someone could help me.
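For anyone comparing notes: the SPNEGO pieces those guides configure in core-site typically look like the sketch below (the realm, keytab path, and values here are illustrative, not read from this cluster):

```xml
<!-- core-site.xml (illustrative values): Kerberos/SPNEGO
     authentication for the Hadoop web UIs -->
<property>
  <name>hadoop.http.filter.initializers</name>
  <value>org.apache.hadoop.security.AuthenticationFilterInitializer</value>
</property>
<property>
  <name>hadoop.http.authentication.type</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.http.authentication.simple.anonymous.allowed</name>
  <value>false</value>
</property>
<property>
  <name>hadoop.http.authentication.kerberos.principal</name>
  <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hadoop.http.authentication.kerberos.keytab</name>
  <value>/etc/security/keytabs/spnego.service.keytab</value>
</property>
```

Separately, the machine running the browser needs a valid Kerberos ticket for the same realm, and the browser must be told to negotiate with those hosts (in Firefox, via network.negotiate-auth.trusted-uris); without that client-side piece, the UI returns an authentication error even when the server side is correct.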
11-15-2017 08:28 AM
@Geoffrey Shelton Okot Sorry for the delay. My HDP cluster is 2.6.1 and is kerberized. I've also installed Ranger and Ranger KMS on it. My OS is Ubuntu 14.04. I've explained my question in detail here: https://community.hortonworks.com/questions/147826/failed-to-access-filesystem-root-through-hue-ui.html
11-14-2017 03:04 PM
@Geoffrey Shelton Okot I've already checked it and made the necessary changes to configure Hue with Kerberos, but I still get the same errors. I'm really stuck. How can I get Hue to authenticate with my cluster components?
11-14-2017 02:54 PM
Hello, I've installed Hue on my cluster. When I access my Hue UI, I get the errors shown in "errors.png". Hue is running through supervisor as the hue user, as shown in "hue.png". It was running correctly until I enabled Kerberos on my cluster. I've made the necessary configurations for Hue with Kerberos by following these two links: https://www.cloudera.com/documentation/enterprise/5-6-x/topics/cdh_sg_hue_kerberos_config.html and https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.2/bk_security/content/kerb-config-hue.html. I've also enabled Ranger and Ranger KMS on my cluster. In /var/log/supervisor/hue.log, I get the error shown in "kerberos_log.png". I was able to access the HDFS root file system manually as the hue user, as shown in "hdfs_access.png". My questions are: 1- How can the hue user access the HDFS root file system on a kerberized cluster through the Hue UI? 2- How can Hue authenticate with my cluster components? I'd be really grateful if someone could help me resolve this issue.
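In case it helps others comparing configurations: the Hue-side settings those guides describe live in hue.ini under a [[kerberos]] block; a sketch with illustrative values (the keytab path and kinit path are assumptions, and the principal must match the one actually created for the Hue host):

```ini
# hue.ini (illustrative values): Kerberos settings for a
# kerberized cluster, per the CDH/HDP guides linked above
[desktop]
  [[kerberos]]
    hue_keytab=/etc/security/keytabs/hue.service.keytab
    hue_principal=hue/ambari@ROSAFI.COM
    kinit_path=/usr/bin/kinit
```

Hue uses these to obtain and renew its own ticket, which is separate from any ticket grabbed manually as the hue user on the command line.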
11-14-2017 08:13 AM
@Geoffrey Shelton Okot Hello, the output of klist as the hue user is:

    Ticket cache: FILE:/tmp/krb5cc_1019
    Default principal: hue/ambari@ROSAFI.COM
    Valid starting       Expires              Service principal
    11/13/17 14:17:25    11/14/17 00:17:25    krbtgt/ROSAFI.COM@ROSAFI.COM
            renew until 11/20/17 14:17:25

I've tried to grab a Kerberos ticket as suggested above, but I still get the same error when I access my Hue UI.
11-13-2017 02:54 PM
Hello, I've enabled Kerberos on my cluster, and I've installed Hue on it. Hue was running correctly until Kerberos was enabled. I've configured Kerberos for Hue by following this link: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.2/bk_security/content/kerb-config-hue.html. When I access my Hue UI and navigate to Pig, Hive, etc., I get the errors shown in "error1.png" and "error2.png". How can I fix this issue?
11-08-2017 03:53 PM
Hello, I need to get JSON responses for specific services through the Ambari API. For example, when I call:

    curl -u admin:admin http://$AMBARI_SERVER:8080/api/v1/hosts

I get the following JSON response:

    {
      "href" : "http://xxx.xxx.xxx.xxx:8080/api/v1/hosts",
      "items" : [
        {
          "href" : "http://xxx.xxx.xxx.xxx:8080/api/v1/hosts/ambari",
          "Hosts" : {
            "cluster_name" : "cluster1",
            "host_name" : "ambari"
          }
        }
      ]
    }

However, for the following call, which adds a host to the cluster:

    curl --user admin:admin -i -H 'X-Requested-By: ambari' -X POST http://$AMBARI_SERVER:8080/api/v1/clusters/cluster1/hosts/ambari

I only get the status code of the request. Is there a way to extract a JSON response? Is there a parameter to add to the curl command to get one? I'd be grateful if someone could help me resolve this issue.
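A generic curl technique that may help here: ask curl to append the HTTP status code after whatever body the server returns, then split the two apart. A sketch reusing the endpoint from the post ($AMBARI_SERVER stays a placeholder; the function name is made up for illustration):

```shell
# Sketch: POST to the Ambari API and report both the response body and
# the HTTP status. curl's -w '%{http_code}' appends the status code
# after the body, so even a bodiless response still tells you what
# happened.
ambari_post() {
  local url=$1
  local response status body
  response=$(curl -s -w '\n%{http_code}' -u admin:admin \
    -H 'X-Requested-By: ambari' -X POST "$url")
  status=${response##*$'\n'}   # text after the last newline: the status code
  body=${response%$'\n'*}      # text before it: the JSON body, if any was sent
  echo "HTTP $status"
  if [ -n "$body" ]; then
    echo "$body"
  fi
}

# Usage (against a real cluster):
# ambari_post "http://$AMBARI_SERVER:8080/api/v1/clusters/cluster1/hosts/ambari"
```

Note that some Ambari POSTs legitimately return an empty body with just a 201 Created status; in that case there is no JSON to extract, and the status code is the whole answer.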
10-31-2017 01:40 PM
@vperiasamy Thank you for your reply. I skipped the first sub-step related to AES-NI CPU support and proceeded with the second sub-step for libcrypto. When I run hadoop checknative, I get the same output as mentioned in the tutorial, so everything is working correctly.
10-31-2017 11:02 AM
Hello, I'm trying to configure and use HDFS data-at-rest encryption. I'm stuck at the step of preparing the environment, as described in the following link: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.2/bk_security/content/AES-NI-CPU-support.html. When I run cat /proc/cpuinfo | grep aes, I get nothing back, so AES-NI is not enabled on my CPU. My questions are:
1- Is it possible to skip all the steps in the "Prepare the environment" section and proceed to the next step, "Create an Encryption Key"? If not:
2- How can I enable AES-NI on my CPU for the first sub-step?
3- How can I install the libcrypto.so version mentioned in the second sub-step on my Ubuntu 14.04?
I'd be grateful if someone could help me.
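On question 1-: AES-NI is a hardware acceleration for AES, so its absence costs performance rather than functionality; HDFS encryption itself still works without it. The cpuinfo check can be wrapped in a small function (a sketch; the optional file argument exists only so the logic can be tried on sample input, and hadoop checknative is left commented since it needs a cluster node):

```shell
# Check whether the CPU advertises the AES-NI instruction set by
# looking for the "aes" flag. Defaults to the real /proc/cpuinfo.
has_aesni() {
  grep -qw aes "${1:-/proc/cpuinfo}"
}

if has_aesni; then
  echo "AES-NI present: hardware-accelerated AES is available"
else
  echo "AES-NI absent: HDFS encryption still works, only slower"
fi

# Then confirm which native/crypto libraries Hadoop actually loads:
# hadoop checknative -a
```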