Member since: 04-12-2019
Posts: 105
Kudos Received: 3
Solutions: 7

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 695 | 05-28-2019 07:41 AM |
| | 423 | 05-28-2019 06:49 AM |
| | 395 | 12-20-2018 10:54 AM |
| | 295 | 06-27-2018 09:05 AM |
| | 1398 | 06-27-2018 09:02 AM |
06-27-2018
09:02 AM
Hi, I have configured SSSD with the AD server, and now I'm able to run queries. Thanks! Let me know if anyone has any questions.
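For anyone hitting the same "User not found" error, a quick way to confirm that SSSD is actually resolving AD users on the YARN hosts is to look the user up through NSS. A minimal sketch; the user name asif is just the example from this thread, substitute your own AD account:

```bash
# Run on every NodeManager/ResourceManager host after configuring SSSD.
# If these return the AD user, YARN's container executor can find it too.
id asif                 # should print uid/gid pulled from AD via sssd
getent passwd asif      # should return a passwd entry resolved by sssd

# If the lookups fail, clear the SSSD cache and restart the service.
sss_cache -E
systemctl restart sssd
```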
06-14-2018
05:32 AM
Hi @Vinicius Higa Murakami No, the asif user does not exist on any NodeManager/ResourceManager machines. I ran ambari-server setup-ldap against AD, and then set up a one-way trust from the MIT KDC to AD. I believed I didn't need to set up the sssd service. I have defined the asif user in a Ranger policy allowing use of all queues. When I tried adding the user to the OS of the ResourceManager, I was able to execute queries as asif, but I don't want to add every user to the ResourceManager's OS.
06-13-2018
11:53 AM
Hi folks, I have a 3-node cluster where I have configured Kerberos and Ranger integrated with Active Directory, and I have set up a one-way trust with an MIT KDC. Now, when I run a Hive query, I get the logs below:
Diagnostics: Application application_1528875723692_0001 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is asif
main : requested yarn user is asif
User asif not found
If I create the user on the OS of the server where YARN is installed, I'm able to run the query, but I don't want to create users on the OS because I have set this up with AD. How does YARN learn about AD users?
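In a Kerberized cluster the LinuxContainerExecutor launches containers as the submitting user, so that user has to be resolvable on the OS of every NodeManager (via SSSD or similar), not just defined in Ranger. A rough way to see where the lookup fails; the host names below are placeholders, not from this thread:

```bash
# Hypothetical host list; replace with your NodeManager/ResourceManager hosts.
for host in nm1.example.com nm2.example.com nm3.example.com; do
  echo "== $host =="
  # The AD user must resolve here, otherwise container launch fails
  # with "User ... not found".
  ssh "$host" "getent passwd asif || echo 'asif NOT resolvable on this host'"
done
```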
06-13-2018
08:13 AM
Hi folks, I have a 3-node cluster where I have configured Kerberos and Ranger integrated with Active Directory, and I have set up a one-way trust with an MIT KDC. Users are able to get tickets. Now, when I run a Hive query, the query fails with the logs below:
Application application_1528875723692_0001 failed 2 times due to AM Container for appattempt_1528875723692_0001_000002 exited with exitCode: -1000
For more detailed output, check the application tracking page: http://security-test3.example.com:8088/cluster/app/application_1528875723692_0001 Then click on links to logs of each attempt.
Diagnostics: Application application_1528875723692_0001 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is asif
main : requested yarn user is asif
User asif not found
Failing this attempt. Failing the application.
Please advise what I can do to solve this. Regards, Vinay
06-13-2018
06:49 AM
@Bhanu Pamu I'm using CentOS 7.3. No, I don't have any user on the OS with the same name; I'm using AD users. Yes, I created the asif user's directory in /user. I'm running the query on an internal Hive table.
06-13-2018
06:24 AM
@Eric Leme According to you, we have to open a connection from our DataNodes to the DB server; the edge node is not a single pane of glass for communication. I need to do more research. Well, thanks 🙂 Regards, Vinay
06-13-2018
04:55 AM
@Felix Albani It's working, thanks for the quick response. Now, when I run a query as an end user, I get the error below in hiveserver2.log:
Diagnostics: Application application_1528797729105_0016 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is asif
main : requested yarn user is asif
User asif not found
I've configured Kerberos and Ranger integrated with Active Directory and set up a one-way trust with an MIT KDC. The asif user has its own Kerberos ticket, but the asif user is still not found. Can you help me further? Thanks.
06-12-2018
12:23 PM
Hi team, I have a three-node cluster with Kerberos enabled. When I run Hive queries as different users, the history server shows only the hive username as the user running the query. How can I make the application show the owner's name instead? A sample screenshot is attached (capture1.png). Regards, Vinay
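One likely cause, though the thread doesn't confirm it, is HiveServer2 impersonation: with doAs disabled, every query is submitted to YARN as the hive service user, so the history server only ever shows hive as the owner. A quick way to check the setting (file path is the usual HDP location, adjust if yours differs):

```bash
# Check whether HiveServer2 impersonation (doAs) is enabled.
# doAs=false: all YARN applications are owned by the 'hive' user.
# doAs=true:  the application owner is the end user who ran the query.
grep -A1 "hive.server2.enable.doAs" /etc/hive/conf/hive-site.xml

# In an Ambari-managed cluster the same value appears in the Hive configs as
# "Run as end user instead of Hive user"; change it there and restart Hive.
```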
06-08-2018
12:29 PM
@Geoffrey Shelton Okot Is it possible to see the NameNode metadata, for example if we need to see block information? The edits and fsimage files are binary and not directly readable. How can we see the metadata info? Regards, Vinay K
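HDFS ships offline viewers that dump those binary files into a readable form, and block locations can be listed with fsck. A rough sketch; the file names and paths below are examples, use whatever is in your NameNode's current directory:

```bash
# Dump a checkpointed fsimage to XML with the Offline Image Viewer.
hdfs oiv -p XML -i /hadoop/hdfs/namenode/current/fsimage_0000000000000012345 -o /tmp/fsimage.xml

# Dump an edits segment with the Offline Edits Viewer.
hdfs oev -p xml -i /hadoop/hdfs/namenode/current/edits_0000000000000012001-0000000000000012345 -o /tmp/edits.xml

# Show block IDs, replicas, and their DataNode locations for a path.
hdfs fsck /user/centos -files -blocks -locations
```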
06-08-2018
12:26 PM
@anu s You must have the credentials of the Kerberos server. You can check the configuration details in /etc/krb5.conf and under /var/kerberos on the Linux server. Regards, Vinay K
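As a quick sanity check of that configuration, the realm and KDC defined in /etc/krb5.conf can be exercised directly with kinit and klist. A minimal sketch; the principal name is only an example:

```bash
# Inspect the client configuration: default realm, KDCs, domain-realm mappings.
cat /etc/krb5.conf

# Request and inspect a ticket to confirm the KDC is reachable and the credentials work.
kinit vinay@EXAMPLE.COM      # example principal; replace with a real one
klist                        # shows the ticket cache and the TGT just obtained

# On the KDC host itself, the database and kdc.conf typically live under /var/kerberos.
ls /var/kerberos/krb5kdc/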
06-08-2018
12:21 PM
@tsokorai I have increased the hard and soft limits in limits.conf. Thanks
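For reference, the change amounts to adding nofile entries to /etc/security/limits.conf; the user and values below are illustrative, not the ones used in this thread:

```bash
# Example /etc/security/limits.conf entries raising the open-file limit
# for one user (or use '*' for all users). Log in again for them to apply.
cat >> /etc/security/limits.conf <<'EOF'
centos  soft  nofile  32768
centos  hard  nofile  65536
EOF

# Verify from a fresh shell of that user.
ulimit -Sn
ulimit -Hn
```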
06-08-2018
09:51 AM
Hi all, I was running the command hadoop fs -get /user/centos/dist/a.txt and got the following error:
java.net.SocketException: error creating UNIX domain socket with SOCK_STREAM: Too many open files
The open file limit is the default; I didn't change it. Can someone suggest whether I should increase the open file limit in sysctl and limits.conf? Thanks, Vinay
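Before raising anything, it may help to confirm what limit the failing processes actually run with. A possible check; since the error mentions a UNIX domain socket (used for short-circuit reads), the DataNode's limits are worth looking at too:

```bash
# Current soft/hard open-file limits for the shell running the hadoop client.
ulimit -Sn
ulimit -Hn

# Limits and current open-file count of a running DataNode on this host.
DN_PID=$(pgrep -f org.apache.hadoop.hdfs.server.datanode.DataNode | head -1)
grep "Max open files" /proc/"$DN_PID"/limits
ls /proc/"$DN_PID"/fd | wc -l
```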
06-08-2018
05:02 AM
Thanks @Felix Albani, it's working.
06-05-2018
09:13 AM
@Sparsh Singhal Thanks for the response and for clearing up that point. Let me test it; I will get back to you. Thanks
06-05-2018
09:11 AM
@Felix Albani As I understand it, my solution lies in "mapped using auth_to_local": I have to specify mapping rules in the HDFS configuration so that users get access to the services based on their AD principals. Correct me if I'm wrong.
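For what it's worth, the auth_to_local rules live in core-site.xml under hadoop.security.auth_to_local. A sketch of a rule that strips an AD realm down to the short OS/HDFS user name; the realm name is a placeholder:

```bash
# Show the rules currently in effect.
hdfs getconf -confKey hadoop.security.auth_to_local

# Example rule (added via Ambari > HDFS > core-site): map any principal in the
# trusted AD realm, e.g. asif@AD.EXAMPLE.COM, to the short name 'asif'.
#   RULE:[1:$1@$0](.*@AD\.EXAMPLE\.COM)s/@.*//
#   DEFAULT

# Test how a principal would be mapped without submitting a job.
hadoop org.apache.hadoop.security.HadoopKerberosName asif@AD.EXAMPLE.COM
```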
06-04-2018
05:05 PM
@Felix Albani I agree with the 2nd and 3rd answers. If we use user@AD.REALM to access a Kerberized service on the cluster, how do we define service access for user@AD.REALM? As far as I know, we don't need to create any service principals on the AD server; we only have to create the trust with the AD servers. Can you please help me understand the concept? Regards, Vinay
06-04-2018
09:58 AM
Hi folks, I have configured an MIT KDC integrated with Active Directory following this link: https://community.hortonworks.com/articles/59635/one-way-trust-mit-kdc-to-active-directory.html My questions are:
1. How can I test whether the one-way trust was created successfully?
2. Users will live on the AD server and services will live on the Hadoop cluster. Do I have to create user principals in the Kerberos database?
3. If yes, do I have to add a principal to Kerberos manually whenever a new user is created on the AD server?
Regards, Vinay
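On question 1, a simple end-to-end test of the trust is to obtain a ticket as an AD user and touch a Kerberized service; the cross-realm krbtgt principal created for the trust can also be checked on the MIT KDC. A sketch, with all realm and user names as placeholders:

```bash
# On the MIT KDC: the one-way trust is represented by a cross-realm principal
# such as krbtgt/MIT.EXAMPLE.COM@AD.EXAMPLE.COM.
kadmin.local -q "listprincs" | grep -i krbtgt

# On a cluster client: authenticate as an AD user and access a Kerberized service.
kinit asif@AD.EXAMPLE.COM
klist                      # should show the TGT issued by the AD realm
hdfs dfs -ls /user/asif    # succeeds only if the trust and auth_to_local mapping work
```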
05-29-2018
08:22 AM
Thanks, Aditya.
05-28-2018
09:49 AM
Hi folks, hope all are doing well. I'm new to Spark. I have installed HDP 2.6.2 and added Spark as a service. Before starting Spark there were no jobs running, but after I started the Spark service I found two jobs running continuously in the UNDEFINED state:
# yarn application -list
18/05/28 15:07:51 INFO client.AHSProxy: Connecting to Application History server at 10.10.10.16:10200
18/05/28 15:07:51 INFO client.RequestHedgingRMFailoverProxyProvider: Looking for the active RM in [rm1, rm2]...
18/05/28 15:07:51 INFO client.RequestHedgingRMFailoverProxyProvider: Found active RM [rm2]
Total number of applications (application-types: [] and states: [SUBMITTED, ACCEPTED, RUNNING]):4
Application-Id Application-Name Application-Type User Queue State Final-State Progress Tracking-URL
application_1527494556086_0039 org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 SPARK hive admin RUNNING UNDEFINED 10% http://10.10.10.8:4040
application_1527494556086_0038 org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 SPARK hive admin RUNNING UNDEFINED 10% http://10.10.10.12:4040
The Thrift Server is installed on both servers, whose IPs are 10.10.10.8 and 10.10.10.12. Can you please help me clarify this?
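Those two applications are the Spark Thrift Server instances themselves: each Thrift Server host runs as a long-running YARN application, which is why one appears for 10.10.10.8 and one for 10.10.10.12, and long-running applications report UNDEFINED as their final state until they stop. They can be filtered and inspected like this:

```bash
# List only Spark applications; the two HiveThriftServer2 entries are the
# Spark Thrift Server instances started on each configured host.
yarn application -list -appTypes SPARK

# Details and logs for one of them (application id taken from the listing above).
yarn application -status application_1527494556086_0039
yarn logs -applicationId application_1527494556086_0039 | less
```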
05-21-2018
05:46 AM
Hi folks, hope all are doing well. I have a dev setup of 11 nodes (2x NN, 8x DN, 1x edge node). All nodes are connected by a private cluster network (only the 11 machines can reach each other). We are using floating IPs (public only within the organization) on the DataNodes as well as on the edge node to access data from a DB. If we keep the floating IP only on the edge node and run a MapReduce job from the edge node that imports data from the DB into HDFS, I get an error that the DataNode IP is not able to reach the source DB. Can someone advise whether we need a floating IP on every DataNode machine, or whether there is another solution? I will be very thankful to you. Regards, Vinay K
05-09-2018
05:31 AM
@Rajendra Manjunath I have installed HDF 3.0.3, which installed properly using the CLI. Now I'm stuck on "Install on..." in Ambari. In HDF, the cluster name is none. When I click on "Install on..." and then click on Test_cluster, I'm redirected straight to the installed HDP components; HDF is not available among the installed components.
05-08-2018
09:25 AM
@Rajendra Manjunath I'm using Ambari version 2.5.2, and I'm using the mpack.
05-08-2018
08:39 AM
@Rajendra Manjunath While installing HDF on HDP, I got the error below:
Caused by: org.xml.sax.SAXParseException; systemId: file:/var/lib/ambari-server/resources/stacks/HDF/3.1/upgrades/nonrolling-upgrade-3.1.xml; lineNumber: 213; columnNumber: 94; cvc-complex-type.3.2.2: Attribute 'supports-patch' is not allowed to appear in element 'task'
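That SAXParseException usually points at a mismatch between the management pack and the Ambari version: the HDF 3.1 stack definition uses the supports-patch attribute, which Ambari 2.5.2 does not understand. Not verified against this exact setup, but the usual remedy is to remove the mpack and install one documented as compatible with the running Ambari, roughly:

```bash
# Remove the HDF management pack that registered the HDF/3.1 stack definition.
ambari-server uninstall-mpack --mpack-name=hdf-ambari-mpack --verbose

# Install an mpack version matching the Ambari release (the path and <version>
# are placeholders; pick the HDF mpack documented for your Ambari), then restart.
ambari-server install-mpack --mpack=/tmp/hdf-ambari-mpack-<version>.tar.gz --verbose
ambari-server restart
```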
05-08-2018
06:01 AM
We have HDP 2.6.2 running properly and are looking to install NiFi in our environment. Is it recommended to install HDF on an existing HDP cluster? Is there an HDF version compatible with HDP 2.6.2?
04-18-2018
12:06 PM
Hi, we are using HDP 2.5.2 and I'm new to the security side. I have integrated Kerberos with Unix-based authentication (we are not using LDAP), and we have also configured SPNEGO to restrict UI access, which is working fine. Now we are looking at Knox; per the documentation, Knox provides SSO. I have a few questions about Knox:
1. Is it mandatory to install AD/LDAP with Knox? We want to run Knox without AD/LDAP.
2. How is Knox useful for web-based URLs? Is there a specific link and configuration I can read to understand this?
3. Do we have to remove SPNEGO if we want to configure Knox with SSO?
Please assist; it will be very helpful.
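On question 2, the practical effect of Knox is that REST and UI traffic goes through one gateway URL instead of the individual service hosts. A minimal sketch of what that looks like once a topology is in place; the gateway host, topology name, and demo credentials below are placeholders:

```bash
# WebHDFS through the Knox gateway: one externally exposed HTTPS endpoint
# instead of direct access to the NameNode/DataNode web ports.
curl -iku guest:guest-password \
  'https://knox-host.example.com:8443/gateway/default/webhdfs/v1/tmp?op=LISTSTATUS'

# The same pattern covers other proxied services (e.g. YARN RM, Oozie)
# by changing the service path under the topology.
```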
04-13-2018
05:15 AM
Hi Geoffrey Shelton Okot, I agree with your answer. Those are the predefined roles and privileges that we can assign to any user. What we need are user-defined (custom) roles; for example, we want to create a sub-admin user that can create users and has YARN Queue Manager access.
04-13-2018
05:07 AM
Yes, it's working fine; it's not an issue. The command executes successfully using the same method. No change required.
04-11-2018
12:56 PM
We are using Ranger version 0.6. We need to restrict which users can control the Ambari console and create users.
04-10-2018
09:52 AM
Hi, I have a requirement to create a custom workflow that takes input from the user and runs a shell script with the arguments the user passes. How can we create such a workflow where the user only provides a value? We don't want to give access to the whole workflow; we want a restricted workflow where the user does not have permission to delete it.
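Assuming this refers to an Oozie workflow (the post doesn't name the engine), the usual pattern is to parameterize the shell action and let users supply only the value at submission time. A rough, hypothetical sketch; host names, paths, and the userInput property are illustrative:

```bash
# The workflow.xml (deployed once by an admin in HDFS) declares a shell action
# whose <argument> is ${userInput}, so end users only pass the value and never
# edit the workflow definition itself.
cat > job.properties <<'EOF'
nameNode=hdfs://nn1.example.com:8020
jobTracker=rm1.example.com:8050
oozie.wf.application.path=${nameNode}/apps/restricted-shell-wf
userInput=some-value
EOF

oozie job -oozie http://oozie-host.example.com:11000/oozie -config job.properties -run

# Keeping the workflow directory in HDFS owned by the admin (e.g. mode 755)
# prevents submitters from modifying or deleting the workflow definition.
```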