Member since: 12-21-2016
Posts: 10
Kudos Received: 1
Solutions: 0
02-01-2018
05:36 PM
Nothing, it was a testing cluster. It was faster to just reinstall it. 🙂
01-16-2018
08:57 PM
Hello, the Infra Solr instance start fails with this stack trace:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/infra_solr.py", line 123, in <module>
    InfraSolr().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/infra_solr.py", line 46, in start
    self.configure(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 119, in locking_configure
    original_configure(obj, *args, **kw)
  File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/infra_solr.py", line 41, in configure
    setup_infra_solr(name = 'server')
  File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/setup_infra_solr.py", line 101, in setup_infra_solr
    mode=0640)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 123, in action_create
    content = self._get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 160, in _get_content
    return content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 52, in __call__
    return self.get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 144, in get_content
    rendered = self.template.render(self.context)
  File "/usr/lib/python2.6/site-packages/ambari_jinja2/environment.py", line 891, in render
    return self.environment.handle_exception(exc_info, True)
  File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/templates/infra-solr-security.json.j2", line 28, in top-level template code
    "{{atlas_kerberos_service_user}}@{{kerberos_realm}}": ["{{infra_solr_role_atlas}}", "{{infra_solr_role_ranger_audit}}", "{{infra_solr_role_dev}}"],
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 73, in __getattr__
    raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
resource_management.core.exceptions.Fail: Configuration parameter 'kerberos-env' was not found in configurations dictionary!

Starting log:

2018-01-16 15:51:22,298 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-01-16 15:51:22,588 - Stack Feature Version Info: stack_version=2.6, version=2.6.2.0-205, current_cluster_version=2.6.2.0-205 -> 2.6.2.0-205
2018-01-16 15:51:22,598 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2018-01-16 15:51:22,600 - Group['livy'] {}
2018-01-16 15:51:22,603 - Group['spark'] {}
2018-01-16 15:51:22,604 - Group['ranger'] {}
2018-01-16 15:51:22,604 - Group['zeppelin'] {}
2018-01-16 15:51:22,604 - Group['hadoop'] {}
2018-01-16 15:51:22,604 - Group['nifi'] {}
2018-01-16 15:51:22,605 - Group['users'] {}
2018-01-16 15:51:22,605 - User['streamline'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:22,737 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:22,866 - User['registry'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:22,996 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:23,127 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:23,256 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:23,385 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:23,514 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:23,643 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger']}
2018-01-16 15:51:23,772 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2018-01-16 15:51:23,901 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop']}
2018-01-16 15:51:24,032 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:24,162 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:24,292 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:24,421 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:24,551 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2018-01-16 15:51:24,680 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:24,810 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:24,941 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:25,070 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:25,201 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:25,331 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:25,465 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:25,597 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2018-01-16 15:51:25,730 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-16 15:51:25,959 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-01-16 15:51:25,979 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2018-01-16 15:51:25,980 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-01-16 15:51:26,231 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-16 15:51:26,457 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-01-16 15:51:26,476 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2018-01-16 15:51:26,477 - Group['hdfs'] {}
2018-01-16 15:51:26,477 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2018-01-16 15:51:26,607 - FS Type:
2018-01-16 15:51:26,607 - Directory['/etc/hadoop'] {'mode': 0755}
2018-01-16 15:51:26,723 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
2018-01-16 15:51:26,887 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-01-16 15:51:27,037 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2018-01-16 15:51:27,077 - Skipping Execute[('setenforce', '0')] due to only_if
2018-01-16 15:51:27,078 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2018-01-16 15:51:27,393 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2018-01-16 15:51:27,673 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2018-01-16 15:51:27,897 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
2018-01-16 15:51:28,059 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'root'}
2018-01-16 15:51:28,228 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2018-01-16 15:51:28,431 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-01-16 15:51:28,588 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2018-01-16 15:51:28,809 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2018-01-16 15:51:28,942 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2018-01-16 15:51:29,118 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2018-01-16 15:51:29,715 - Directory['/var/log/ambari-infra-solr'] {'owner': 'infra-solr', 'group': 'hadoop', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2018-01-16 15:51:30,044 - Directory['/var/run/ambari-infra-solr'] {'owner': 'infra-solr', 'group': 'hadoop', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2018-01-16 15:51:30,374 - Directory['/opt/ambari_infra_solr/data'] {'owner': 'infra-solr', 'group': 'hadoop', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2018-01-16 15:51:30,704 - Directory['/opt/ambari_infra_solr/data/resources'] {'owner': 'infra-solr', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2018-01-16 15:51:31,087 - Directory['/usr/lib/ambari-infra-solr'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'recursive_ownership': True, 'owner': 'infra-solr', 'mode': 0755}
2018-01-16 15:51:31,485 - Directory['/etc/ambari-infra-solr/conf'] {'group': 'hadoop', 'cd_access': 'a', 'create_parents': True, 'mode': 0755, 'owner': 'infra-solr', 'recursive_ownership': True}
2018-01-16 15:51:31,846 - File['/var/log/ambari-infra-solr/solr-install.log'] {'content': '', 'owner': 'infra-solr', 'group': 'hadoop', 'mode': 0644}
2018-01-16 15:51:32,060 - File['/etc/ambari-infra-solr/conf/infra-solr-env.sh'] {'owner': 'infra-solr', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0755}
2018-01-16 15:51:32,253 - File['/opt/ambari_infra_solr/data/solr.xml'] {'owner': 'infra-solr', 'content': InlineTemplate(...), 'group': 'hadoop'}
2018-01-16 15:51:32,415 - File['/etc/ambari-infra-solr/conf/log4j.properties'] {'owner': 'infra-solr', 'content': InlineTemplate(...), 'group': 'hadoop'}
2018-01-16 15:51:32,575 - File['/etc/ambari-infra-solr/conf/custom-security.json'] {'owner': 'infra-solr', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0640}
2018-01-16 15:51:32,767 - Execute['ambari-sudo.sh JAVA_HOME=/usr/jdk64/jdk1.8.0_112 /usr/lib/ambari-infra-solr-client/solrCloudCli.sh --zookeeper-connect-string zk1.domain.com:2181,zk2.domain.com:2181,zk3.domain.com:2181 --znode /infra-solr --create-znode --retry 30 --interval 5'] {}
2018-01-16 15:51:33,252 - File['/etc/ambari-infra-solr/conf/infra_solr_jaas.conf'] {'owner': 'infra-solr', 'content': Template('infra_solr_jaas.conf.j2')}
2018-01-16 15:51:33,423 - File['/etc/ambari-infra-solr/conf/security.json'] {'owner': 'infra-solr', 'content': Template('infra-solr-security.json.j2'), 'group': 'hadoop', 'mode': 0640}
Command failed after 1 tries
Nothing changed. Any ideas?
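For what it's worth, the failing template references `kerberos-env`, so one way to check whether that config type still exists in Ambari is the configurations REST endpoint. This is only a sketch: the hostname, credentials, and cluster name below are placeholders, and the `curl` line is left commented out so you can substitute your own values first.

```shell
# Placeholder values -- substitute your Ambari host, admin credentials,
# and cluster name before running.
AMBARI_HOST="ambari.example.com"
CLUSTER="mycluster"

# Ambari exposes configuration history per type; an empty 'items' list here
# would mean kerberos-env is missing, matching the template failure above.
URL="http://${AMBARI_HOST}:8080/api/v1/clusters/${CLUSTER}/configurations?type=kerberos-env"
echo "GET ${URL}"
# curl -s -u admin -H 'X-Requested-By: ambari' "${URL}"
```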
Tags: Data Processing, solr
Labels: Apache Solr
01-15-2018
06:45 PM
I'm currently getting the same error... Did you manage to resolve this issue? If yes, how? Thanks, Bruno
12-14-2017
02:51 PM
Hello, why is it not possible to use site-to-site with an input port inside a process group? I want this because we have tenants with their own process groups, and they want to use the site-to-site-within-the-same-cluster pattern to achieve load balancing. Many processors must run on the master node only and then distribute load across the whole cluster (ListFiles -> FetchFiles -> ProcessFiles -> ...). I'm forced to create the input port on the root flow canvas, which breaks my multi-tenancy autonomy... Is there any way to make tenants fully autonomous? Thanks, Bruno
Labels: Apache NiFi
09-20-2017
02:49 PM
Hello, I just installed the Ambari Infra service on 3 nodes, and when I try to access any of the Solr Admin UIs, I'm stopped with an error:

HTTP ERROR 401
Problem accessing /solr/. Reason:
Authentication required

How can I access this UI? Thanks in advance, Bruno
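In case it helps others hitting the 401: when the Infra Solr UI is protected by SPNEGO/Kerberos, one way to test access from a node with a valid ticket is curl's negotiate support. This is a sketch only: the hostname and port below are placeholders (your Infra Solr port may differ), and the real request is commented out so nothing is sent without your own values.

```shell
# Placeholder URL -- replace with one of your Infra Solr nodes; the port
# is an assumption and may differ in your installation.
SOLR_URL="http://solr1.example.com:8886/solr/"

# After kinit-ing as a principal the cluster trusts, curl can perform the
# SPNEGO handshake; '-u :' tells curl to take credentials from the ticket cache.
echo "curl --negotiate -u : ${SOLR_URL}"
# curl --negotiate -u : "${SOLR_URL}"
```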
Labels: Apache Ambari, Apache Solr
09-12-2017
06:15 PM
Resolved it by adding one command to sudoers: /bin/su hdfs *
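For anyone else hitting this, the fixed sudoers entry might look like the following. This is a sketch only: it is the stock Ambari keytab command list from my setup with the /bin/su rule appended, and your paths may differ.

```
# Ambari Commands -- stock keytab commands plus the /bin/su rule that
# lets the post-user-creation hook run commands as the hdfs user.
ambari ALL=(ALL) NOPASSWD:SETENV: /bin/mkdir -p /etc/security/keytabs, /bin/chmod * /etc/security/keytabs/*.keytab, /bin/chown * /etc/security/keytabs/*.keytab, /bin/chgrp * /etc/security/keytabs/*.keytab, /bin/rm -f /etc/security/keytabs/*.keytab, /bin/cp -p -f /var/lib/ambari-server/data/tmp/* /etc/security/keytabs/*.keytab, /bin/su hdfs *
```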
09-12-2017
02:53 PM
Hello,
As directed in the guide, I activated the automatic user home directory creation. Our cluster is Kerberized, Ambari runs as a non-root user, and we correctly set the sudoers content as told in the docs. The original allowed commands in the sudoers file are:

# Ambari Commands
ambari ALL=(ALL) NOPASSWD:SETENV: /bin/mkdir -p /etc/security/keytabs, /bin/chmod * /etc/security/keytabs/*.keytab, /bin/chown * /etc/security/keytabs/*.keytab, /bin/chgrp * /etc/security/keytabs/*.keytab, /bin/rm -f /etc/security/keytabs/*.keytab, /bin/cp -p -f /var/lib/ambari-server/data/tmp/* /etc/security/keytabs/*.keytab

But when I replace this line with a more permissive one:

ambari ALL=(ALL) NOPASSWD: ALL

the hook works correctly! 🙂 I searched and didn't find anyone else with this problem... The Ambari server log when it doesn't work shows nothing very special:

12 Sep 2017 09:58:25,670 INFO [pool-18-thread-11] UserHookService:107 - Executing user hook for BatchUserHookContext{userGroups={ul-svd-user23=[]}}.
12 Sep 2017 09:58:25,670 INFO [pool-18-thread-11] UserHookService:123 - Triggering user hook for user: BatchUserHookContext{userGroups={ul-svd-user23=[]}}
12 Sep 2017 09:58:25,670 INFO [pool-3-thread-1] UserHookService:131 - Preparing hook execution for event: UserCreatedEvent{eventType=USER_CREATED}
12 Sep 2017 09:58:25,684 ERROR [ambari-action-scheduler] ActionScheduler:754 - Execution command has no timeout parameter{"clusterName":"exp2","requestId":170,"stageId":-1,"taskId":2408,"commandId":"170--1","hostname":"_internal_ambari","role":"AMBARI_SERVER_ACTION","hostLevelParams":{},"roleParams":{"ACTION_USER_NAME":"ambari","ACTION_NAME":"org.apache.ambari.server.serveraction.users.PostUserCreationHookServerAction"},"roleCommand":"EXECUTE","clusterHostInfo":{},"configurations":{},"configuration_attributes":{},"configurationTags":{},"forceRefreshConfigTagsBeforeExecution":false,"commandParams":{"cmd-hdfs-principal":"hdfs-exp2@DOMAIN.COM","cmd-input-file":"/var/lib/ambari-server/data/tmp/user_hook_input_1505224705673.csv","cluster-security-type":"KERBEROS","cmd-hdfs-user":"hdfs","cmd-payload":"{\"ul-svd-user23\":[]}","cmd-hdfs-keytab":"/etc/security/keytabs/hdfs.headless.keytab","hook-script":"/var/lib/ambari-server/resources/scripts/post-user-creation-hook.sh","cluster-name":"exp2","cluster-id":"2"},"serviceName":"","kerberosCommandParams":[],"localComponents":[],"availableServices":{},"commandType":"EXECUTION_COMMAND"}
12 Sep 2017 09:58:25,691 INFO [Server Action Executor Worker 2408] PostUserCreationHookServerAction:134 - Validating command parameters ...
12 Sep 2017 09:58:25,691 INFO [Server Action Executor Worker 2408] PostUserCreationHookServerAction:161 - Command parameter validation passed.
12 Sep 2017 09:58:25,692 INFO [Server Action Executor Worker 2408] CsvFilePersisterService:108 - Persisting map data to csv file
12 Sep 2017 09:58:25,692 INFO [Server Action Executor Worker 2408] CsvFilePersisterService:84 - Persisting collection to csv file
12 Sep 2017 09:58:25,692 INFO [Server Action Executor Worker 2408] CsvFilePersisterService:88 - Collection successfully persisted to csv file.
12 Sep 2017 09:58:25,693 INFO [Server Action Executor Worker 2408] ShellCommandUtilityWrapper:48 - Running command: /var/lib/ambari-server/resources/scripts/post-user-creation-hook.sh
12 Sep 2017 09:58:25,749 INFO [Server Action Executor Worker 2408] PostUserCreationHookServerAction:104 - Execution of command [ [/var/lib/ambari-server/resources/scripts/post-user-creation-hook.sh, /var/lib/ambari-server/data/tmp/user_hook_input_1505224705673.csv, KERBEROS, hdfs-exp2@DOMAIN.COM, /etc/security/keytabs/hdfs.headless.keytab, hdfs] ] - succeeded
12 Sep 2017 09:58:25,749 INFO [Server Action Executor Worker 2408] PostUserCreationHookServerAction:108 - BEGIN - stdout for command [/var/lib/ambari-server/resources/scripts/post-user-creation-hook.sh, /var/lib/ambari-server/data/tmp/user_hook_input_1505224705673.csv, KERBEROS, hdfs-exp2@DOMAIN.COM, /etc/security/keytabs/hdfs.headless.keytab, hdfs]
12 Sep 2017 09:58:25,749 INFO [Server Action Executor Worker 2408] PostUserCreationHookServerAction:110 - command output *** : 0
debug: OFF
Executing user hook with parameters: /var/lib/ambari-server/data/tmp/user_hook_input_1505224705673.csv KERBEROS hdfs-exp2@DOMAIN.COM /etc/security/keytabs/hdfs.headless.keytab hdfs
The cluster is secure, calling kinit ...
Executing command: [ /var/lib/ambari-server/ambari-sudo.sh su 'hdfs' -l -s /bin/bash -c '/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-exp2@DOMAIN.COM' ]
Checking for required tools ...
Executing command: [ /var/lib/ambari-server/ambari-sudo.sh su 'hdfs' -l -s /bin/bash -c 'type hadoop > /dev/null 2>&1 || { echo >&2 "hadoop client not installed"; exit 1; }' ]
Executing command: [ /var/lib/ambari-server/ambari-sudo.sh su 'hdfs' -l -s /bin/bash -c 'hadoop fs -ls / > /dev/null 2>&1 || { echo >&2 "hadoop dfs not available"; exit 1; }' ]
Checking for required tools ... DONE.
Processing post user creation hook payload ...
Generating json file /var/lib/ambari-server/data/tmp/user_hook_input_1505224705673.csv.json ...
Processing user name: ul-svd-user23
Generating file /var/lib/ambari-server/data/tmp/user_hook_input_1505224705673.csv.json ... DONE.
Processing post user creation hook payload ... DONE.
Executing command: [ /var/lib/ambari-server/ambari-sudo.sh su 'hdfs' -l -s /bin/bash -c 'yarn jar /var/lib/ambari-server/resources/stacks/HDP/2.0.6/hooks/before-START/files/fast-hdfs-resource.jar /var/lib/ambari-server/data/tmp/user_hook_input_1505224705673.csv.json' ]
debug: OFF
12 Sep 2017 09:58:25,749 INFO [Server Action Executor Worker 2408] PostUserCreationHookServerAction:112 - END - stdout for command [/var/lib/ambari-server/resources/scripts/post-user-creation-hook.sh, /var/lib/ambari-server/data/tmp/user_hook_input_1505224705673.csv, KERBEROS, hdfs-exp2@DOMAIN.COM, /etc/security/keytabs/hdfs.headless.keytab, hdfs]
When I try to call it manually as root, it creates the user directory:

# /usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-exp2@DOMAIN.COM
# yarn jar /var/lib/ambari-server/resources/stacks/HDP/2.0.6/hooks/before-START/files/fast-hdfs-resource.jar /var/lib/ambari-server/data/tmp/user_hook_input_1505224705673.csv.json
Using filesystem uri: hdfs://experimentation2
Creating: Resource [source=null, target=/user/ul-svd-user23, type=directory, action=create, owner=ul-svd-user23, group=hdfs, mode=null, recursiveChown=false, recursiveChmod=false, changePermissionforParents=false, manageIfExists=true]
All resources created.
# hdfs dfs -ls /user
[...]
drwxr-xr-x - ul-svd-user23 hdfs 0 2017-09-12 10:36 /user/ul-svd-user23
When I change the sudoers file to be more permissive for all commands, automatic creation works well and the output is:

12 Sep 2017 10:45:10,235 INFO [Server Action Executor Worker 2409] PostUserCreationHookServerAction:110 - command output *** : 0
debug: OFF
Executing user hook with parameters: /var/lib/ambari-server/data/tmp/user_hook_input_1505227502368.csv KERBEROS hdfs-exp2@UL.CA /etc/security/keytabs/hdfs.headless.keytab hdfs
The cluster is secure, calling kinit ...
Executing command: [ /var/lib/ambari-server/ambari-sudo.sh su 'hdfs' -l -s /bin/bash -c '/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs-exp2@DOMAIN.COM' ]
Checking for required tools ...
Executing command: [ /var/lib/ambari-server/ambari-sudo.sh su 'hdfs' -l -s /bin/bash -c 'type hadoop > /dev/null 2>&1 || { echo >&2 "hadoop client not installed"; exit 1; }' ]
Executing command: [ /var/lib/ambari-server/ambari-sudo.sh su 'hdfs' -l -s /bin/bash -c 'hadoop fs -ls / > /dev/null 2>&1 || { echo >&2 "hadoop dfs not available"; exit 1; }' ]
Checking for required tools ... DONE.
Processing post user creation hook payload ...
Generating json file /var/lib/ambari-server/data/tmp/user_hook_input_1505227502368.csv.json ...
Processing user name: ul-svd-user24
Generating file /var/lib/ambari-server/data/tmp/user_hook_input_1505227502368.csv.json ... DONE.
Processing post user creation hook payload ... DONE.
Executing command: [ /var/lib/ambari-server/ambari-sudo.sh su 'hdfs' -l -s /bin/bash -c 'yarn jar /var/lib/ambari-server/resources/stacks/HDP/2.0.6/hooks/before-START/files/fast-hdfs-resource.jar /var/lib/ambari-server/data/tmp/user_hook_input_1505227502368.csv.json' ]
Using filesystem uri: hdfs://experimentation2
Creating: Resource [source=null, target=/user/ul-svd-user24, type=directory, action=create, owner=ul-svd-user24, group=hdfs, mode=null, recursiveChown=false, recursiveChmod=false, changePermissionforParents=false, manageIfExists=true]
All resources created.
debug: OFF
I also changed the original sudoers file to the following, but it doesn't work either:

ambari-server ALL=(ALL) NOPASSWD:SETENV: /bin/mkdir -p /etc/security/keytabs, /bin/chmod * /etc/security/keytabs/*.keytab, /bin/chown * /etc/security/keytabs/*.keytab, /bin/chgrp * /etc/security/keytabs/*.keytab, /bin/rm -f /etc/security/keytabs/*.keytab, /bin/cp -p -f /var/lib/ambari-server/data/tmp/* /etc/security/keytabs/*.keytab, /usr/bin/yarn

Any ideas to solve this? Thanks, Bruno
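Since the hook wraps its commands in `/bin/su hdfs ...` via ambari-sudo.sh, it may be the `su` invocation (not `yarn`) that the sudoers rule fails to match. A way to verify which rules apply is `sudo -l`, which can also check a specific command. Sketch only: it must be run as root on the Ambari server host, and the ambari user name is taken from the sudoers lines above, so the actual check is left commented out.

```shell
# Hypothetical check (run as root on the Ambari server host):
# 'sudo -l -U ambari' lists the ambari user's sudo privileges, and passing a
# command asks sudo whether that exact command would be permitted.
CHECK_CMD="sudo -l -U ambari /bin/su hdfs"
echo "${CHECK_CMD}"
# eval "${CHECK_CMD}"   # uncomment on a host where the ambari user exists
```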
Labels: Apache Ambari
07-19-2017
04:36 PM
1 Kudo
This link is not working anymore for 2.6.1: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_command-line-installation/content/download-companion-files.html