Member since: 08-02-2017
Posts: 11
Kudos Received: 0
Solutions: 0
04-13-2018
02:04 PM
I am not upgrading the HDF version; this issue occurred during NiFi installation.
04-13-2018
01:34 PM
Hi all, I have the same issue installing NiFi with HDF 3.1. Ambari-server and ambari-agent are both version 2.6.1.0.
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
BeforeAnyHook().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 29, in hook
setup_users()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 51, in setup_users
fetch_nonlocal_groups = params.fetch_nonlocal_groups,
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/accounts.py", line 84, in action_create
shell.checked_call(command, sudo=True)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'usermod -G nifi -g hadoop nifi' returned 6. usermod: user 'nifi' does not exist in /etc/passwd
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-2103.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-2103.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', ''
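The traceback ends at `usermod -G nifi -g hadoop nifi` failing because the nifi account does not exist on the agent host. A minimal workaround sketch, assuming the service account simply was never created (user and group names are taken from the usermod call in the error, not from any other source):

# Hedged workaround: create the missing nifi account by hand, then retry the
# failed install step from Ambari.
sudo groupadd -f nifi              # -f: succeed even if the group already exists
sudo useradd -g hadoop -G nifi nifi
id nifi                            # verify the account now resolves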
04-12-2018
08:11 AM
For zookeeper-server, what can I do?
04-05-2018
05:54 AM
Step 1: Add these two properties to core-site.xml:
<property>
  <name>fs.s3a.access.key</name>
  <value>your AWS IAM user access key</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>your AWS IAM user secret key</value>
</property>

Step 2: Add the S3 bucket endpoint property to core-site.xml. Before adding it, check your bucket's region. For example, my bucket is in the Mumbai region: https://s3.ap-south-1.amazonaws.com/bucketname/foldername/filename.csv
<property>
  <name>fs.s3a.endpoint</name>
  <value>s3.ap-south-1.amazonaws.com</value>
</property>
Note: otherwise you get 400 Bad Request:
WARN s3a.S3AFileSystem: Client: Amazon S3 error 400: 400 Bad Request; Bad Request
com.amazonaws.services.s3.model.AmazonS3Exception: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request)

Step 3: Add the hadoop.security.credential.provider.path property to core-site.xml. For this you can store the access key and secret key on an HDFS path using the Hadoop credential API (a credential store for AWS secrets). For example, run these commands as the hdfs user:
I.   hdfs dfs -chown s3_access:hdfs /user/s3_access
II.  hadoop credential create fs.s3a.access.key -value <aws-IAM-user-access-key> -provider jceks://hdfs@10.22.121.0:8020/user/s3_access/s3.jceks
III. hadoop credential create fs.s3a.secret.key -value <aws-IAM-user-secret-key> -provider jceks://hdfs@10.22.121.0:8020/user/s3_access/s3.jceks
IV.  hadoop credential list -provider jceks://hdfs@10.22.121.0:8020/user/s3_access/s3.jceks
You will get output like this:
Listing aliases for CredentialProvider: jceks://hdfs@13.229.32.224:8020/user/s3_access/s3.jceks
fs.s3a.secret.key
fs.s3a.access.key
You have now created the AWS secrets credential store on Hadoop. Set its ownership and permissions, then add the property:
hdfs dfs -chown s3_access:hdfs /user/s3_access/s3.jceks
hdfs dfs -chmod 666 /user/s3_access/s3.jceks
<property>
  <name>hadoop.security.credential.provider.path</name>
  <value>jceks://hdfs@10.22.121.0:8020/user/s3_access/s3.jceks</value>
</property>

Step 4: Restart ambari-server:
ambari-server restart
Then verify access:
hadoop fs -ls s3a://yourbucketname/folder/file.csv
hadoop distcp s3a://yourbucketname/foldername/filename.csv hdfs://10.22.121.0:8020/your-hdfs-folder
Follow this link: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP2.6.2/bk_cloud-data-access/content/s3-config-props.html
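As a quick sanity check before editing core-site.xml cluster-wide, the same settings can also be passed per command through -D generic options. A hedged sketch, reusing the example endpoint and jceks path from the steps above; the bucket name is a placeholder, not from the original post:

# Per-command override: no core-site.xml edit needed for a one-off test.
hadoop fs \
  -D fs.s3a.endpoint=s3.ap-south-1.amazonaws.com \
  -D hadoop.security.credential.provider.path=jceks://hdfs@10.22.121.0:8020/user/s3_access/s3.jceks \
  -ls s3a://yourbucketname/foldername/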
12-22-2017
04:55 AM
I did the same thing on HDP 2.5 but it still shows the error below:
534-5.7.14 IvciS8yws8APKDFHijkrFoJ92wCiJPAO_hwu5n84JrRGIqSyumJ05MoXUq94ogjYKFwfS5
534-5.7.14 iP5EQUiY-JheLLz82D33Qp2yF8wuQkoQ6BAwjeCpThA9GoTte7WBFbZnRByREw7MnpQ3Am
534-5.7.14 qRMD4_bLAgQqiJIzCgxU1ppqCt9qc> Please log in via your web browser and
534-5.7.14 then try again.
534-5.7.14 Learn more at
534 5.7.14 https://support.google.com/mail/answer/78754 e8sm22695829pga.72 - gsmtp
at com.sun.mail.smtp.SMTPTransport$Authenticator.authenticate(SMTPTransport.java:892)
at com.sun.mail.smtp.SMTPTransport.authenticate(SMTPTransport.java:814)
at com.sun.mail.smtp.SMTPTransport.protocolConnect(SMTPTransport.java:728)
at javax.mail.Service.connect(Service.java:386)
at javax.mail.Service.connect(Service.java:245)
at javax.mail.Service.connect(Service.java:194)
at javax.mail.Transport.send0(Transport.java:253)
at javax.mail.Transport.send(Transport.java:124)
at org.apache.ambari.server.notifications.dispatchers.EmailDispatcher.dispatch(EmailDispatcher.java:160)
at org.apache.ambari.server.notifications.DispatchRunnable.run(DispatchRunnable.java:58)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
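For context (not from the original thread): the 534-5.7.14 response is Gmail refusing password authentication and asking for a web login, which for SMTP clients such as Ambari's EmailDispatcher usually means the account needs an app-specific password or explicit access for less secure apps. A hedged sketch for testing authentication by hand from the Ambari host before re-saving the alert notification:

# Open a STARTTLS SMTP session to Gmail's submission port and try AUTH manually.
openssl s_client -connect smtp.gmail.com:587 -starttls smtp -crlf -quiet
# At the prompt, issue:
#   EHLO ambari-host
#   AUTH LOGIN
#   <base64-encoded account email>
#   <base64-encoded app-specific password>
# A 235 reply means authentication works; a 534 reply reproduces the error above.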