Member since: 08-03-2017 · 65 Posts · 2 Kudos Received · 0 Solutions
08-08-2018
08:45 PM
@amarnath reddy pappu OK, I reset the server and agent on all nodes and I am now on the host registration page. If I register the hosts, install all services again, and start the cluster, my data will still be there, right? I want to make sure it doesn't get wiped out.
08-08-2018
08:25 PM
If we reset the Ambari server, delete the Postgres database, and reinstall, will it delete the whole cluster, including the NameNode metadata and the DataNodes' data? How does this work? Suggestions are much needed.
Labels:
- Apache Ambari
- Apache Hadoop
08-07-2018
06:53 PM
@amarnath reddy pappu Updated the HTML page now. Can you please take a look?
08-07-2018
06:14 PM
stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py", line 35, in <module>
BeforeAnyHook().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 353, in execute
method(env)
File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py", line 29, in hook
setup_users()
File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/shared_initialization.py", line 50, in setup_users
groups = params.user_to_groups_dict[user],
KeyError: u'solr'
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-21877.json', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-21877.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1_2', '']Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stack-hooks/before-START/scripts/hook.py', 'START', '/var/lib/ambari-agent/data/command-21877.json', '/var/lib/ambari-agent/cache/stack-hooks/before-START', '/var/lib/ambari-agent/data/structured-out-21877.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1_2', '']
stdout:
2018-08-07 11:10:11,384 - Stack Feature Version Info: Cluster Stack=2.5, Command Stack=None, Command Version=2.5.6.0 -> 2.5.6.0
2018-08-07 11:10:11,401 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-08-07 11:10:11,589 - Stack Feature Version Info: Cluster Stack=2.5, Command Stack=None, Command Version=2.5.6.0 -> 2.5.6.0
2018-08-07 11:10:11,593 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-08-07 11:10:11,595 - Group['livy'] {}
2018-08-07 11:10:11,596 - Group['spark'] {}
2018-08-07 11:10:11,596 - Group['solr'] {}
2018-08-07 11:10:11,596 - Group['ranger'] {}
2018-08-07 11:10:11,596 - Group['hdfs'] {}
2018-08-07 11:10:11,597 - Group['zeppelin'] {}
2018-08-07 11:10:11,597 - Group['hadoop'] {}
2018-08-07 11:10:11,597 - Group['users'] {}
2018-08-07 11:10:11,597 - Group['knox'] {}
2018-08-07 11:10:11,598 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-07 11:10:11,599 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-07 11:10:11,600 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-07 11:10:11,601 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-07 11:10:11,602 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-07 11:10:11,603 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2018-08-07 11:10:11,604 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-07 11:10:11,605 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-21877.json', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-21877.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1_2', '']
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stack-hooks/before-START/scripts/hook.py', 'START', '/var/lib/ambari-agent/data/command-21877.json', '/var/lib/ambari-agent/cache/stack-hooks/before-START', '/var/lib/ambari-agent/data/structured-out-21877.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1_2', '']
Command failed after 1 tries
08-03-2018
06:41 PM
@Jay Kumar SenSharma Never mind. I changed the password for the Ambari keytab and restarted Ambari; it's up and running now. Thank you.
08-03-2018
05:11 PM
@Jay Kumar SenSharma This is the error I am getting when generating keytabs. KVNO Timestamp Principal
---- ------------------- ------------------------------------------------------
ambari-server-ker@HDP.COM
[root@hdp /]# kinit -kt /etc/security/keytabs/ambari.server.keytab ambari-server-ker@HDP.COM
kinit: Password incorrect while getting initial credentials
08-02-2018
09:38 PM
2018-08-02 14:32:51,917 ERROR [main] KerberosChecker:120 - Checksum failed
2018-08-02 14:32:51,918 ERROR [main] AmbariServer:1111 - Failed to run the Ambari Server
org.apache.ambari.server.AmbariException: Ambari Server Kerberos credentials check failed.
Check KDC availability and JAAS configuration in /etc/ambari-server/conf/krb5JAASLogin.conf
at org.apache.ambari.server.controller.utilities.KerberosChecker.checkJaasConfiguration(KerberosChecker.java:121)
at org.apache.ambari.server.controller.AmbariServer.main(AmbariServer.java:1102)
Labels:
- Apache Ambari
06-19-2018
06:55 PM
Upgrading from 2.4 to 2.5, I am getting this error during the ambari-server upgrade. [root@hdp ~]# ambari-server upgrade
Using python /usr/bin/python
Upgrading ambari-server
INFO: Upgrade Ambari Server
INFO: Updating Ambari Server properties in ambari.properties ...
WARNING: Can not find ambari.properties.rpmsave file from previous version, skipping import of settings
INFO: Updating Ambari Server properties in ambari-env.sh ...
INFO: Can not find ambari-env.sh.rpmsave file from previous version, skipping restore of environment settings. ambari-env.sh may not include any user customization.
ERROR: Unexpected OSError: [Errno 17] File exists
For more info run ambari-server with -v or --verbose option
Labels:
- Apache Ambari
- Apache Hadoop
06-19-2018
04:22 PM
I am getting this issue when upgrading from 2.4 to 2.6. The Ambari Server is configured with embedded Postgres. Confirm you have made a backup of the Ambari Server database [y/n] (y)? y
INFO: Upgrading database schema
INFO: Return code from schema upgrade command, retcode = 1
ERROR: Error executing schema upgrade, please check the server logs.
ERROR: Error output from schema upgrade command:
ERROR: Exception in thread "main" org.apache.ambari.server.AmbariException: Unable to find any CURRENT repositories.
at org.apache.ambari.server.upgrade.SchemaUpgradeHelper.executeUpgrade(SchemaUpgradeHelper.java:203)
at org.apache.ambari.server.upgrade.SchemaUpgradeHelper.main(SchemaUpgradeHelper.java:418)
Caused by: org.apache.ambari.server.AmbariException: Unable to find any CURRENT repositories.
at org.apache.ambari.server.upgrade.UpgradeCatalog260.getCurrentVersionID(UpgradeCatalog260.java:510)
at org.apache.ambari.server.upgrade.UpgradeCatalog260.executeDDLUpdates(UpgradeCatalog260.java:194)
at org.apache.ambari.server.upgrade.AbstractUpgradeCatalog.upgradeSchema(AbstractUpgradeCatalog.java:923)
at org.apache.ambari.server.upgrade.SchemaUpgradeHelper.executeUpgrade(SchemaUpgradeHelper.java:200)
... 1 more
ERROR: Ambari server upgrade failed. Please look at /var/log/ambari-server/ambari-server.log, for more details.
ERROR: Exiting with exit code 11.
REASON: Schema upgrade failed.
Labels:
- Apache Ambari
- Apache Hadoop
03-26-2018
09:04 PM
Hi, I need some help with increasing HDFS disk space. I have a disk mounted for HDFS that was 500 GB. It filled to 94%, so we increased the same disk from 500 to 650 GB. After I restarted the VM, I can see the new size in the lsblk output, but not in the HDFS capacity. I believe I don't need to mount it again, because it is already mounted at the HDFS directory; it should be picked up when we restart, right?
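Restarting alone usually isn't enough: enlarging the virtual disk only changes the block device, and the partition and filesystem still have to be grown before df and the HDFS capacity report reflect the new size. A minimal sketch, assuming an XFS filesystem on /dev/sdb1 mounted at /hadoop/hdfs/data; the device name, filesystem type, and mount point are placeholders, not taken from the post:

```shell
lsblk                                 # confirm the block device shows the new size
sudo growpart /dev/sdb 1              # grow partition 1 to fill the enlarged disk
sudo xfs_growfs /hadoop/hdfs/data     # XFS: grow the mounted filesystem
# (for ext4 use: sudo resize2fs /dev/sdb1)
df -h /hadoop/hdfs/data               # the OS should now report the new size

# HDFS reports capacity per DataNode volume; check after restarting the DataNode:
hdfs dfsadmin -report
```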
Labels:
- Apache Hadoop
03-12-2018
06:09 PM
@rtrivedi I did the same steps both ways but am still getting the issue. When I turn off HiveServer2 authorization I am able to access it, but when I turn it on I am not. I also generated a keytab in the default realm for Hive.
03-12-2018
05:36 PM
@Jay Kumar SenSharma I am still having issues with this connectivity. [root@hostname ~]# beeline -u jdbc:hive2://serverip:10000 user pass
Connecting to jdbc:hive2://serverip:10000
18/03/12 10:14:36 [main]: WARN jdbc.HiveConnection: Failed to connect to serverip:10000
Error: Could not open client transport with JDBC Uri: jdbc:hive2://serverip:10000: Peer indicated failure: Unsupported mechanism type PLAIN (state=08S01,code=0)
Beeline version 1.2.1000.2.5.6.0-40 by Apache Hive
0: jdbc:hive2://serverip:10000 (closed)> !quit
03-12-2018
05:19 PM
I am getting this issue when testing from the command line. [root@hostname ~]# beeline -u jdbc:hive2://serverip:10000 user pass
Connecting to jdbc:hive2://serverip:10000
18/03/12 10:14:36 [main]: WARN jdbc.HiveConnection: Failed to connect to serverip:10000
Error: Could not open client transport with JDBC Uri: jdbc:hive2://serverip:10000: Peer indicated failure: Unsupported mechanism type PLAIN (state=08S01,code=0)
Beeline version 1.2.1000.2.5.6.0-40 by Apache Hive
0: jdbc:hive2://serverip:10000 (closed)> !quit
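On a kerberized HiveServer2, "Unsupported mechanism type PLAIN" typically means the client connected without Kerberos. A hedged sketch of the usual fix, assuming the realm HDP.COM and the default hive service principal (both placeholders; check hive.server2.authentication.kerberos.principal for the real value):

```shell
# Obtain a Kerberos ticket, then pass the HiveServer2 principal in the JDBC URL
# instead of a plain user/password pair.
kinit user@HDP.COM
beeline -u "jdbc:hive2://serverip:10000/default;principal=hive/_HOST@HDP.COM"
```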
Labels:
- Apache Hadoop
- Apache Hive
03-09-2018
11:03 PM
After Kerberos was regenerated from the Ambari view, I am not able to run this code:
Connection cnct1 = DriverManager.getConnection("jdbc:hive2://serverip:10000/test", "user", "test");
HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://serverIp:10000/test: Peer indicated failure: Unsupported mechanism type PLAIN
Labels:
- Apache Hadoop
- Apache Hive
01-22-2018
06:36 PM
Hello, I have root privileges in NiFi and am able to see all NiFi properties, but I cannot access the web UI. No LDAP is configured, but Kerberos is in place.
12-05-2017
06:22 PM
1 Kudo
Hi everyone, I need to automate copying FTP files from a drive to HDFS. OS: CentOS 7, HDP: 2.5, Ambari: 2.4. Please guide me on how to copy or automate files from FTP to HDFS or to a local directory.
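One way to automate this is a small shell script driven by cron. This is a sketch, not a tested pipeline; the FTP host, credentials, and paths are placeholders. (On HDP, NiFi's GetFTP/PutHDFS processors are another option.)

```shell
#!/bin/bash
# Sketch: mirror files from an FTP server into HDFS via a local staging dir.
# Host, credentials, and paths are placeholders -- adjust for your site.
set -euo pipefail

LOCAL_STAGE=/tmp/ftp_stage
HDFS_DEST=/data/incoming

mkdir -p "$LOCAL_STAGE"

# Pull new/changed files from the FTP server into the staging directory
wget --mirror --no-host-directories \
     --user=ftpuser --password='ftppass' \
     --directory-prefix="$LOCAL_STAGE" \
     ftp://ftp.example.com/outgoing/

# Push staged files into HDFS, then clear the stage only on success
hdfs dfs -mkdir -p "$HDFS_DEST"
hdfs dfs -put -f "$LOCAL_STAGE"/* "$HDFS_DEST"/ && rm -rf "${LOCAL_STAGE:?}"/*
```

Scheduled hourly with cron, for example: `0 * * * * /usr/local/bin/ftp_to_hdfs.sh`.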
10-11-2017
04:33 PM
@Geoffrey Shelton Okot There is no log file; the files-view log is empty. Service 'hdfs' check failed:
java.lang.NullPointerException
at org.apache.hadoop.security.authentication.util.KerberosName.getShortName(KerberosName.java:383)
at org.apache.hadoop.security.User.<init>(User.java:48)
at org.apache.hadoop.security.User.<init>(User.java:43)
at org.apache.hadoop.security.UserGroupInformation.createRemoteUser(UserGroupInformation.java:1270)
at org.apache.hadoop.security.UserGroupInformation.createRemoteUser(UserGroupInformation.java:1254)
at org.apache.ambari.view.utils.hdfs.HdfsApi.getProxyUser(HdfsApi.java:78)
at org.apache.ambari.view.utils.hdfs.HdfsApi.<init>(HdfsApi.java:66)
at org.apache.ambari.view.utils.hdfs.HdfsUtil.connectToHDFSApi(HdfsUtil.java:127)
at org.apache.ambari.view.commons.hdfs.HdfsService.hdfsSmokeTest(HdfsService.java:136)
at org.apache.ambari.view.filebrowser.HelpService.hdfsStatus(HelpService.java:86)
at sun.reflect.GeneratedMethodAccessor726.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
10-11-2017
04:26 PM
@Aditya Sirna This is from Advanced core-site: RULE:[1:$1@$0](ambari-XZ-bigdata@RELAY.COM)s/.*/ambari-qa/
RULE:[1:$1@$0](hbase-bigdata@RELAY.COM)s/.*/hbase/
RULE:[1:$1@$0](hdfs-bigdata@RELAY.COM)s/.*/hdfs/
RULE:[1:$1@$0](spark-bigdata@RELAY.COM)s/.*/spark/
RULE:[1:$1@$0](zeppelin-bigdata@RELAY.COM)s/.*/zeppelin/
RULE:[1:$1@$0](.*@RELAY.COM)s/@.*//
RULE:[2:$1@$0](amshbase@RELAY.COM)s/.*/ams/
RULE:[2:$1@$0](amshbase@RELAY.COM)s/.*/hbase/
RULE:[2:$1@$0](amszk@RELAY.COM)s/.*/ams/
RULE:[2:$1@$0](atlas@RELAY.COM)s/.*/atlas/
RULE:[2:$1@$0](dn@RELAY.COM)s/.*/hdfs/
RULE:[2:$1@$0](falcon@RELAY.COM)s/.*/falcon/
RULE:[2:$1@$0](hbase@RELAY.COM)s/.*/hbase/
RULE:[2:$1@$0](hive@RELAY.COM)s/.*/hive/
RULE:[2:$1@$0](jhs@RELAY.COM)s/.*/mapred/
RULE:[2:$1@$0](jn@RELAY.COM)s/.*/hdfs/
RULE:[2:$1@$0](knox@RELAY.COM)s/.*/knox/
RULE:[2:$1@$0](livy@RELAY.COM)s/.*/livy/
RULE:[2:$1@$0](nfs@RELAY.COM)s/.*/hdfs/
RULE:[2:$1@$0](nm@RELAY.COM)s/.*/yarn/
RULE:[2:$1@$0](nn@RELAY.COM)s/.*/hdfs/
RULE:[2:$1@$0](oozie@RELAY.COM)s/.*/oozie/
RULE:[2:$1@$0](rangeradmin@RELAY.COM)s/.*/ranger/
RULE:[2:$1@$0](rangertagsync@RELAY.COM)s/.*/rangertagsync/
RULE:[2:$1@$0](rangerusersync@RELAY.COM)s/.*/rangerusersync/
RULE:[2:$1@$0](rm@RELAY.COM)s/.*/yarn/
RULE:[2:$1@$0](yarn@RELAY.COM)s/.*/yarn/
DEFAULT
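Each rule can be verified against a sample principal with the HadoopKerberosName utility, which applies the auth_to_local rules from the loaded core-site configuration. The hostname below is a placeholder:

```shell
# Prints the short name the rules produce for a given principal; per the
# RULE:[2:$1@$0](nn@RELAY.COM)s/.*/hdfs/ rule above, nn/... should map to hdfs.
hadoop org.apache.hadoop.security.HadoopKerberosName nn/namenode.example.com@RELAY.COM
```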
10-11-2017
04:08 PM
@Geoffrey Shelton Okot I followed the same steps, but the issue is still the same.
10-10-2017
09:36 PM
I am not able to open the Hive View or the Files View. Service 'hdfs' check failed:
java.lang.NullPointerException
at org.apache.hadoop.security.authentication.util.KerberosName.getShortName(KerberosName.java:383)
at org.apache.hadoop.security.User.<init>(User.java:48)
at org.apache.hadoop.security.User.<init>(User.java:43)
at org.apache.hadoop.security.UserGroupInformation.createRemoteUser(UserGroupInformation.java:1270)
at org.apache.hadoop.security.UserGroupInformation.createRemoteUser(UserGroupInformation.java:1254)
at org.apache.ambari.view.utils.hdfs.HdfsApi.getProxyUser(HdfsApi.java:78)
at org.apache.ambari.view.utils.hdfs.HdfsApi.<init>(HdfsApi.java:66)
at org.apache.ambari.view.utils.hdfs.HdfsUtil.connectToHDFSApi(HdfsUtil.java:127)
at org.apache.ambari.view.commons.hdfs.HdfsService.hdfsSmokeTest(HdfsService.java:136)
at org.apache.ambari.view.filebrowser.HelpService.hdfsStatus(HelpService.java:86)
at sun.reflect.GeneratedMethodAccessor726.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
Labels:
- Apache Ambari
- Apache Hadoop
10-06-2017
03:12 PM
Hi, I am getting this error in almost all installed services. Can someone help? Connection failed to http://hostname:50070 (Execution of '/usr/bin/kinit -c /var/lib/ambari-agent/tmp/curl_krb_cache/web_alert_ambari-qa_cc_196393db8ad8461dac739b8ea56294c7 -kt /etc/security/keytabs/spnego.service.keytab HTTP/hostname@RELAY.COM > /dev/null' returned 1. kinit: Keytab contains no suitable keys for HTTP/hostname@RELAY.COM while getting initial credentials)
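A "Keytab contains no suitable keys" error usually means the keytab's entries or key version number (KVNO) no longer match the KDC. A sketch for narrowing it down, using the paths from the alert:

```shell
# List the principals and KVNOs stored in the SPNEGO keytab
klist -kt /etc/security/keytabs/spnego.service.keytab

# Compare with the KVNO the KDC currently holds (requires a valid ticket)
kvno HTTP/hostname@RELAY.COM

# If they differ, regenerate keytabs (e.g. Ambari > Kerberos > Regenerate Keytabs),
# then re-run the exact kinit the alert performs:
kinit -kt /etc/security/keytabs/spnego.service.keytab HTTP/hostname@RELAY.COM
```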
Labels:
- Apache Ambari
- Apache Hadoop
09-11-2017
10:41 PM
@Swapan Shridhar The variance for this alert is 2,240,642,366 B, which is 25% of the 8,925,205,907 B average (1,785,041,181 B is the limit).
09-11-2017
09:03 PM
Hi, I am getting this alert from both the active and standby NameNodes. There are no under-replicated blocks and no failed disk volumes. "This service-level alert is triggered if the increase in storage capacity usage deviation has grown beyond the specified threshold within a week period."
Labels:
- Apache Ambari
- Apache Hadoop
09-07-2017
09:57 PM
After enabling Kerberos, I am not able to access some of the web UIs from Ambari. Any suggestions, please?
Solr:
HTTP ERROR 403
Problem accessing /solr/. Reason:
GSSException: Defective token detected (Mechanism level: GSSHeader did not find the right tag)
Falcon: HTTP ERROR: 403
Problem accessing /. Reason:
org.apache.hadoop.security.authentication.client.AuthenticationException: GSSException: Defective token detected (Mechanism level: GSSHeader did not find the right tag)
Oozie: HTTP Status 403 - org.apache.hadoop.security.authentication.client.AuthenticationException: GSSException: Defective token detected (Mechanism level: GSSHeader did not find the right tag)
type Status report
message org.apache.hadoop.security.authentication.client.AuthenticationException: GSSException: Defective token detected (Mechanism level: GSSHeader did not find the right tag)
description Access to the specified resource has been forbidden.
Apache Tomcat/6.0.48
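"GSSHeader did not find the right tag" generally means the client sent a non-Kerberos token (NTLM or Basic) to a SPNEGO-protected endpoint. One way to separate a server problem from a browser problem is to test with curl's SPNEGO support from a host that has a valid ticket; the Solr URL and port below are assumptions:

```shell
# With a valid ticket, curl can negotiate SPNEGO itself
kinit user@RELAY.COM
curl --negotiate -u : -s -o /dev/null -w '%{http_code}\n' http://hostname:8983/solr/
# A 200 here suggests the service is fine and the browser needs SPNEGO enabled
# (e.g. Firefox's network.negotiate-auth.trusted-uris setting).
```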
Labels:
- Apache Ambari
- Apache Hadoop
- Apache Oozie
09-07-2017
07:53 PM
@Geoffrey Shelton Okot Is there any way to add the user on all hosts? Please let me know.
09-07-2017
06:52 PM
@Geoffrey Shelton Okot Thank you again. This is the issue I am getting from beeline. beeline> !connect jdbc:hive2://hostname.host.com:2181,hostname.host.com:2181,hostname.host.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2 username password
Connected to: Apache Hive (version 1.2.1000.2.5.6.0-40)
Driver: Hive JDBC (version 1.2.1000.2.5.6.0-40)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://host> select max(_TIMESTAMP(ts)) ;
INFO : Tez session hasn't been created yet. Opening session
ERROR : Failed to execute tez graph.
org.apache.tez.dag.api.SessionNotRunning: TezSession has already shutdown. Application application_ failed 2 times due to AM Container for appattempt_ exited with exitCode: -1000
Diagnostics: Application application_ID initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is berlin
main : requested yarn user is berlin
User berlin not found
Failing this attempt. Failing the application.
at org.apache.tez.client.TezClient.waitTillReady(TezClient.java:779)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:217)
at org.apache.hadoop.hive.ql.exec.tez.TezTask.updateSession(TezTask.java:287)
at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:166)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1745)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1491)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1289)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1156)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1151)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:197)
at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:76)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:253)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1865)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:264)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask (state=08S01,code=1)
09-07-2017
03:29 PM
I just created users under /home on the edge node so they can work on Hadoop, but the users are not able to run Hive/YARN jobs. My cluster is already kerberized. Is there a way to allow users to submit their jobs from the edge node? Do we need to add these users in Hadoop/HDFS? If the cluster has Kerberos and Ranger, is there a different way to do it? Please advise.
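Since YARN containers run as the submitting user, that user must exist (with a consistent UID) on every NodeManager host, not just the edge node, or be resolvable everywhere via SSSD/LDAP. A hedged sketch using a loop; hostnames, the UID, and the hadoop group are placeholders, and it assumes root SSH access:

```shell
#!/bin/bash
# Sketch: create one OS user on every cluster host with the same UID.
HOSTS="node1 node2 node3"      # all NodeManager/worker hosts (placeholder)
NEWUSER=berlin                 # placeholder username
NEWUID=1050                    # keep the UID identical on every host

for h in $HOSTS; do
  ssh root@"$h" "id $NEWUSER >/dev/null 2>&1 || useradd -u $NEWUID -G hadoop $NEWUSER"
done

# Each user also needs an HDFS home directory, created as the hdfs superuser:
sudo -u hdfs hdfs dfs -mkdir -p /user/$NEWUSER
sudo -u hdfs hdfs dfs -chown $NEWUSER:hadoop /user/$NEWUSER
```

On a kerberized cluster each user additionally needs a principal in the KDC (or AD) to authenticate before submitting jobs; Ranger then controls what the authenticated user may access.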
Labels:
- Apache Hadoop
- Apache Hive
09-06-2017
09:14 PM
@Geoffrey Shelton Okot Perfect, thank you.