Member since: 04-15-2017 | Posts: 9 | Kudos Received: 0 | Solutions: 0
06-05-2018 06:08 AM
For example, the default container size is 8 GB, but one of the queries requires 10 GB. Is there a command that will automatically determine the memory required by the job, so that we can set that value instead of increasing hive.tez.container.size?
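For context, a per-session override is roughly what such a setting would look like; the 10240 MB value below is only an illustration matching the 10 GB example in the question (Hive expects the size in MB), not a recommendation:

-- Illustrative only: override the Tez container size for the current Hive session.
-- 10240 MB = 10 GB, matching the example above; verify against the job's actual needs.
SET hive.tez.container.size=10240;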
03-25-2018 07:16 AM
@Venkata Sudheer Kumar M I fixed the same error by setting the property below in the Spark interpreter configuration. It is a known issue -- ZEPPELIN-1263.
Description of problem: spark.driver.memory does not take effect; the driver memory is always 1 GB.
Workaround: to change the driver memory, specify it in the SPARK_DRIVER_MEMORY property on the interpreter settings page for your Spark interpreter.
Refer to https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_release-notes/content/known_issues.html (BUG 78035). Thanks, Sajid.
10-15-2017 12:43 PM
2017-10-15 07:39:28,658 ERROR [main] master.TableLockManager: Unexpected ZooKeeper error when listing children
org.apache.zookeeper.KeeperException$NoAuthException: KeeperErrorCode = NoAuth for /hbase-secure/table-lock
at org.apache.zookeeper.KeeperException.create(KeeperException.java:113)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
at org.apache.zookeeper.ZooKeeper.getChildren(ZooKeeper.java:1472)
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.getChildren(RecoverableZooKeeper.java:295)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.listChildrenNoWatch(ZKUtil.java:512)
at org.apache.hadoop.hbase.master.TableLockManager$ZKTableLockManager.getTableNames(TableLockManager.java:392)
at org.apache.hadoop.hbase.master.TableLockManager$ZKTableLockManager.visitAllLocks(TableLockManager.java:379)
at org.apache.hadoop.hbase.util.hbck.TableLockChecker.checkTableLocks(TableLockChecker.java:78)
at org.apache.hadoop.hbase.util.HBaseFsck.checkAndFixTableLocks(HBaseFsck.java:3345)
at org.apache.hadoop.hbase.util.HBaseFsck.onlineHbck(HBaseFsck.java:779)
at org.apache.hadoop.hbase.util.HBaseFsck.exec(HBaseFsck.java:4857)
at org.apache.hadoop.hbase.util.HBaseFsck$HBaseFsckTool.run(HBaseFsck.java:4657)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.hadoop.hbase.util.HBaseFsck.main(HBaseFsck.java:4645)
2017-10-15 07:39:28,986 INFO [main] util.HBaseFsck: Finishing hbck
2017-10-15 07:39:28,999 INFO [main] zookeeper.ZooKeeper: Session: 0x25eb3954486f304 closed
2017-10-15 07:39:28,999 INFO [main] client.ConnectionManager$HConnectionImplementation: Closing master protocol: MasterService
2017-10-15 07:39:28,999 INFO [main-EventThread] zookeeper.ClientCnxn: EventThread shut down
2017-10-15 07:39:28,999 INFO [main] client.ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x35e72340bdb1308
2017-10-15 07:39:29,012 INFO [main] zookeeper.ZooKeeper: Session: 0x35e72340bdb1308 closed
2017-10-15 07:39:29,012 INFO [main-EventThread] zookeeper.ClientCnxn: EventThread shut down
Exception in thread "main" java.io.IOException: Unexpected ZooKeeper exception
at org.apache.hadoop.hbase.master.TableLockManager$ZKTableLockManager.getTableNames(TableLockManager.java:395)
at org.apache.hadoop.hbase.master.TableLockManager$ZKTableLockManager.visitAllLocks(TableLockManager.java:379)
at org.apache.hadoop.hbase.util.hbck.TableLockChecker.checkTableLocks(TableLockChecker.java:78)
at org.apache.hadoop.hbase.util.HBaseFsck.checkAndFixTableLocks(HBaseFsck.java:3345)
at org.apache.hadoop.hbase.util.HBaseFsck.onlineHbck(HBaseFsck.java:779)
at org.apache.hadoop.hbase.util.HBaseFsck.exec(HBaseFsck.java:4857)
at org.apache.hadoop.hbase.util.HBaseFsck$HBaseFsckTool.run(HBaseFsck.java:4657)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.hadoop.hbase.util.HBaseFsck.main(HBaseFsck.java:4645)
Caused by: org.apache.zookeeper.KeeperException$NoAuthException: KeeperErrorCode = NoAuth for /hbase-secure/table-lock
at org.apache.zookeeper.KeeperException.create(KeeperException.java:113)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
at org.apache.zookeeper.ZooKeeper.getChildren(ZooKeeper.java:1472)
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.getChildren(RecoverableZooKeeper.java:295)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.listChildrenNoWatch(ZKUtil.java:512)
at org.apache.hadoop.hbase.master.TableLockManager$ZKTableLockManager.getTableNames(TableLockManager.java:392)
... 9 more
05-12-2017 07:09 AM
@Jay SenSharma 1) Ambari version: 2.4.2.0; the output of the command is 2) I did not change anything from the Ambari UI. 3) I am getting the same error on all hosts.
05-12-2017 12:33 AM
Getting the error below while installing the HCat client. Any thoughts?
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hcat_client.py", line 85, in <module>
HCatClient().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hcat_client.py", line 36, in install
self.configure(env)
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hcat_client.py", line 41, in configure
hcat()
File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
return fn(*args, **kwargs)
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hcat.py", line 75, in hcat
content=InlineTemplate(params.hcat_env_sh_template)
File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 148, in __init__
super(InlineTemplate, self).__init__(name, extra_imports, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 135, in __init__
self.template = self.template_env.get_template(self.name)
File "/usr/lib/python2.6/site-packages/ambari_jinja2/environment.py", line 716, in get_template
return self._load_template(name, self.make_globals(globals))
File "/usr/lib/python2.6/site-packages/ambari_jinja2/environment.py", line 690, in _load_template
template = self.loader.load(self, name, globals)
File "/usr/lib/python2.6/site-packages/ambari_jinja2/loaders.py", line 127, in load
code = environment.compile(source, name, filename)
File "/usr/lib/python2.6/site-packages/ambari_jinja2/environment.py", line 492, in compile
self.handle_exception(exc_info, source_hint=source)
File "<unknown>", line 26, in template
ambari_jinja2.exceptions.TemplateSyntaxError: unexpected '}'
05-10-2017 12:39 AM
Hi all, I am following the articles below to connect to Hive on HDP 2.4.2:
https://community.hortonworks.com/articles/1887/connect-oracle-sql-developer-to-hive.html
https://db-blog.web.cern.ch/blog/prasanth-kothuri/2016-02-using-sql-developer-access-apache-hive-kerberos-authentication
I am getting this error:
Status : Failure - Test failed: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: CONN_KERBEROS_AUTHENTICATION_ERROR_GET_TICKETCACHE
I also have one doubt: to connect to HDP, do we really need the Cloudera Hive driver? Please help me resolve this.
05-09-2017 07:17 AM
@mqureshi I just checked on both nodes; all Hive and HBase jars are the same. I still don't see why it failed.
05-09-2017 05:54 AM
@mqureshi I created an internal table in Hive using HBaseSerDe and HBaseStorageHandler, with properties such as hbase.table.name, hbase.mapred.output.outputtable, and hbase.columns.mapping. I have the correct jar files for the HBase and Hive versions, and I am using HDP 2.4.2.60-1. The same query works on one edge node; on the other edge node it gives the error above.
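For reference, a minimal sketch of the kind of HBase-backed Hive table described above, assuming a made-up table name and column family; the column mapping and table properties must match the real HBase table:

-- Hypothetical example only: Hive table backed by HBase via HBaseStorageHandler.
-- Table name, columns, and column family 'cf' are illustrative, not from the original post.
CREATE TABLE hbase_backed_table (
  rowkey STRING,
  col1   STRING,
  col2   INT
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  'hbase.columns.mapping' = ':key,cf:col1,cf:col2'
)
TBLPROPERTIES (
  'hbase.table.name' = 'hbase_backed_table',
  'hbase.mapred.output.outputtable' = 'hbase_backed_table'
);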
05-09-2017 03:27 AM
When executing the query select * from table_name limit 10; I get the exception below.
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.security.token.TokenUtil.addTokenForJob(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/security/User;Lorg/apache/hadoop/mapreduce/Job;)V
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.addHBaseDelegationToken(HBaseStorageHandler.java:482)
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.configureTableJobProperties(HBaseStorageHandler.java:427)
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.configureInputJobProperties(HBaseStorageHandler.java:328)
at org.apache.hadoop.hive.ql.plan.PlanUtils.configureJobPropertiesForStorageHandler(PlanUtils.java:817)
at org.apache.hadoop.hive.ql.plan.PlanUtils.configureInputJobPropertiesForStorageHandler(PlanUtils.java:787)
at org.apache.hadoop.hive.ql.optimizer.SimpleFetchOptimizer$FetchData.convertToWork(SimpleFetchOptimizer.java:385)
at org.apache.hadoop.hive.ql.optimizer.SimpleFetchOptimizer$FetchData.access$000(SimpleFetchOptimizer.java:323)
at org.apache.hadoop.hive.ql.optimizer.SimpleFetchOptimizer.optimize(SimpleFetchOptimizer.java:134)
at org.apache.hadoop.hive.ql.optimizer.SimpleFetchOptimizer.transform(SimpleFetchOptimizer.java:105)
at org.apache.hadoop.hive.ql.optimizer.Optimizer.optimize(Optimizer.java:205)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10198)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:211)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:227)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:459)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:316)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1189)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1237)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1126)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1116)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:739)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Need help to solve this.
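Since the same query works on one edge node but fails on the other, one way to compare what each Hive CLI session actually has on its classpath is to list the session jars and, as a hedged workaround, explicitly add the HBase client jar that matches the cluster's HBase version. The path below is an assumption based on a typical HDP layout and must be verified on the host:

-- Run on both edge nodes and compare the output.
LIST JARS;
-- Assumed HDP-style path, for illustration only; confirm the jar location and version first.
ADD JAR /usr/hdp/current/hbase-client/lib/hbase-client.jar;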