Hive webUrl issue
Labels: Apache Hive
Created 08-11-2021 05:37 PM
Hi Community,
I am trying to open the Hive Web UI and am getting the error below:
HTTP ERROR 403
Problem accessing /hiveserver2.jsp. Reason:
java.lang.IllegalArgumentException
The Web UI is configured in the system with all the required parameters. Please advise.
Thanks
Amit
Created 08-12-2021 01:46 AM
Hi @amitshanker
Could you let me know whether you are using CDH, HDP, or CDP?
Which document did you follow to enable the HS2 UI?
Can you share a screenshot of the error you are facing?
Do you have Kerberos enabled in your cluster?
Created 08-12-2021 04:49 PM
It is CDH, and I am following this Cloudera document: https://docs.cloudera.com/documentation/enterprise/6/6.2/topics/cm_mc_hive_webui.html
Created 08-12-2021 09:37 PM
Thanks for the details.
Can you share the hive-site.xml file?
Can you share a complete screenshot of the error you are facing?
Do you have Kerberos enabled in your cluster?
Created 08-12-2021 10:36 PM
<?xml version="1.0" encoding="UTF-8"?>
<!--Autogenerated by Cloudera Manager-->
<configuration>
<property>
<name>hive.metastore.uris</name>
<value>thrift://xxxxxxxxx:9083</value>
</property>
<property>
<name>hive.metastore.client.socket.timeout</name>
<value>300</value>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/user/hive/warehouse</value>
</property>
<property>
<name>hive.warehouse.subdir.inherit.perms</name>
<value>true</value>
</property>
<property>
<name>spark.master</name>
<value>yarn</value>
</property>
<property>
<name>spark.submit.deployMode</name>
<value>cluster</value>
</property>
<property>
<name>hive.log.explain.output</name>
<value>false</value>
</property>
<property>
<name>hive.auto.convert.join</name>
<value>true</value>
</property>
<property>
<name>hive.auto.convert.join.noconditionaltask.size</name>
<value>20971520</value>
</property>
<property>
<name>hive.optimize.index.filter</name>
<value>true</value>
</property>
<property>
<name>hive.optimize.bucketmapjoin.sortedmerge</name>
<value>false</value>
</property>
<property>
<name>hive.smbjoin.cache.rows</name>
<value>10000</value>
</property>
<property>
<name>hive.server2.logging.operation.enabled</name>
<value>true</value>
</property>
<property>
<name>hive.server2.logging.operation.log.location</name>
<value>/var/log/hive/operation_logs</value>
</property>
<property>
<name>mapred.reduce.tasks</name>
<value>-1</value>
</property>
<property>
<name>hive.exec.reducers.bytes.per.reducer</name>
<value>67108864</value>
</property>
<property>
<name>hive.exec.copyfile.maxsize</name>
<value>33554432</value>
</property>
<property>
<name>hive.exec.reducers.max</name>
<value>1099</value>
</property>
<property>
<name>hive.vectorized.groupby.checkinterval</name>
<value>4096</value>
</property>
<property>
<name>hive.vectorized.groupby.flush.percent</name>
<value>0.1</value>
</property>
<property>
<name>hive.compute.query.using.stats</name>
<value>false</value>
</property>
<property>
<name>hive.vectorized.execution.enabled</name>
<value>true</value>
</property>
<property>
<name>hive.vectorized.execution.reduce.enabled</name>
<value>true</value>
</property>
<property>
<name>hive.vectorized.use.vectorized.input.format</name>
<value>true</value>
</property>
<property>
<name>hive.vectorized.use.checked.expressions</name>
<value>true</value>
</property>
<property>
<name>hive.vectorized.use.vector.serde.deserialize</name>
<value>false</value>
</property>
<property>
<name>hive.vectorized.adaptor.usage.mode</name>
<value>chosen</value>
</property>
<property>
<name>hive.vectorized.input.format.excludes</name>
<value>org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat</value>
</property>
<property>
<name>hive.merge.mapfiles</name>
<value>true</value>
</property>
<property>
<name>hive.merge.mapredfiles</name>
<value>false</value>
</property>
<property>
<name>hive.cbo.enable</name>
<value>false</value>
</property>
<property>
<name>hive.fetch.task.conversion</name>
<value>minimal</value>
</property>
<property>
<name>hive.fetch.task.conversion.threshold</name>
<value>268435456</value>
</property>
<property>
<name>hive.limit.pushdown.memory.usage</name>
<value>0.1</value>
</property>
<property>
<name>hive.merge.sparkfiles</name>
<value>true</value>
</property>
<property>
<name>hive.merge.smallfiles.avgsize</name>
<value>16777216</value>
</property>
<property>
<name>hive.merge.size.per.task</name>
<value>268435456</value>
</property>
<property>
<name>hive.optimize.reducededuplication</name>
<value>true</value>
</property>
<property>
<name>hive.optimize.reducededuplication.min.reducer</name>
<value>4</value>
</property>
<property>
<name>hive.map.aggr</name>
<value>true</value>
</property>
<property>
<name>hive.map.aggr.hash.percentmemory</name>
<value>0.5</value>
</property>
<property>
<name>hive.optimize.sort.dynamic.partition</name>
<value>false</value>
</property>
<property>
<name>hive.execution.engine</name>
<value>mr</value>
</property>
<property>
<name>spark.executor.memory</name>
<value>9442767667b</value>
</property>
<property>
<name>spark.driver.memory</name>
<value>11596411699b</value>
</property>
<property>
<name>spark.executor.cores</name>
<value>4</value>
</property>
<property>
<name>spark.yarn.driver.memoryOverhead</name>
<value>1228m</value>
</property>
<property>
<name>spark.yarn.executor.memoryOverhead</name>
<value>1589m</value>
</property>
<property>
<name>spark.dynamicAllocation.enabled</name>
<value>true</value>
</property>
<property>
<name>spark.dynamicAllocation.initialExecutors</name>
<value>1</value>
</property>
<property>
<name>spark.dynamicAllocation.minExecutors</name>
<value>1</value>
</property>
<property>
<name>spark.dynamicAllocation.maxExecutors</name>
<value>2147483647</value>
</property>
<property>
<name>hive.stats.fetch.column.stats</name>
<value>true</value>
</property>
<property>
<name>hive.mv.files.thread</name>
<value>15</value>
</property>
<property>
<name>hive.blobstore.use.blobstore.as.scratchdir</name>
<value>false</value>
</property>
<property>
<name>hive.load.dynamic.partitions.thread</name>
<value>15</value>
</property>
<property>
<name>hive.exec.input.listing.max.threads</name>
<value>15</value>
</property>
<property>
<name>hive.msck.repair.batch.size</name>
<value>0</value>
</property>
<property>
<name>hive.spark.dynamic.partition.pruning.map.join.only</name>
<value>false</value>
</property>
<property>
<name>hive.metastore.execute.setugi</name>
<value>true</value>
</property>
<property>
<name>hive.support.concurrency</name>
<value>true</value>
</property>
<property>
<name>hive.zookeeper.quorum</name>
<value>xxxxxx.net,xxxxx.net,x.net,xxxxx.net</value>
</property>
<property>
<name>hive.zookeeper.client.port</name>
<value>2181</value>
</property>
<property>
<name>hive.zookeeper.namespace</name>
<value>hive_zookeeper_namespace_Hive</value>
</property>
<property>
<name>hbase.zookeeper.quorum</name>
<value>xxxxxx.net,xxxxx.net,x.net,xxxxx.net </value>
</property>
<property>
<name>hbase.zookeeper.property.clientPort</name>
<value>2181</value>
</property>
<property>
<name>hive.cluster.delegation.token.store.class</name>
<value>org.apache.hadoop.hive.thrift.MemoryTokenStore</value>
</property>
<property>
<name>hive.metastore.fshandler.threads</name>
<value>15</value>
</property>
<property>
<name>hive.driver.parallel.compilation</name>
<value>false</value>
</property>
<property>
<name>hive.driver.parallel.compilation.global.limit</name>
<value>3</value>
</property>
<property>
<name>hive.server2.thrift.min.worker.threads</name>
<value>5</value>
</property>
<property>
<name>hive.server2.thrift.max.worker.threads</name>
<value>100</value>
</property>
<property>
<name>hive.server2.thrift.port</name>
<value>10000</value>
</property>
<property>
<name>hive.server2.enable.doAs</name>
<value>false</value>
</property>
<property>
<name>yarn.scheduler.fair.allocation.file</name>
<value>{{CMF_CONF_DIR}}/fair-scheduler.xml</value>
</property>
<property>
<name>hive.server2.session.check.interval</name>
<value>900000</value>
</property>
<property>
<name>hive.server2.idle.session.timeout</name>
<value>43200000</value>
</property>
<property>
<name>hive.server2.idle.session.check.operation</name>
<value>true</value>
</property>
<property>
<name>hive.server2.idle.operation.timeout</name>
<value>21600000</value>
</property>
<property>
<name>hive.server2.webui.host</name>
<value>0.0.0.0</value>
</property>
<property>
<name>hive.server2.webui.port</name>
<value>10002</value>
</property>
<property>
<name>hive.server2.webui.max.threads</name>
<value>50</value>
</property>
<property>
<name>hive.server2.webui.use.ssl</name>
<value>true</value>
</property>
<property>
<name>hive.server2.webui.keystore.path</name>
<value>/opt/cloudera/security/jks/keystore.jks</value>
</property>
<property>
<name>hive.server2.webui.keystore.password</name>
<value>********</value>
</property>
<property>
<name>hive.aux.jars.path</name>
<value>{{HIVE_HBASE_JAR}}</value>
</property>
<property>
<name>hive.metastore.sasl.enabled</name>
<value>true</value>
</property>
<property>
<name>hive.server2.authentication</name>
<value>kerberos</value>
</property>
<property>
<name>hive.metastore.kerberos.principal</name>
<value>hive/_HOST@xxxxxx.net</value>
</property>
<property>
<name>hive.server2.authentication.kerberos.principal</name>
<value>hive/_HOST@xxxxxx.net</value>
</property>
<property>
<name>hive.server2.authentication.kerberos.keytab</name>
<value>hive.keytab</value>
</property>
<property>
<name>hive.server2.webui.use.spnego</name>
<value>true</value>
</property>
<property>
<name>hive.server2.webui.spnego.keytab</name>
<value>hive.keytab</value>
</property>
<property>
<name>hive.server2.webui.spnego.principal</name>
<value>HTTP/xxxxxx.net@xxxxxx.net</value>
</property>
<property>
<name>hive.server2.use.SSL</name>
<value>true</value>
</property>
<property>
<name>hive.server2.keystore.path</name>
<value>/opt/cloudera/security/jks/keystore.jks</value>
</property>
<property>
<name>hive.server2.keystore.password</name>
<value>********</value>
</property>
<property>
<name>cloudera.navigator.client.config</name>
<value>{{CMF_CONF_DIR}}/navigator.client.properties</value>
</property>
<!--'hive.metastore.event.listeners', originally set to 'com.cloudera.navigator.audit.hive.HiveMetaStoreEventListener' (non-final), is overridden below by a safety valve-->
<property>
<name>hive.server2.session.hook</name>
<value>org.apache.sentry.binding.hive.HiveAuthzBindingSessionHook</value>
</property>
<property>
<name>hive.sentry.conf.url</name>
<value>file:///{{CMF_CONF_DIR}}/sentry-site.xml</value>
</property>
<property>
<name>hive.metastore.filter.hook</name>
<value>org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook</value>
</property>
<property>
<name>hive.stats.collect.scancols</name>
<value>true</value>
</property>
<property>
<name>hive.exec.post.hooks</name>
<value>com.cloudera.navigator.audit.hive.HiveExecHookContext,org.apache.hadoop.hive.ql.hooks.LineageLogger</value>
</property>
<property>
<name>hive.security.authorization.task.factory</name>
<value>org.apache.sentry.binding.hive.SentryHiveAuthorizationTaskFactoryImpl</value>
</property>
<property>
<name>spark.shuffle.service.enabled</name>
<value>true</value>
</property>
<property>
<name>hive.query.redaction.rules</name>
<value>{{CMF_CONF_DIR}}/redaction-rules.json</value>
</property>
<property>
<name>hive.exec.query.redactor.hooks</name>
<value>org.cloudera.hadoop.hive.ql.hooks.QueryRedactor</value>
</property>
<property>
<name>hive.security.authorization.enabled</name>
<value>true</value>
</property>
<property>
<name>hive.security.authorization.manager</name>
<value>org.apache.sentry.binding.hive.authz.SentryHiveAuthorizerFactory</value>
</property>
<property>
<name>hive.security.authenticator.manager</name>
<value>org.apache.hadoop.hive.ql.security.SessionStateUserAuthenticator</value>
</property>
<property>
<name>hive.service.metrics.file.location</name>
<value>/var/log/hive/metrics-hiveserver2/metrics.log</value>
</property>
<property>
<name>hive.server2.metrics.enabled</name>
<value>true</value>
</property>
<property>
<name>hive.strict.checks.orderby.no.limit</name>
<value>false</value>
</property>
<property>
<name>hive.strict.checks.no.partition.filter</name>
<value>false</value>
</property>
<property>
<name>hive.strict.checks.type.safety</name>
<value>true</value>
</property>
<property>
<name>hive.strict.checks.cartesian.product</name>
<value>false</value>
</property>
<property>
<name>hive.strict.checks.bucketing</name>
<value>true</value>
</property>
<property>
<name>hive.service.metrics.file.frequency</name>
<value>30000</value>
</property>
<property>
<name>hive.lock.query.string.max.length</name>
<value>10000</value>
</property>
<property>
<name>hive.metastore.connect.retries</name>
<value>10</value>
</property>
<property>
<name>hive.metastore.event.listeners</name>
<value></value>
</property>
<property>
<name>hive.server.thrift.socket.timeout</name>
<value>1000</value>
</property>
<property>
<name>hive.client.thrift.socket.timeout</name>
<value>1000</value>
</property>
<property>
<name>hadoop.security.credential.provider.path</name>
<value>localjceks://file/{{CMF_CONF_DIR}}/creds.localjceks</value>
</property>
</configuration>
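Given the size of the file above, it can help to pull out just the `hive.server2.webui.*` properties before troubleshooting. A minimal sketch of how to do that with Python's standard-library XML parser (the inline sample below is a hypothetical excerpt of a hive-site.xml, not the full file):

```python
# Extract HiveServer2 Web UI settings from a Hadoop-style configuration file.
# SAMPLE is a hypothetical excerpt; in practice, read the real hive-site.xml.
import xml.etree.ElementTree as ET

SAMPLE = """<?xml version="1.0"?>
<configuration>
  <property><name>hive.server2.webui.port</name><value>10002</value></property>
  <property><name>hive.server2.webui.use.ssl</name><value>true</value></property>
  <property><name>hive.server2.webui.use.spnego</name><value>true</value></property>
</configuration>"""

def load_props(xml_text):
    """Return {name: value} for every <property> element."""
    root = ET.fromstring(xml_text)
    return {p.findtext("name"): p.findtext("value") for p in root.iter("property")}

props = load_props(SAMPLE)
webui = {k: v for k, v in props.items() if k.startswith("hive.server2.webui")}
for name, value in sorted(webui.items()):
    print(f"{name} = {value}")
```

To run it against the actual file, replace `SAMPLE` with the contents of hive-site.xml read from disk.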
Created 08-12-2021 10:38 PM
Created 08-13-2021 04:50 AM
Hi @amitshanker
Thanks for the update.
I can see that you have set hive.server2.webui.use.spnego to true, which means Kerberos is enabled in your cluster.
If SPNEGO and Kerberos are enabled, then a few settings need to be changed in the browser. Could you please follow the below link.
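For reference, if the browser is Firefox (an assumption; other browsers use different mechanisms), the usual change is made in about:config, adding the cluster's domain to the Negotiate whitelist. A sketch, using the masked domain from this thread as a placeholder:

```
network.negotiate-auth.trusted-uris = .xxxxxx.net
network.negotiate-auth.delegation-uris = .xxxxxx.net
```

A valid Kerberos ticket (obtained via kinit) is also needed on the machine running the browser before the SPNEGO-protected Web UI will load.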
