Member since: 09-10-2016
82 Posts
6 Kudos Received
9 Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 8287 | 08-28-2019 11:07 AM |
11-08-2019 05:58 AM
Can you please assist with this? Thanks.
11-06-2019 02:58 AM
Hi,
We are getting the error below while executing a Hive query with Spark as the execution engine.
Hive version: 1.2.1, Spark version: 1.6
set hive.execution.engine=spark;
set spark.home=/usr/hdp/current/spark-client;
set spark.master=yarn-client;
set spark.eventLog.enabled=true;
set spark.executor.memory=512m;
set spark.executor.cores=2;
set spark.driver.extraClassPath=/usr/hdp/current/hive-client/lib/hive-exec.jar;
Query ID = svchdpir2d_20191106105445_a9ebc8a2-9c28-4a3d-ac5e-0a8609e56fd5
Total jobs = 1
Launching Job 1 out of 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Starting Spark Job = c6cc1641-20ad-4073-ab62-4f621ae595c8
Status: SENT
Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html#release for an explanation.
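For what it's worth, a `Status: SENT` followed by `RPC channel is closed.` often means the remote Spark driver failed to start (or to connect back to HiveServer2/the Hive CLI) before Hive's client handshake timed out, for example when YARN is slow to allocate the ApplicationMaster. As a hedged first step, assuming the YARN application logs show no other failure, the Hive-on-Spark RPC timeouts can be raised from their defaults; these are standard Hive properties:

```
-- Hedged sketch: raise Hive-on-Spark RPC handshake timeouts (values in ms).
set hive.spark.client.connect.timeout=30000;         -- client connect, default 1000
set hive.spark.client.server.connect.timeout=300000; -- server handshake, default 90000
```

Checking `yarn logs -applicationId <appId>` for the failed application usually reveals the real reason the driver died (e.g. classpath or memory problems).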
Could you please help with this?
Thank you
08-28-2019 11:07 AM
Hi @jsensharma,

After changing SPARK_HISTORY_OPTS in Advanced spark2-env as below, the Spark2 History Server UI started working. Do you think this config change is the correct fix for the issue? Please advise. Thanks.

From:
export SPARK_HISTORY_OPTS='-Dspark.ui.filters=org.apache.hadoop.security.authentication.server.AuthenticationFilter -Dspark.org.apache.hadoop.security.authentication.server.AuthenticationFilter.params="type=kerberos,kerberos.principal={{spnego_principal}},kerberos.keytab={{spnego_keytab}}"'

To:
export SPARK_HISTORY_OPTS='-Dspark.org.apache.hadoop.security.authentication.server.AuthenticationFilter.params="type=kerberos,kerberos.principal={{spnego_principal}},kerberos.keytab={{spnego_keytab}}"'
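One caveat worth checking before accepting this as the fix: removing `-Dspark.ui.filters=...AuthenticationFilter` removes the SPNEGO authentication filter from the UI, so "working" may simply mean the UI is now unauthenticated. A hedged way to confirm, from a host with a valid Kerberos ticket (`historyserver.example.com` below is a placeholder for the actual History Server host):

```
# Command-line sketch, assuming kinit has been run first.
# With the AuthenticationFilter active, the first (unauthenticated) request
# should be rejected (401), while the SPNEGO request should return 200.
curl -s -o /dev/null -w "%{http_code}\n" http://historyserver.example.com:18081/
curl -s -o /dev/null -w "%{http_code}\n" --negotiate -u : http://historyserver.example.com:18081/
```

If both requests now return 200 with the filter removed, the original 403/IllegalArgumentException was masked rather than fixed.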
08-28-2019 06:50 AM
Hi @jsensharma, this is the first time we are deploying Spark2 in our cluster. Thanks.
08-28-2019 06:19 AM
Hi @jsensharma,

Please find the Java version below:

lrwxrwxrwx. 1 root root 28 Jul 31 16:20 latest -> /usr/java/jdk1.8.0_221-amd64

$ ps -ef | grep spark | grep -v grep
/usr/java/jdk1.8.0_221-amd64/bin/java -Dhdp.version=3.1.0.0-78 -cp /usr/hdp/current/spark2-historyserver/conf/:/usr/hdp/current/spark2-historyserver/jars/*:/usr/hdp/3.1.0.0-78/hadoop/conf/ -Dspark.ui.filters=org.apache.hadoop.security.authentication.server.AuthenticationFilter -Dspark.org.apache.hadoop.security.authentication.server.AuthenticationFilter.params=type=kerberos,kerberos.principal=HTTP/xxxx.xxxx.com@CORPxx.xxx.COM,kerberos.keytab=/etc/security/keytabs/spnego.service.keytab -Xmx2048m org.apache.spark.deploy.history.HistoryServer

$ java -version
java version "1.8.0_221"
Java(TM) SE Runtime Environment (build 1.8.0_221-b27)
Java HotSpot(TM) 64-Bit Server VM (build 25.221-b27, mixed mode)

Thanks
08-28-2019 06:07 AM
We have installed Spark2 on HDP 3.1, but when we try to access the Spark2 History Server UI we get the error below:

HTTP ERROR 403
Problem accessing /. Reason:
java.lang.IllegalArgumentException

Log: spark-spark-org.apache.spark.deploy.history.HistoryServer-1-xxxxxx.visa.com.out

19/08/28 13:01:21 DEBUG AuthenticationFilter: Request [http://xxxxxxxx:18081/favicon.ico] triggering authentication. handler: class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler
19/08/28 13:01:21 DEBUG AuthenticationFilter: Authentication exception: java.lang.IllegalArgumentException
org.apache.hadoop.security.authentication.client.AuthenticationException: java.lang.IllegalArgumentException
at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.authenticate(KerberosAuthenticationHandler.java:306)
at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:536)
at org.spark_project.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
at org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
at org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
at org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
at org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
at org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.spark_project.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:448)
at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at org.spark_project.jetty.server.Server.handle(Server.java:539)
at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:333)
at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
at org.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
at org.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:108)
at org.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
at org.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException
at java.nio.Buffer.limit(Buffer.java:275)
at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:365)
at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:358)
at org.apache.hadoop.security.authentication.util.KerberosUtil.getTokenServerName(KerberosUtil.java:291)
at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.authenticate(KerberosAuthenticationHandler.java:285)
	... 22 more

Environment:
HDP 3.1
Spark 2.3
Kerberized cluster

Could you please help with this? Thank you.
07-12-2019 06:38 PM
Hi @Shu, thanks for your response. Is there any way to enable DFS commands while Ranger authorization is enabled? As of now, dfs commands work in the Hive shell but not in Beeline. Thank you.
07-11-2019 02:03 PM
Hi,

In our Kerberized cluster, after enabling the Ranger Hive plug-in on HDP 2.6.5, we are not able to run dfs commands in Beeline:

jdbc:hive2://test.xyz.com:10000> dfs -ls /;
Error: Error while processing statement: Permission denied: user [user1] does not have privilege for [DFS] command (state=,code=1)

Please help with this. Thank you.
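For context, and hedged since the exact behavior depends on the HDP version and authorizer configuration: when the Ranger Hive plug-in (like SQL Standard authorization) is enabled, HiveServer2 intentionally blocks `dfs` commands issued through Beeline, because they would bypass table- and column-level authorization. The usual workaround is to run the equivalent command directly against HDFS from an OS shell, where HDFS permissions and Ranger's HDFS policies apply instead:

```
# Command-line sketch of the workaround: run the filesystem command outside
# HiveServer2, as the end user's own Kerberos principal (hypothetical below).
kinit user1@EXAMPLE.COM
hdfs dfs -ls /
```

The Hive CLI still allows `dfs` because it runs client-side as the local user, which is why the same command behaves differently in the two shells.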
Labels:
- Apache Hadoop
- Apache Hive
06-20-2019 06:55 PM
1 Kudo
Hi,

We are getting the warnings below while starting the Hive shell from the CLI:

$ hive
19/06/20 07:53:14 WARN conf.HiveConf: HiveConf of name hive.mapred.strict does not exist
19/06/20 07:53:14 WARN conf.HiveConf: HiveConf of name hive.mapred.supports.subdirectories does not exist
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender
hive>

Hive version: 1.2

As far as we know, these warnings can be safely ignored, but we would like to get rid of them. Could you please help with this? Thanks.
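The `log4j:WARN No such property [maxFileSize]` message appears because `org.apache.log4j.DailyRollingFileAppender` rolls by date only and has no `maxFileSize` property, yet the log4j configuration is passing one to it. A hedged sketch of a fix, assuming size-based rolling is actually wanted, is to switch the `DRFA` appender in `hive-log4j.properties` to `RollingFileAppender`, which does support `MaxFileSize`:

```
# Hedged hive-log4j.properties sketch: RollingFileAppender supports
# MaxFileSize/MaxBackupIndex, unlike DailyRollingFileAppender.
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}
log4j.appender.DRFA.MaxFileSize=256MB
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n
```

Alternatively, deleting the stray `maxFileSize` line while keeping DailyRollingFileAppender silences the warning without changing rolling behavior. The `HiveConf of name ... does not exist` warnings typically come from obsolete properties still present in hive-site.xml (e.g. pushed by Ambari) and go away once those entries are removed there.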
Labels:
- Apache Ambari
- Apache Hive
06-18-2019 06:43 PM
@Shu: Thank you. Could you please let us know: if we are using a file format other than TextFormat or CSV, how do we handle null for a timestamp field?