08-30-2019
11:17 AM
Thanks for your reply @EricL. The connection between the nodes is fine. I edited hive-site.xml with these parameters and it's working now, but I'm not sure why the timeout was happening:
hive.spark.client.connect.timeout = 360000ms
hive.spark.client.server.connect.timeout = 360000ms
BR
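For reference, this is roughly how the two overrides look in hive-site.xml (a minimal sketch showing only the timeout properties; on a Cloudera Manager managed cluster the same values would normally go into the HiveServer2 safety valve for hive-site.xml rather than editing the file by hand, and HiveServer2 has to be restarted for them to take effect):

<!-- hive-site.xml: Hive-on-Spark client timeouts (sketch, values from my setup) -->
<property>
  <name>hive.spark.client.connect.timeout</name>
  <value>360000ms</value>
</property>
<property>
  <name>hive.spark.client.server.connect.timeout</name>
  <value>360000ms</value>
</property>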
08-27-2019
07:09 AM
My guess is that the timeout settings are not being taken into account, and since this is my test environment the latency can be greater than 1s. I found some warnings that support my guess:
2019-08-27T10:52:10,045 INFO [spark-submit-stderr-redir-05681b44-ae8a-42d9-a80d-20dad05faa98 main] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000
2019-08-27T10:52:10,046 INFO [spark-submit-stderr-redir-05681b44-ae8a-42d9-a80d-20dad05faa98 main] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
2019-08-27T10:52:10,046 INFO [spark-submit-stderr-redir-05681b44-ae8a-42d9-a80d-20dad05faa98 main] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.future.timeout=60000
2019-08-27T10:52:10,046 INFO [spark-submit-stderr-redir-05681b44-ae8a-42d9-a80d-20dad05faa98 main] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
2019-08-27T10:52:10,046 INFO [spark-submit-stderr-redir-05681b44-ae8a-42d9-a80d-20dad05faa98 main] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
2019-08-27T10:52:10,053 INFO [spark-submit-stderr-redir-05681b44-ae8a-42d9-a80d-20dad05faa98 main] client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
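To double-check which values HiveServer2 is actually using, I think they can also be inspected and overridden per session from Beeline; a sketch, assuming these properties are not blocked by hive.conf.restricted.list:

-- show the effective values for the current session
SET hive.spark.client.connect.timeout;
SET hive.spark.client.server.connect.timeout;

-- raise them for this session only (illustrative values)
SET hive.spark.client.connect.timeout=360000ms;
SET hive.spark.client.server.connect.timeout=360000ms;

If the SET output still shows the defaults from the warnings above (1000 and 90000), that would at least confirm the overrides are not being picked up.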
08-26-2019
01:40 PM
I'm studying CDH 6.3.0 with Hive and Spark, and I'm facing a problem that has held me up for a week. I already reinstalled everything from scratch and nothing solved it.
The timeout occurs when I try to select from a table.
Consider this:
DROP TABLE dashboard.top10;
CREATE TABLE dashboard.top10 (id VARCHAR(100), floatVal DOUBLE)
STORED AS ORC TBLPROPERTIES ("orc.compress"="SNAPPY");
INSERT into dashboard.top10 SELECT * from analysis.total_raw order by floatVal DESC limit 10;
Error while processing statement: FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session faf8afcb-0e43-4097-8dcb-44f3f1445005_0: java.util.concurrent.TimeoutException: Client 'faf8afcb-0e43-4097-8dcb-44f3f1445005_0' timed out waiting for connection from the Remote Spark Driver
The container is exiting and here is the full log:
exception: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting to connect to HiveServer2.
    at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41)
    at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:155)
    at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:559)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:673)
Caused by: java.util.concurrent.TimeoutException: Timed out waiting to connect to HiveServer2.
    at org.apache.hive.spark.client.rpc.Rpc$2.run(Rpc.java:120)
    at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
    at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at java.lang.Thread.run(Thread.java:748)
)
19/08/26 17:15:11 ERROR yarn.ApplicationMaster: Uncaught exception: org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:447)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting to connect to HiveServer2.
    at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41)
    at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:155)
    at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:559)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:673)
Caused by: java.util.concurrent.TimeoutException: Timed out waiting to connect to HiveServer2.
    at org.apache.hive.spark.client.rpc.Rpc$2.run(Rpc.java:120)
    at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
    at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at java.lang.Thread.run(Thread.java:748)
19/08/26 17:15:11 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://masternode.vm:8020/user/root/.sparkStaging/application_1566847834444_0003
19/08/26 17:15:16 INFO util.ShutdownHookManager: Shutdown hook called
Labels:
- Apache Hive
- Apache Spark
08-26-2019
07:08 AM
I'm facing the same problem:
INFO : Compiling command(queryId=hive_20190826110448_0f47045d-b2f4-4778-817b-da39d9b65325):
INSERT into dashboard.top10_divida SELECT * from analysis.total_divida_tb
order by total_divida DESC
limit 10
INFO : Semantic Analysis Completed
INFO : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:total_divida_tb.contribuinte, type:varchar(100), comment:null), FieldSchema(name:total_divida_tb.total_divida, type:double, comment:null)], properties:null)
INFO : Completed compiling command(queryId=hive_20190826110448_0f47045d-b2f4-4778-817b-da39d9b65325); Time taken: 1.937 seconds
INFO : Executing command(queryId=hive_20190826110448_0f47045d-b2f4-4778-817b-da39d9b65325):
INSERT into dashboard.top10_divida SELECT * from analysis.total_divida_tb
order by total_divida DESC
limit 10
INFO : Query ID = hive_20190826110448_0f47045d-b2f4-4778-817b-da39d9b65325
INFO : Total jobs = 3
INFO : Launching Job 1 out of 3
INFO : Starting task [Stage-1:MAPRED] in serial mode
ERROR : FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session 1b4603c3-f2ee-42db-8248-d996710793fc_0: java.lang.RuntimeException: spark-submit process failed with exit code 1 and error ?
INFO : Completed executing command(queryId=hive_20190826110448_0f47045d-b2f4-4778-817b-da39d9b65325); Time taken: 12.323 seconds