Member since: 04-12-2019
Posts: 105
Kudos Received: 3
Solutions: 7

My Accepted Solutions
Views | Posted
---|---
3667 | 05-28-2019 07:41 AM
2224 | 05-28-2019 06:49 AM
1825 | 12-20-2018 10:54 AM
1292 | 06-27-2018 09:05 AM
7004 | 06-27-2018 09:02 AM
01-03-2019
05:34 AM
@Geoffrey Shelton Okot No luck. Pre-emption is already enabled via the YARN config and all the other prerequisites are completed. The Hive interactive query service is running fine. Still, the output only shows the default database:

19/01/03 05:16:45 INFO RetryingMetaStoreClient: RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=vinay@TEST.COM (auth:KERBEROS) retries=1 delay=5 lifetime=0
19/01/03 05:16:47 INFO CodeGenerator: Code generated in 294.781928 ms
19/01/03 05:16:47 INFO CodeGenerator: Code generated in 18.011739 ms
+------------+
|databaseName|
+------------+
|     default|
+------------+
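For reference, this is roughly how I understand the Hive Warehouse Connector is supposed to be used from pyspark on HDP 3.x, since the Spark catalog and the Hive catalog are separate there. It is only a sketch: the pyspark_llap module, HiveWarehouseSession.session(...).build() and showDatabases() are what I take from the HWC documentation, and the session has to be launched with the connector jar and python zip on the path.

# Sketch only: assumes pyspark was started with the HWC assembly jar (--jars)
# and the pyspark_hwc zip (--py-files), and that the spark.sql.hive.hiveserver2.jdbc.url
# and related properties from this thread are already set in the Spark config.
from pyspark.sql import SparkSession
from pyspark_llap import HiveWarehouseSession

spark = SparkSession.builder.appName('hwc-check').enableHiveSupport().getOrCreate()

# Build an HWC session; queries then go through HiveServer2/LLAP
# instead of Spark's own catalog.
hive = HiveWarehouseSession.session(spark).build()
hive.showDatabases().show()

spark.stop()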
01-02-2019
02:24 PM
@Geoffrey Shelton Okot Ohh, I did not enable pre-emption via the YARN config; that is the only point still pending. The rest I have completed. Let me enable YARN pre-emption and check. I will update you once it is done.
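A minimal sketch of what I believe enabling pre-emption involves (property names taken from the YARN capacity-scheduler documentation, not yet verified on this cluster):

yarn.resourcemanager.scheduler.monitor.enable=true
yarn.resourcemanager.scheduler.monitor.policies=org.apache.hadoop.yarn.server.resourcemanager.monitor.capacity.ProportionalCapacityPreemptionPolicy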
01-02-2019
12:54 PM
Hi Subhash,
Below is the code:

from pyspark import SparkConf
from pyspark.sql import SparkSession, HiveContext
from pyspark.sql import functions as fn
from pyspark.sql.functions import rank,sum,col
from pyspark.sql import Window
sparkSession = (SparkSession
    .builder
    .master("local")
    .appName('sprk-job')
    .enableHiveSupport()
    .getOrCreate())
sparkSession.sql("show databases").show()
sparkSession.stop()
I'm also trying the same from spark-shell.
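A sketch of how I think the pyspark session should be launched so that the connector is actually on the path (the pyspark_hwc zip name is my assumption, based on the assembly jar version mentioned later in this thread; it may differ on the node):

pyspark --master yarn \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar \
  --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-1.0.0.3.0.0.0-1634.zip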
01-02-2019
12:09 PM
Hi Subhash, I have already granted the spark user access to all databases via Ranger, as well as to all the HDFS storage paths.
01-02-2019
08:37 AM
@Geoffrey Shelton Okot Could you please confirm whether we really need to enable Interactive Query? After enabling Interactive Query, I am unable to start the interactive query service. Below are the logs:

2019-01-02T08:36:41,455 WARN [main] cli.LlapStatusServiceDriver: Watch mode enabled and got YARN error. Retrying..
2019-01-02T08:36:43,462 WARN [main] cli.LlapStatusServiceDriver: Watch mode enabled and got YARN error. Retrying..
2019-01-02T08:36:45,469 WARN [main] cli.LlapStatusServiceDriver: Watch mode enabled and got YARN error. Retrying..
2019-01-02T08:36:47,476 INFO [main] LlapStatusServiceDriverConsole: LLAP status unknown
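What I plan to check next (my own guess, assuming the LLAP application is registered under the llap0 name used in spark.hadoop.hive.llap.daemon.service.hosts): whether the llap0 application is actually running on YARN, for example with yarn app -status llap0 or from the ResourceManager UI, since the LlapStatusServiceDriver warnings above point at a YARN-side error rather than at Hive itself.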
01-02-2019
05:31 AM
@Geoffrey Shelton Okot The Hive and Spark clients are already installed on the Hive and Spark nodes.
12-31-2018
09:13 AM
Hi, I'm facing the same issue. I have also added the following properties to the Spark configuration, but it is still not working:

spark.sql.hive.hiveserver2.jdbc.url: jdbc:hive2://hive_server_FQDN:10000/
spark.datasource.hive.warehouse.metastoreUri: thrift://hive_server_FQDN:9083
spark.datasource.hive.warehouse.load.staging.dir: /tmp
spark.hadoop.hive.llap.daemon.service.hosts: @llap0
spark.hadoop.hive.zookeeper.quorum: hadoop3.test.com:2181
spark.sql.hive.hiveserver2.jdbc.url.principal: hive/_HOST@TEST.COM
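In case it helps anyone reproduce this, a minimal sketch of how I understand the same properties can also be passed programmatically when building the session (the values are the ones listed above; whether builder-time config is equivalent to setting them in spark-defaults via Ambari is my assumption):

from pyspark.sql import SparkSession

# Sketch only: same HWC-related properties as listed above, set on the builder.
spark = (SparkSession
    .builder
    .appName('hwc-config-check')
    .config("spark.sql.hive.hiveserver2.jdbc.url", "jdbc:hive2://hive_server_FQDN:10000/")
    .config("spark.sql.hive.hiveserver2.jdbc.url.principal", "hive/_HOST@TEST.COM")
    .config("spark.datasource.hive.warehouse.metastoreUri", "thrift://hive_server_FQDN:9083")
    .config("spark.datasource.hive.warehouse.load.staging.dir", "/tmp")
    .config("spark.hadoop.hive.llap.daemon.service.hosts", "@llap0")
    .config("spark.hadoop.hive.zookeeper.quorum", "hadoop3.test.com:2181")
    .enableHiveSupport()
    .getOrCreate())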
12-28-2018
11:18 AM
@Geoffrey Shelton Okot I had manually copied /etc/hive/conf/hive-site.xml to /etc/spark2/conf/ and restarted the Spark service. After the restart, /etc/spark2/conf/hive-site.xml was reverted to the previous hive-site.xml that I had replaced. The latest status is that Spark is still not able to see the Hive databases, even though I have also added the properties below to the Spark configuration:

spark.sql.hive.hiveserver2.jdbc.url.principal
spark.hadoop.hive.zookeeper.quorum
spark.hadoop.hive.llap.daemon.service.hosts
spark.datasource.hive.warehouse.load.staging.dir
spark.datasource.hive.warehouse.metastoreUri
spark.sql.hive.hiveserver2.jdbc.url
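One thing I still want to verify (my own idea, not something from the docs) is whether those properties actually reach the running session after Ambari rewrites the config directory. A quick sketch of how I would check it from pyspark:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('conf-check').enableHiveSupport().getOrCreate()

# Print the HWC-related properties the running session actually sees;
# a missing key here would mean the setting never survived the restart.
conf = spark.sparkContext.getConf()
for key in [
    "spark.sql.hive.hiveserver2.jdbc.url",
    "spark.sql.hive.hiveserver2.jdbc.url.principal",
    "spark.hadoop.hive.llap.daemon.service.hosts",
    "spark.hadoop.hive.zookeeper.quorum",
    "spark.datasource.hive.warehouse.metastoreUri",
    "spark.datasource.hive.warehouse.load.staging.dir",
]:
    print(key, "=", conf.get(key, "NOT SET"))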
12-28-2018
11:12 AM
@subhash parise I have also tried:

spark-shell --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://************************:10000/" spark.datasource.hive.warehouse.load.staging.dir="/tmp" spark.hadoop.hive.zookeeper.quorum="ip.************:2181" --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar

but got the same result: only the default database is visible.
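Looking at that command again, I suspect only the first property was actually picked up, since spark-shell expects one --conf flag per property. A sketch of what I believe the corrected invocation should look like (same masked hosts and jar path as above):

spark-shell --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar \
  --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://************************:10000/" \
  --conf spark.datasource.hive.warehouse.load.staging.dir="/tmp" \
  --conf spark.hadoop.hive.zookeeper.quorum="ip.************:2181"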
12-28-2018
07:18 AM
@Geoffrey Shelton Okot I have already defined spark.sql.hive.hiveserver2.jdbc.url.principal=hive/_HOST@TEST.COM in the configuration.