Member since 02-21-2017 · 8 Posts · 2 Kudos Received · 0 Solutions
02-07-2018 03:16 PM
You may try to kill all "running" YARN applications so that applications stuck in the "ACCEPTED" state can be scheduled. After doing that, I hit this error:

18/02/07 15:40:18 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, xxxxxx, 55614, None)
18/02/07 15:40:18 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, xxxxxxx, 55614, None)
18/02/07 15:40:20 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Container marked as failed: container_e12_1517985475199_0009_01_000002 on host: hw-host02. Exit status: 1. Diagnostics: Exception from container-launch.
Container id: container_e12_1517985475199_0009_01_000002
Exit code: 1

It keeps looping over all the clients... but all of them fail.
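For reference, the "kill all running applications" step can be scripted with the standard `yarn` CLI. This is a sketch, assuming `yarn` is on the PATH and that application IDs appear in the first column of the listing (the lines beginning with `application_`):

```shell
# List every application currently in the RUNNING state, extract the
# application ID (first column of lines starting with "application_"),
# and issue a kill for each one.
yarn application -list -appStates RUNNING 2>/dev/null \
  | awk '/^application_/ { print $1 }' \
  | while read -r app_id; do
      yarn application -kill "$app_id"
    done
```

Once the RUNNING applications are gone, the ResourceManager should start scheduling the ACCEPTED ones (capacity permitting).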
03-22-2017 04:27 AM
Hi Artem, I'm currently stuck on a particular use case where I'm trying to access Hive table data using spark.read.jdbc, as shown below:

export SPARK_MAJOR_VERSION=2
spark-shell

import org.apache.spark.sql.{DataFrame, Row, SparkSession}

val connectionProperties = new java.util.Properties()
val hiveQuery = "(SELECT * FROM hive_table LIMIT 10) tmp"
val hiveResult = spark.read.jdbc("jdbc:hive2://hiveServerHostname:10000/hiveDBName;user=hive;password=hive", hiveQuery, connectionProperties).collect()

But when I check the results in hiveResult, it is just empty. Could you please suggest what's going on here? I know we can access Hive tables through a HiveSession, and I've tried that successfully, but is it possible to run Hive queries and access Hive data using the method above?
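For comparison, reading the same table through Spark's built-in Hive integration (the metastore, rather than the HiveServer2 JDBC endpoint) does return rows. This is a sketch, assuming a Spark 2.x installation configured against the cluster's Hive metastore, and that `hive_table` exists in `hiveDBName` (both names taken from the snippet above):

```scala
import org.apache.spark.sql.SparkSession

// Build a session with Hive support enabled so that spark.sql()
// resolves tables directly from the Hive metastore, bypassing
// the hive2 JDBC route entirely. (In spark-shell the provided
// `spark` session already has Hive support on HDP.)
val spark = SparkSession.builder()
  .appName("HiveReadSketch")
  .enableHiveSupport()
  .getOrCreate()

// Qualify the table with its database and collect a small sample.
val hiveResult = spark.sql("SELECT * FROM hiveDBName.hive_table LIMIT 10").collect()
hiveResult.foreach(println)
```

The JDBC route in your snippet goes through HiveServer2's thin driver, which Spark's generic JDBC reader does not handle well; the metastore route above is the supported way to query Hive from Spark.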