Member since: 07-22-2016
Posts: 15
Kudos Received: 2
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1824 | 01-12-2021 03:21 AM
 | 1423 | 10-06-2016 12:25 PM
 | 1312 | 09-29-2016 07:58 PM
01-12-2021 03:21 AM
After an initial read on Spark Structured Streaming: for a custom source, a class named DefaultSource must be present in the package named in the format of readStream.
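As an illustration, a minimal sketch of how that lookup works; the package name com.example.solace is a hypothetical placeholder for wherever the connector's DefaultSource class actually lives:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class CustomSourceResolution {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("demo").getOrCreate();
        // Spark resolves format("com.example.solace") to the class
        // com.example.solace.DefaultSource on the classpath; a bare name like
        // "solacestream" is treated the same way, which is why the question
        // below reports solacestream.DefaultSource in the ClassNotFoundException.
        Dataset<Row> df = spark.readStream()
                .format("com.example.solace")
                .load();
        df.printSchema();
    }
}

Connectors can also register a short name by implementing org.apache.spark.sql.sources.DataSourceRegister and listing the class in META-INF/services, in which case format() takes that short name instead of the package.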
01-09-2021 09:24 AM
Hi All, I am new to Solace and Spark Structured Streaming. I would like to write a Solace consumer using Spark Structured Streaming, and with the code below

var df = spark.readStream.format("solacestream").load

I am getting

java.lang.ClassNotFoundException: solacestream.DefaultSource

Any hint on the format for Solace JMS and how to pass the connection parameters as options to Spark would be helpful. Thanks
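On the options part of the question, connection parameters are passed as key/value pairs on the reader; a minimal sketch in Java, where every option key shown is a hypothetical placeholder rather than a documented Solace connector API:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SolaceReadSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("solace-demo").getOrCreate();
        Dataset<Row> df = spark.readStream()
                .format("com.example.solace")           // hypothetical package containing DefaultSource
                .option("host", "tcp://broker:55555")   // hypothetical option keys; check the
                .option("vpn", "default")               // connector's documentation for the real ones
                .option("username", "user")
                .load();
        df.printSchema();
    }
}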
Labels:
- Apache Spark
12-10-2019 03:45 AM
@jsensharma All the configs are set properly in our cluster. I am trying to access external Hive tables using HiveWarehouseSession. Could that be the cause of the error, since the documentation says HiveWarehouseSession is not needed for external tables?
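As a side note on the error itself, the property named in the stack trace has to be visible to the Spark session; a minimal sketch of setting it explicitly at session build time, where the JDBC URL is a hypothetical placeholder for the value in the cluster's Spark configuration:

import org.apache.spark.sql.SparkSession;

public class SessionWithHwcConf {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("Hdp3 Migration")
                .master("yarn")
                // Hypothetical URL; substitute the HiveServer2 JDBC URL from your cluster.
                .config("spark.sql.hive.hiveserver2.jdbc.url", "jdbc:hive2://hs2-host:10000/")
                .getOrCreate();
        // HiveWarehouseBuilder reads this key at build() time, which is where the
        // NoSuchElementException is thrown when the key is absent.
        System.out.println(spark.conf().get("spark.sql.hive.hiveserver2.jdbc.url"));
    }
}

On HDP the same key is usually supplied cluster-wide via Ambari or per job with spark-submit --conf, rather than hard-coded.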
12-10-2019 12:19 AM
@Shelton I have tried as you suggested and I am still getting the same error.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import com.hortonworks.hwc.HiveWarehouseSession;
import com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder;

public class Hdp3MigrationMain extends CommonUtilities {
    public static void main(String[] args) {
        String hdp3Enabled = args[0];
        Dataset<Row> dataset;
        String query = "select * from hive_schema.table1";
        try {
            if ("Y".equalsIgnoreCase(hdp3Enabled)) {
                // sparkSession is presumably provided by CommonUtilities
                HiveWarehouseSession hive = HiveWarehouseBuilder.session(sparkSession).build();
                dataset = hive.executeQuery(query);
            } else {
                dataset = sparkSession.sql(query);
            }
            dataset.show();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

And the same error occurs:

java.util.NoSuchElementException: spark.sql.hive.hiveserver2.jdbc.url
at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:1571)
at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:1571)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.internal.SQLConf.getConfString(SQLConf.scala:1571)
at org.apache.spark.sql.RuntimeConfig.get(RuntimeConfig.scala:74)
at com.hortonworks.spark.sql.hive.llap.HWConf.getConnectionUrlFromConf(HWConf.java:143)
at com.hortonworks.spark.sql.hive.llap.HWConf.getConnectionUrl(HWConf.java:107)
at com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder.build(HiveWarehouseBuilder.java:97)
at com.wunderman.hdp.Hdp3MigrationMain.main(Hdp3MigrationMain.java:18)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
12-09-2019 04:16 AM
I am migrating Spark jobs running on HDP 2.6 to HDP 3.1. When executing the Spark jobs on HDP 3.1, I am getting the following error:
java.util.NoSuchElementException: spark.sql.hive.hiveserver2.jdbc.url
at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:1571)
at org.apache.spark.sql.internal.SQLConf$$anonfun$getConfString$2.apply(SQLConf.scala:1571)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.internal.SQLConf.getConfString(SQLConf.scala:1571)
at org.apache.spark.sql.RuntimeConfig.get(RuntimeConfig.scala:74)
at com.hortonworks.spark.sql.hive.llap.HWConf.getConnectionUrlFromConf(HWConf.java:143)
at com.hortonworks.spark.sql.hive.llap.HWConf.getConnectionUrl(HWConf.java:107)
at com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder.build(HiveWarehouseBuilder.java:97)
at com.wunderman.hdp.Hdp3MigrationMain.main(Hdp3MigrationMain.java:16)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
This is the way I am creating the Spark session:
private static SparkSession getSparkSession() {
    /**
     * Create an instance of SparkSession to connect to the cluster
     */
    SparkSession sparkSession = SparkSession.builder().appName("Hdp3 Migration").master("yarn").getOrCreate();
    return sparkSession;
}
But the HiveServer2 JDBC URL is configured in the Spark config. I have added the following dependency in the pom.xml:
<dependency>
    <groupId>com.hortonworks.hive</groupId>
    <artifactId>hive-warehouse-connector_2.11</artifactId>
    <version>1.0.0.3.1.0.0-78</version>
</dependency>
And I am trying to execute the code below:
String hdp3Enabled = args[0];
Dataset<Row> dataset;
String query = "SELECT * FROM schema.tablename where col1='abc'"; // Sample query
try {
    if ("Y".equalsIgnoreCase(hdp3Enabled)) {
        HiveWarehouseSession hive = HiveWarehouseSession.session(sparkSession).build();
        dataset = hive.executeQuery(query);
    } else {
        dataset = sparkSession.sql(query);
    }
    dataset.show();
} catch (Exception e) {
    e.printStackTrace();
}
Please share your suggestions to fix this issue.
10-06-2016 12:25 PM
1 Kudo
I found the services were not running and the NameNode had gone into safe mode. I restarted the services and, as mentioned by Lester Martin, executed the script without exec, and it worked fine.
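For reference, the working invocation passes the script path directly, since exec is a command of Pig's Grunt shell rather than a command-line argument:

pig -x tez /home/horton/solution/flightdelays_clean.pig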
10-06-2016 11:32 AM
Thanks @Lester Martin. I have tried executing the script without exec from the sandbox and I am getting the same info:

pig -x tez /home/horton/solution/flightdelays_clean.pig
INFO: Retrying connect to server:namenode/172.0.5.4:8050: Already tried (7) times; retry policy is RetryUptoMaximumCountWithFixedSleep

and it keeps printing.
10-05-2016 06:59 PM
1 Kudo
Hi, I am practicing for the HDPCD certification on the AWS practice exam. As per the task, I have created a Pig script in my sandbox at /home/horton/solution/flightdelays_clean.pig, and when I try to execute the script locally I am getting the following errors:

1. pig -x tez exec /home/horton/solution/flightdelays_clean.pig
Error: File exec not found

2. pig -x tez exec /home/horton/solution/flightdelays_clean.pig
INFO: Retrying connect to server:namenode/172.0.5.4:8050: Already tried (7) times; retry policy is RetryUptoMaximumCountWithFixedSleep
and it keeps printing.

How do I execute the Pig script from the sandbox on the AWS instance?
Labels:
- Apache Pig
- Apache Tez
10-03-2016 08:52 PM
I gave write access to the destination folder in HDFS and it worked. Thanks for the reply.
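For anyone hitting the same error, a minimal sketch of granting that access, assuming a hypothetical destination directory /user/horton:

sudo -u hdfs hdfs dfs -chown horton:horton /user/horton   # run as the hdfs superuser; path and owner are hypothetical
hdfs dfs -put flightdelays.csv /user/horton/              # hypothetical local file; the copy then succeeds as user horton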
10-03-2016 06:32 PM
Hi Team, I am new to the Linux environment and I am trying to practice for my HDPCD exam on an AWS instance. When I try to copy a file from local to HDFS, it throws an access denied exception. If I try with the hdfs sudo user, it asks for the password of the sudo user horton. If I ssh using root@namenode, I am not able to access the local file system. Help me with the steps to transfer a file from local to the HDFS system on the AWS instance.
Labels:
- HDFS