Member since: 01-30-2017
Posts: 49
Kudos Received: 3
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 3489 | 02-23-2017 07:54 AM |
04-15-2017 03:42 AM
Thank you for your response. Is there any way to repair this file? Downloading this 11 GB file again is a painful task.
04-14-2017 11:55 AM
I am trying to run HDP 2.5 in VMware Player by selecting the "Open a Virtual Machine" button, but every time it throws an error: "Failed to open virtual machine: SHA1 digest of file HDP_2.5_docker_vmware_25_10_2016_08_59_25_hdp_2_5_0_0_1245_ambari_2_4_0_0_1225-disk1.vmdk does not match manifest." Could you please tell me how to run it successfully?
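A digest mismatch normally means the downloaded file is corrupt, and you can confirm that by recomputing the SHA-1 of the .vmdk and comparing it to the value recorded in the appliance's .mf manifest. A minimal, self-contained sketch of that check (the small file created here is a stand-in for the real 11 GB .vmdk, and in practice the expected digest would be copied from the manifest rather than recomputed):

```shell
# Stand-in for the downloaded .vmdk; substitute the real file path.
FILE=sample.bin
printf 'example payload' > "$FILE"

# In practice, EXPECTED is the SHA-1 listed in the .mf manifest file.
EXPECTED=$(sha1sum "$FILE" | awk '{print $1}')

# Recompute the digest of the file on disk and compare.
ACTUAL=$(sha1sum "$FILE" | awk '{print $1}')
if [ "$ACTUAL" = "$EXPECTED" ]; then
  echo "digest OK"
else
  echo "digest mismatch: the download is corrupt"
fi
```

If the digests differ, the file cannot be repaired from the digest alone; a download tool that supports resuming partial transfers can at least avoid re-fetching the whole 11 GB.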
Labels:
- Hortonworks Data Platform (HDP)
04-07-2017 09:38 AM
I am working on an application that uses a StreamingContext as well as a Spark SQLContext. Now, in the same application, I am adding a HiveContext as well, but it throws: "org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:" Although I have set spark.driver.allowMultipleContexts = true in the SparkConf, it made no difference. Could you please tell me how to proceed?
Labels:
- Apache Hive
- Apache Spark
04-06-2017 12:06 PM
I am running a Spark application in Spring. Now I want to connect to Hive and run a Hive query from the Spring Suite itself. How can I do this? I learned that HiveContext could be used, but I am unsure how to use it.
03-27-2017 08:50 AM
I have installed Spark on Red Hat/CentOS 6. Installed: Java 1.8, spark-2.1.0-bin-hadoop2.7, Scala 2.12. The HADOOP_CONF_DIR environment variable is set to the Hadoop config directory, which contains hdfs-site.xml and core-site.xml. While executing, I get the warning below and I am not able to write to HDFS:
17/03/27 03:48:18 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/03/27 03:48:18 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.4.124.113:4040
17/03/27 03:48:18 INFO SparkContext: Added JAR file:/storm/Teja/spark/target/uber-spark_kafka-0.0.1-SNAPSHOT.jar at spark://10.4.124.113:50101/jars/uber-spark_kafka-0.0.1-SNAPSHOT.jar with timestamp 1490600898913
17/03/27 03:48:20 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
17/03/27 03:48:20 INFO RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
17/03/27 03:48:21 INFO Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8032. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
17/03/27 03:48:22 INFO Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8032. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
17/03/27 03:48:23 INFO Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8032. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime
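The retries against 0.0.0.0:8032 mean the client never found a real ResourceManager address and fell back to the default. Assuming the job is being submitted in YARN mode, the directory that HADOOP_CONF_DIR points to needs a yarn-site.xml naming the ResourceManager host, alongside core-site.xml and hdfs-site.xml. A sketch, where rm-host.example.com is a placeholder for the actual ResourceManager host:

```xml
<!-- yarn-site.xml: place in the HADOOP_CONF_DIR directory -->
<configuration>
  <property>
    <!-- Placeholder hostname: substitute your real ResourceManager host -->
    <name>yarn.resourcemanager.hostname</name>
    <value>rm-host.example.com</value>
  </property>
</configuration>
```

If the job was meant to run locally rather than on YARN, the connection attempt instead points to a `--master` mismatch at submit time.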
Labels:
- Apache Spark
03-03-2017 02:54 PM
@Matt Clarke Hi Matt, thank you, I understand it now. I will go through the documentation to learn more about this. Thanks.
03-03-2017 02:36 PM
@Matt Burgess Thank you for your response. Which processors support dynamic properties? Any link would be helpful. Thanks.
03-03-2017 02:11 PM
Please find attached an image of the same.
03-03-2017 02:09 PM
1 Kudo
If I add a property from the UI, it gives an error saying the property is not valid. The UpdateAttribute processor lets us update existing attributes, but I am looking for a way to add a new property to another processor, say 'GetFile', for example a property named 'nickname'. Please find attached an image of the same.
Labels:
- Apache NiFi