Member since: 04-25-2016
Posts: 579
Kudos Received: 609
Solutions: 111
My Accepted Solutions
Title | Views | Posted
---|---|---
| 2357 | 02-12-2020 03:17 PM
| 1642 | 08-10-2017 09:42 AM
| 11212 | 07-28-2017 03:57 AM
| 2681 | 07-19-2017 02:43 AM
| 1983 | 07-13-2017 11:42 AM
06-08-2016
01:33 PM
3 Kudos
@akeezhadath Spark assumes your file is on HDFS by default if you have not specified a URI scheme (file:///, hdfs://, s3://). So if your file is on HDFS, you can reference it using an absolute path, e.g. sc.textFile("/user/xyz/data.txt")
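The scheme-resolution rule described above can be sketched in plain Python. This is an illustrative simplification, not Spark's actual internals; the function name and the default filesystem URI "hdfs://namenode:8020" are placeholders.

```python
from urllib.parse import urlparse

def resolve_path(path, default_fs="hdfs://namenode:8020"):
    """Sketch of how a path without a scheme falls back to the default
    filesystem (as Hadoop's fs.defaultFS setting does)."""
    # An explicit URI scheme like file://, hdfs:// or s3:// wins if present.
    if urlparse(path).scheme:
        return path
    # Otherwise the path is taken to be on the default filesystem (HDFS).
    return default_fs + path

print(resolve_path("/user/xyz/data.txt"))    # hdfs://namenode:8020/user/xyz/data.txt
print(resolve_path("file:///tmp/data.txt"))  # file:///tmp/data.txt
```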
06-08-2016
09:21 AM
Can you add the following parameters to the Spark conf and see if it works?
spark.executor.extraClassPath /usr/hdp/current/hadoop-client/lib/snappy-java-*.jar
spark.executor.extraLibraryPath /usr/hdp/current/hadoop-client/native
spark.executor.extraJavaOptions -Djava.library.path=/usr/hdp/current/hadoop-client/lib/native/lib
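For reference, the same settings can also be passed at submit time with --conf flags; this is a sketch in which your_app.py is a placeholder for the actual application:

```shell
spark-submit \
  --conf spark.executor.extraClassPath=/usr/hdp/current/hadoop-client/lib/snappy-java-*.jar \
  --conf spark.executor.extraLibraryPath=/usr/hdp/current/hadoop-client/native \
  --conf "spark.executor.extraJavaOptions=-Djava.library.path=/usr/hdp/current/hadoop-client/lib/native/lib" \
  your_app.py
```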
06-08-2016
09:08 AM
3 Kudos
Can you check whether the snappy library is installed on the cluster nodes using the command 'hadoop checknative'?
06-08-2016
07:02 AM
1 Kudo
@vikas reddy Could you please share your script along with the complete logs?
06-08-2016
06:26 AM
@Roberto Sancho It seems that all your queries have been answered; could you please spare some time and accept a best answer in this thread?
06-07-2016
04:22 PM
2 Kudos
Did you try setting dual.commit.enabled to true?
06-07-2016
11:37 AM
Could you please modify your program in this way and see if you still see any exception:
from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext
sc = SparkContext()
sqlContext = HiveContext(sc)
sqlContext.sql("select * from default.aaa limit 3").write.format("orc").save("test_orc2")
06-07-2016
07:38 AM
saveAsHadoopFile applies to RDDs, not to DataFrames. Can you try df.write.format("orc").save("test_orc"), where df is the DataFrame returned by your query?
06-07-2016
07:11 AM
2 Kudos
@alain TSAFACK Please use saveAsHadoopFile, which will write to HDFS:
saveAsHadoopFile(<file-name>,
                 <file output format>,
                 compressionCodecClass="org.apache.hadoop.io.compress.GzipCodec")
or, for a DataFrame: df.write.format("orc").save("test_orc")