Created 02-02-2017 05:59 PM
Just wondering if Spark supports reading *.gz files from an S3 bucket or directory as a DataFrame or Dataset. I think we can read them as an RDD, but it's still not working for me. Any help would be appreciated. Thank you.
I'm using s3n://... but Spark throws an invalid input path exception.
val df = spark.sparkContext.textFile("s3n://..../*.gz")
doesn't work for me 😞
I'd prefer to read the S3 directory of .gz files as a DataFrame or Dataset if possible, otherwise at least as an RDD. Thank you.
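To be concrete, something along these lines is what I'm hoping to end up with. This is only a sketch; the bucket and prefix are placeholders, not my real path:

// Sketch of the DataFrame/Dataset read I'm after; Spark is supposed to
// decompress .gz text files transparently based on the file extension.
val df = spark.read.text("s3n://my-bucket/my-prefix/*.gz")      // DataFrame with a single "value" column
val ds = spark.read.textFile("s3n://my-bucket/my-prefix/*.gz")  // Dataset[String]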
Created 02-04-2017 05:38 PM
I believe you need to escape the wildcard: val df = spark.sparkContext.textFile("s3n://..../\*.gz").
Additionally, the S3N filesystem client, while widely used, is no longer undergoing active maintenance except for emergency security issues. The S3A filesystem client can read all files created by S3N. Accordingly it should be used wherever possible.
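For example, with s3a configured, the same read can go straight to a DataFrame or Dataset rather than an RDD. This is only a sketch, and the bucket and prefix below are placeholders:

// Sketch only: read the gzipped files through the s3a client instead of s3n.
// Spark decompresses .gz files transparently based on the extension.
val rdd = spark.sparkContext.textFile("s3a://your-bucket/logs/*.gz")  // RDD[String]
val ds  = spark.read.textFile("s3a://your-bucket/logs/*.gz")          // Dataset[String]
val df  = ds.toDF("line")                                             // DataFrame with one column named "line"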
Please see: https://github.com/apache/hadoop/blob/trunk/hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-a... for the s3a classpath dependencies and authentication properties you need to be aware of.
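As a rough sketch of the authentication side (the property names are the standard s3a ones, the values are placeholders, and in practice they are usually supplied via core-site.xml or spark-defaults.conf rather than in code):

// Sketch only: set s3a credentials on the Hadoop configuration before reading.
val hadoopConf = spark.sparkContext.hadoopConfiguration
hadoopConf.set("fs.s3a.access.key", "YOUR_ACCESS_KEY")
hadoopConf.set("fs.s3a.secret.key", "YOUR_SECRET_KEY")
// Optionally point at a specific region endpoint:
// hadoopConf.set("fs.s3a.endpoint", "s3.us-west-2.amazonaws.com")
val df = spark.read.text("s3a://your-bucket/your-prefix/*.gz")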
A nice tutorial on this subject can be found here: https://community.hortonworks.com/articles/36339/spark-s3a-filesystem-client-from-hdp-to-access-s3.h...
Created 02-13-2017 07:21 PM
There's also the documentation here: https://hortonworks.github.io/hdp-aws/s3-spark/