
Different ways of doing analyses on structured data in Hadoop


We can do analyses on structured data in Hadoop in several different ways:

1) By using Hive queries in Hive
:-> Can we store the output of these queries back to HDFS? Please give an example.


2) By using RDDs in Spark
:-> Can we store the output back to HDFS after applying transformations and actions? Please give an example.


3) By using DataFrames in Spark
:-> Can we store the output of a DataFrame back to HDFS? Please give an example.


4) By using SparkSQL
:-> Can we store the output of SparkSQL back to HDFS? Please give an example.



Expert Contributor

Hello @vivek_rol


Thanks for posting your query!


From your query I can see that you would like to write the outputs of Hive queries, RDDs (after transformations and actions), DataFrames, and SparkSQL queries to HDFS.


Basically, for Spark's RDDs, DataFrames, and SparkSQL you can use built-in functions such as "saveAsTextFile" (for RDDs) and the DataFrame "write" API.
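For the RDD case, here is a minimal sketch in spark-shell; the input/output paths and the word-count logic are just hypothetical placeholders:

```scala
// Read text from HDFS into an RDD (path is hypothetical)
val lines = sc.textFile("hdfs:///tmp/input")

// Apply some transformations, e.g. a simple word count
val counts = lines.flatMap(_.split("\\s+"))
                  .map(word => (word, 1))
                  .reduceByKey(_ + _)

// Write the result back to HDFS as part files under the output directory
counts.saveAsTextFile("hdfs:///tmp/rdd_output")
```

Note that "saveAsTextFile" fails if the output directory already exists, so pick a fresh path each run or delete the old one first.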



<Writing DataFrames into HDFS>


val mydf = spark.read.text("/tmp/input")  // read HDFS text into a DataFrame
mydf.write.text("/tmp/output")            // write it back to an HDFS directory
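As a slightly fuller sketch of writing a DataFrame to HDFS (assuming the Spark 2.x API; file names and paths are hypothetical):

```scala
// Read a CSV file from HDFS into a DataFrame (path is hypothetical)
val df = spark.read.option("header", "true").csv("hdfs:///tmp/input.csv")

// Write it back to HDFS; mode("overwrite") replaces any existing output,
// and the writer also supports other formats such as csv, json, and orc
df.write.mode("overwrite").parquet("hdfs:///tmp/df_output")
```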



Similarly, you can do the same for DataFrames built from SparkSQL query results.
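For the SparkSQL case, a minimal sketch (assuming Spark 2.x; the table name, columns, and output path are hypothetical):

```scala
// Register an existing DataFrame as a temporary view so SQL can query it
df.createOrReplaceTempView("mytable")

// Run a SparkSQL query; the result is itself a DataFrame
val result = spark.sql("SELECT col1, COUNT(*) AS cnt FROM mytable GROUP BY col1")

// Write the query result back to HDFS using the same DataFrame writer API
result.write.mode("overwrite").csv("hdfs:///tmp/sql_output")
```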

Please refer to the Spark documentation URL [ ]


For Hive queries, please refer to the community thread below; once the output is in a file, you can place it on HDFS.


[ ]
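Alternatively, Hive can write query results straight to an HDFS directory with INSERT OVERWRITE DIRECTORY, without going through a local file first. A sketch (table name, columns, and path are hypothetical; the ROW FORMAT clause requires a reasonably recent Hive version):

```sql
-- Writes the query result as delimited files into the HDFS directory
INSERT OVERWRITE DIRECTORY '/tmp/hive_output'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT id, name FROM mytable;
```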


If you are using Hue to run your Hive queries, you have a direct option to export your output to an HDFS path.