Explorer
Posts: 12
Registered: ‎11-10-2018

Different ways of doing analyses on structured data in Hadoop

We can do analyses on structured data in different ways in Hadoop:


1) By using Hive queries in Hive
:-> Can we store the output of these queries back to HDFS? Please give an example.

 

2) By using RDDs in Spark
:-> Can we store the output back to HDFS after applying transformations and actions? Please give an example.

 

3) By using DataFrames in Spark
:-> Can we store the output of a DataFrame back to HDFS? Please give an example.

 

4) By using Spark SQL
:-> Can we store the output of Spark SQL back to HDFS? Please give an example.

 

Posts: 57
Topics: 0
Kudos: 7
Solutions: 5
Registered: ‎05-15-2018

Re: Different ways of doing analyses on structured data in Hadoop

Hello @vivek_rol

 

Thanks for posting your query!

 

From your query I can see that you would like to write the outputs of Hive queries, RDDs (after transformations), DataFrames, and Spark SQL to HDFS.

 

Basically, for Spark's RDDs you can use the built-in "saveAsTextFile" action, and for DataFrames and Spark SQL results you can use the DataFrameWriter ("df.write") API.

 

Example: 

<Writing an RDD to HDFS>

// sc.textFile returns an RDD[String]
val myRdd = sc.textFile("/tmp/input")

myRdd.saveAsTextFile("/tmp/output")

 

Similarly, you can write DataFrames and Spark SQL results to HDFS with the DataFrameWriter API.
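For example, a minimal sketch of writing a DataFrame and a Spark SQL result to HDFS (the input/output paths, view name "mytable", and CSV format are illustrative assumptions):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("WriteToHdfs").getOrCreate()

// DataFrame: read a CSV from HDFS, then write it back as Parquet
val df = spark.read.option("header", "true").csv("/tmp/input.csv")
df.write.mode("overwrite").parquet("/tmp/output_parquet")

// Spark SQL: register a temp view, run a query, and save the result
df.createOrReplaceTempView("mytable")
val result = spark.sql("SELECT * FROM mytable")
result.write.mode("overwrite").csv("/tmp/output_csv")
```

The write mode ("overwrite", "append", "ignore", "errorifexists") controls what happens if the target directory already exists.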

Please refer to the Spark documentation: [http://spark.apache.org/docs/latest/programming-guide.html]

 

For Hive queries, please refer to the community thread below; once the output is in a file, you can place it in HDFS:

 

[ https://community.cloudera.com/t5/Batch-SQL-Apache-Hive/how-to-download-hive-data-into-csv-format/m-... ]
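Alternatively, Hive can write query results directly to an HDFS directory with INSERT OVERWRITE DIRECTORY. A minimal sketch (the table name "mytable" and the output path are illustrative assumptions):

```sql
-- Write the query result as comma-delimited files under the given HDFS path
INSERT OVERWRITE DIRECTORY '/tmp/hive_output'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT * FROM mytable;
```

Note that this overwrites the target directory, so point it at a dedicated output path.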

 

If you are using Hue to run your Hive queries, you have a direct option to export your output to HDFS paths.

 

 

Thanks,
Satz