Member since: 05-16-2016
Posts: 270
Kudos Received: 18
Solutions: 4

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1717 | 07-23-2016 11:36 AM
 | 3056 | 07-23-2016 11:35 AM
 | 1566 | 06-05-2016 10:41 AM
 | 1161 | 06-05-2016 10:37 AM
06-08-2016
08:26 AM
I struggled with this for quite a while and then decided to share the fix. The trick is to delete all the symlinks that point back to the locations where the HDP components reside, since those cause most of the problem. Here is a step-by-step tutorial: http://www.yourtechchick.com/hadoop/how-to-completely-remove-and-uninstall-hdp-components-hadoop-uninstall-on-linux-system/ Hope that helps! A rough sketch of the symlink step follows below.
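To give a sense of that symlink cleanup, here is a minimal R sketch, assuming HDP's default /usr/hdp install prefix and /usr/bin as the link location (both assumptions; the tutorial above covers the full procedure):

```r
# Hypothetical sketch: find symlinks in /usr/bin that point into
# /usr/hdp (HDP's default prefix) so they can be reviewed and removed.
bins    <- list.files("/usr/bin", full.names = TRUE)
targets <- Sys.readlink(bins)                  # "" for regular files
links   <- bins[!is.na(targets) & grepl("^/usr/hdp", targets)]
print(links)                                   # inspect before deleting
# file.remove(links)                           # uncomment to delete them
```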
06-08-2016
08:17 AM
I struggled through the same issue for a pretty long time and then decided to share the fix here: http://www.yourtechchick.com/hadoop/how-to-completely-remove-and-uninstall-hdp-components-hadoop-uninstall-on-linux-system/ Hope that helps!
06-07-2016
09:51 PM
There you go! Here's the solution to your problem. I struggled with this for weeks, trying to uninstall and reinstall over and over, and finally decided to share it here: How to completely remove and uninstall HDP
06-07-2016
05:07 AM
@Divakar Annapureddy: I am using rhdfs. I see no major added advantage to using rhive; it looks much like rhdfs, with all of its functions, though it is a little more polished and offers a bit more functionality than rhdfs.
06-07-2016
04:51 AM
The file that hdfs.write creates in HDFS, when no file type is specified, has no extension at all. So what I actually need to know is: what is the default format hdfs.write writes in, and how do I specify the file type I would like the data stored in? @Constantin Stanca
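For what it's worth, my understanding (an assumption worth checking against the rhdfs documentation) is that hdfs.write serializes any non-raw R object with R's native serialize(), so the default on-disk format is R's binary serialization rather than text. The usual round trip then looks like this:

```r
library(rhdfs)
hdfs.init()

# Assumption: non-raw objects are stored via serialize(), so reading the
# bytes back and calling unserialize() recovers the original object.
out <- hdfs.file("/tmp/model.bin", "w")     # hypothetical HDFS path
hdfs.write(mtcars, out)
hdfs.close(out)

inp <- hdfs.file("/tmp/model.bin", "r")
obj <- unserialize(hdfs.read(inp))
hdfs.close(inp)
```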
06-07-2016
04:49 AM
What I ended up doing is pretty clumsy: I used write.csv to write the file locally and then used hdfs.put to move it into HDFS. The data type of my data is list. How do I convert it to CSV before writing it to HDFS with hdfs.write? @Constantin Stanca Thank you so much for your response; I hope to hear back on this. A cleaned-up version of my workaround is below.
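Here is a tidier sketch of that local-file route, assuming the list's elements are equal-length vectors so as.data.frame() can flatten it, and with a made-up HDFS destination path:

```r
library(rhdfs)
hdfs.init()

mylist <- list(x = 1:3, y = c("a", "b", "c"))   # stand-in for the real list
df     <- as.data.frame(mylist)                 # needs equal-length elements

tmp <- tempfile(fileext = ".csv")
write.csv(df, tmp, row.names = FALSE)           # write CSV locally first
hdfs.put(tmp, "/tmp/mydata.csv")                # then copy it into HDFS
unlink(tmp)                                     # remove the local temp file
```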
06-06-2016
10:35 AM
1. I added a CSV file to HDFS using an R script.
2. I update this CSV with a new CSV / append data to it.
3. I created a table over this CSV in Hive, using Hue.
4. I altered it to be an external table.

Now, when the data changes in the HDFS location, will the data be automatically updated in the Hive table?
Labels: Apache Hive
06-06-2016
06:07 AM
I wrote the following function to write data to HDFS from R, using rhdfs:

```r
writeToHDFS <- function(fileName) {
  hdfs.init()
  modelfile <- hdfs.file(fileName, "w")
  hdfs.write(get(fileName), modelfile)
  hdfs.close(modelfile)
}
```

How do I modify it to store the data in CSV format instead? I have tried using pipe, but since it is deprecated, I would like a way to write CSV files through the hdfs.write functions. I tried this:

```r
modelfile <- hdfs.file(paste(fileName, "csv", sep = "."), "w")
```

but I do not think that creates a valid CSV; it only appends the extension to the file name.
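One approach that should avoid the extension-only problem is to render the CSV text in memory and hand hdfs.write the raw bytes. This is a sketch, untested; it assumes the object named by fileName is a data.frame (or coercible to one) and that hdfs.write passes raw vectors through verbatim:

```r
library(rhdfs)

writeCsvToHDFS <- function(fileName) {
  hdfs.init()
  df  <- as.data.frame(get(fileName))
  # write.csv with no file argument prints the CSV text, which
  # capture.output() collects line by line.
  txt <- paste(capture.output(write.csv(df, row.names = FALSE)),
               collapse = "\n")
  out <- hdfs.file(paste(fileName, "csv", sep = "."), "w")
  hdfs.write(charToRaw(txt), out)   # raw bytes are written as-is
  hdfs.close(out)
}
```

With that, something like `df <- head(mtcars); writeCsvToHDFS("df")` would produce a df.csv in HDFS whose contents really are comma-separated text.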
Labels: Apache Hadoop
06-05-2016
10:41 AM
Alright. It was pretty tricky and unintuitive, but I have shared an elaborate solution here: http://www.yourtechchick.com/sqoop/sqoop-job-fails-to-record-password-after-running-for-the-first-time-problem-fix/
06-05-2016
10:37 AM
Alright! It took a fair amount of time to come up with this fix. I have shared the problem, with an elaborate solution, here: http://www.yourtechchick.com/sqoop/sqoop-job-fails-to-record-password-after-running-for-the-first-time-problem-fix/