Member since: 08-03-2019
Posts: 186
Kudos Received: 34
Solutions: 26
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1956 | 04-25-2018 08:37 PM |
| | 5878 | 04-01-2018 09:37 PM |
| | 1593 | 03-29-2018 05:15 PM |
| | 6759 | 03-27-2018 07:22 PM |
| | 2005 | 03-27-2018 06:14 PM |
04-01-2018
04:18 PM
@Chen Yimu Did the answer help in the resolution of your query? Please close the thread by marking the answer as Accepted!
04-01-2018
04:15 PM
I am not sure where you are getting this issue, but the number looks too long for an integer. If this is a Hive table column, try using BIGINT instead of INT. Otherwise, please post some more info about the issue 🙂
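As a quick illustration (plain Scala, not tied to your actual Hive table): a value such as 7000000000 overflows a 32-bit Int but fits in a 64-bit Long, which is what Hive's BIGINT corresponds to.

```scala
object IntVsLong {
  def main(args: Array[String]): Unit = {
    // Int is 32-bit signed, so its maximum value is 2147483647
    println(Int.MaxValue)

    // 7000000000 is too long for Int, but fits comfortably in Long,
    // the JVM type that Hive's BIGINT maps to
    val big: Long = 7000000000L
    println(big > Int.MaxValue) // prints true

    // Forcing it into an Int silently wraps around and corrupts the value
    println(big.toInt == big) // prints false
  }
}
```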
04-01-2018
04:13 PM
@Christian Lunesa Is this related to the earlier issue?
04-01-2018
04:10 PM
@Vincent van Oudenhoven Did the answer help in the resolution of your query? Please close the thread by marking the answer as Accepted!
04-01-2018
04:09 PM
@swathi thukkaraju Did the answer help in the resolution of your query? Please close the thread by marking the answer as Accepted!
04-01-2018
04:05 PM
@Aishwarya Sudhakar You need to understand the HDFS directory structure; that is what is causing your issue. Here is some explanation. Let's say the username for these example commands is ash. When ash creates a directory in HDFS with the following command:

hadoop fs -mkdir demo
// This creates a directory inside ash's HDFS home directory.
// The complete directory path is /user/ash/demo

it is different from the command given below:

hadoop fs -mkdir /demo
// This creates a directory under the root directory.
// The complete directory path is /demo

So a suggestion here: whenever you access directories, use absolute paths to avoid confusion. In this case, when you create a directory using

hadoop fs -mkdir demo

and load the file into HDFS using

hadoop fs -copyFromLocal dataset.csv demo

your file exists at /user/ash/demo/dataset.csv, not at /demo. So the reference to this file in your Spark code should be:

sc.textFile("hdfs:///user/ash/demo/dataset.csv")

Hope this helps!
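The relative-vs-absolute distinction above can be sketched with plain java.nio paths (a local-filesystem analogy, not the HDFS API; /user/ash stands in for the HDFS home directory): a bare demo/dataset.csv resolves against the home directory, while /demo/dataset.csv is already absolute.

```scala
import java.nio.file.Paths

object HdfsPathAnalogy {
  def main(args: Array[String]): Unit = {
    // HDFS resolves a relative path against the user's home, /user/<name>.
    // Mimic that behaviour here with ordinary java.nio paths:
    val home = Paths.get("/user/ash")

    val relative = Paths.get("demo/dataset.csv")
    println(relative.isAbsolute)    // prints false
    println(home.resolve(relative)) // prints /user/ash/demo/dataset.csv

    val absolute = Paths.get("/demo/dataset.csv")
    println(absolute.isAbsolute)    // prints true
    println(home.resolve(absolute)) // prints /demo/dataset.csv (absolute path wins)
  }
}
```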
04-01-2018
03:53 PM
@Aishwarya Sudhakar You need to understand that the two directory references you are mentioning are different.

/demo/dataset.csv
// This means there is a directory named demo under the root directory, containing a file named dataset.csv

demo/dataset.csv
// This means there is a directory named demo in the user's home directory, containing a file named dataset.csv

Now, try the following on your terminal to get your username:

whoami

Use the output of this command to reach your dataset.csv file. You will see that

hadoop fs -cat demo/dataset.csv

is equivalent to

hadoop fs -cat /user/<your username>/demo/dataset.csv

You can verify that using the ls command on these directories:

hadoop fs -ls demo
hadoop fs -ls /user/ash/demo

Now, to access the file, use the correct directory reference:

scala> val data = MLUtils.loadLibSVMFile(sc, "demo/dataset.csv")

Let me know if that helps!
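In the same spirit, the /user/&lt;your username&gt; prefix can be built programmatically (a plain-JVM sketch, not the HDFS API: the `user.name` system property plays the role of whoami here):

```scala
object WhoAmI {
  def main(args: Array[String]): Unit = {
    // The JVM's user.name property is the local analogue of `whoami`
    val user = System.getProperty("user.name")

    // A relative HDFS path `demo/dataset.csv` is shorthand for this absolute one:
    val absolute = s"/user/$user/demo/dataset.csv"
    println(absolute)
  }
}
```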
04-01-2018
03:27 PM
@Félicien Catherin Did the answer help in the resolution of your query? Please close the thread by marking the answer as Accepted!
04-01-2018
03:23 PM
@Krishna R Did the answer help in the resolution of your query? Please close the thread by marking the answer as Accepted!
04-01-2018
03:21 PM
@subbiram Padala Did the answer help in the resolution of your query? Please close the thread by marking the answer as Accepted!