Member since: 04-08-2016
Posts: 25
Kudos Received: 2
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 14929 | 07-18-2016 03:42 PM |
11-30-2016 03:26 PM
1 Kudo
See the comment on the answer above for how to get the configs to local.
11-29-2016 08:06 PM
1 Kudo
@Dagmawi Mengistu Is "ambari-server.hostname" actually your Ambari server hostname? Can you try changing it to "*" and then retest? Something like this:

hadoop.proxyuser.ec2-user.groups = *
hadoop.proxyuser.ec2-user.hosts = *
hadoop.proxyuser.admin.groups = *
hadoop.proxyuser.admin.hosts = *

NOTE: If you are running the ambari-server daemon under the root account, then you should also add:

hadoop.proxyuser.root.groups = *
hadoop.proxyuser.root.hosts = *

Also, your error indicates that it cannot write inside the "/user/admin/hive/job/...." directory, which suggests you have logged in to the Ambari Hive view as the "admin" user, so you must create that user's HDFS home directory:

su -l hdfs -c "hdfs dfs -mkdir /user/admin"
su -l hdfs -c "hdfs dfs -chown admin:hdfs /user/admin"
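For reference, these proxyuser properties live in Hadoop's core-site.xml (in Ambari, under HDFS > Configs > Custom core-site). A minimal sketch of the equivalent XML stanzas, assuming the Hive view logs in as the "admin" user; note that "*" is permissive and should be narrowed to real hosts and groups in production:

<!-- core-site.xml sketch: let the "admin" user impersonate other users -->
<property>
  <name>hadoop.proxyuser.admin.groups</name>
  <value>*</value>  <!-- groups whose members admin may impersonate -->
</property>
<property>
  <name>hadoop.proxyuser.admin.hosts</name>
  <value>*</value>  <!-- hosts from which admin may impersonate -->
</property>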
11-23-2016 07:23 PM
@Dagmawi Mengistu Happy to help. If you don't need any more detail then feel free to accept the answer so we can close out the issue. Thanks!
02-01-2019 11:24 PM
If the RDD is partitioned, does zipWithIndex still produce unique keys?
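For what it's worth, zipWithIndex does produce unique, consecutive indices across all partitions (it triggers an extra Spark job first to count the elements in each partition). A minimal PySpark sketch to check the behavior, assuming a local SparkContext:

# Sketch: confirm zipWithIndex yields unique indices on a multi-partition RDD.
from pyspark import SparkContext

sc = SparkContext("local[2]", "zipWithIndexCheck")

rdd = sc.parallelize(["a", "b", "c", "d", "e"], 3)  # split across 3 partitions
pairs = rdd.zipWithIndex().collect()  # (element, index) pairs

indices = [i for _, i in pairs]
assert sorted(indices) == list(range(5))  # unique and consecutive across partitions
print(pairs)  # e.g. [('a', 0), ('b', 1), ('c', 2), ('d', 3), ('e', 4)]

sc.stop()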
07-18-2016 03:42 PM
Try this, but note it is for version 1.5 and up:

data.write.format('com.databricks.spark.csv').options(delimiter="\t", codec="org.apache.hadoop.io.compress.GzipCodec").save('s3a://myBucket/myPath')
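For context, a self-contained sketch of the same write, assuming Spark 1.x with the external spark-csv package on the classpath (e.g. launched with --packages com.databricks:spark-csv_2.10:1.5.0); the bucket and path are the hypothetical names from the snippet above:

# Sketch: write a DataFrame as gzip-compressed, tab-delimited output to S3
# via the com.databricks.spark.csv data source.
from pyspark import SparkContext
from pyspark.sql import SQLContext, Row

sc = SparkContext(appName="csvGzipWrite")
sqlContext = SQLContext(sc)

data = sqlContext.createDataFrame([Row(id=1, name="alice"), Row(id=2, name="bob")])

(data.write
     .format('com.databricks.spark.csv')
     .options(delimiter="\t",
              codec="org.apache.hadoop.io.compress.GzipCodec")  # gzip each part file
     .save('s3a://myBucket/myPath'))

sc.stop()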
07-02-2016 02:25 PM
5 Kudos
Here is the solution to your problem, @Dagmawi Mengistu. There are two issues here.

ISSUE 1: If you check your logs, then after relation "f" you get a "java.lang.ClassCastException". Please find the updated steps below with an explanation of how to resolve this error (comments are marked with a // prefix):

a = load '/pigsample/Salaryinfo.csv' USING PigStorage(',');
b = load '/pigsample/Employeeinfo.csv' USING PigStorage(',');
c = filter b by $4 == 'Male';

// In relation "d", carefully observe that I have type cast the field at index 0 to int.
// You need to do explicit type casting like this to avoid the "java.lang.ClassCastException".
d = foreach c generate (int)$0 as id:int, $1 as firstname:chararray, $2 as lastname:chararray, $4 as gender:chararray, $6 as city:chararray, $7 as country:chararray, $8 as countrycode:chararray;

// Similarly, in relation "e" we again have to explicitly type cast the field iD to int.
e = foreach a generate (int)$0 as iD:int, $1 as firstname:chararray, $2 as lastname:chararray, $3 as salary:double, ToDate($4, 'MM/dd/yyyy') as dateofhire, $5 as company:chararray;

// Relation "f" now works and no longer throws any exceptions.
f = join d by id, e by iD;

ISSUE 2: In relation "g", you don't need to write f.d::firstname; that throws an "org.apache.pig.backend.executionengine.ExecException". You can directly reference the fields of relation "d" present in relation "f", like this:

g = foreach f generate d::firstname as firstname;

// Print output
DUMP g;

OUTPUT:
(Jonathan)
(Gary)
(Roger)
(Jeffrey)
(Steve)
(Lawrence)
(Billy)
(Joseph)
(Aaron)
(Steve)
(Brian)
(Robert)

Hope this helps 🙂