Member since: 10-28-2016
Posts: 9
Kudos Received: 1
Solutions: 2

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 6298 | 11-15-2016 11:40 AM
 | 9433 | 11-14-2016 01:20 PM
06-05-2019 03:01 AM
curl -X PUT -L --anyauth -u : -b cookie.jar "http://httpfs_ip:14000/webhdfs/v1/user/file.csv?op=CREATE&data=true&user.name=hdfs" --header "Content-Type: application/octet-stream" --header "Transfer-Encoding: chunked" -T "file.csv"

Just replace httpfs_ip and file.csv with your HttpFS host and the file you want to upload.
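As a quick sanity check (a sketch using the same placeholders, not part of the original reply), the standard WebHDFS GETFILESTATUS operation against the same HttpFS endpoint should return the status of the newly created file:

curl -L --anyauth -u : -b cookie.jar "http://httpfs_ip:14000/webhdfs/v1/user/file.csv?op=GETFILESTATUS&user.name=hdfs"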
03-19-2019 07:48 AM
Hi, I found a solution for the same issue: map the cluster nodes in the hosts file on your local machine (see the example entries below). It will then work without any issue. Thanks, HadoopHelp
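For illustration, hypothetical /etc/hosts entries (the IPs and hostnames below are placeholders; use your cluster's actual addresses and fully qualified node names):

192.168.1.101  node1.cluster.example.com  node1
192.168.1.102  node2.cluster.example.com  node2
192.168.1.103  node3.cluster.example.com  node3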
11-15-2016 11:40 AM
Hello, I kind of figured out a few things here. There was a mental block on local vs. HDFS paths that the error messages didn't really give clues about. The property file defines oozie.wf.application.path, which is the base directory for the supplied jars; the jar path was preceded by ${nodeName}, which makes it an absolute path. You have to logically put the Configuration (property) file together with the Definition (workflow.xml) to make sure everything is being addressed properly; see the sketch below for how the two fit together.

Also, digging in the Spark code yielded the System.exit(101) error: classNotFound, which was an indicator that an invalid path was being used in our configuration.

Hope this helps, Craig
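A minimal sketch of how the two files relate (the host names, paths, and jar name here are illustrative, not the exact values from this job):

# job.properties (Configuration)
nameNode=hdfs://namenode-host:8020
jobTracker=resourcemanager-host:8032
oozie.wf.application.path=${nameNode}/user/mrt1/apps/spark-wf

# workflow.xml (Definition) then references jars under that application path, e.g.
# <jar>${nameNode}/user/mrt1/apps/spark-wf/lib/myapp.jar</jar>
# so a wrong prefix in either file produces the classNotFound / exit 101 described above.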
11-14-2016 01:20 PM
1 Kudo
Hello, this finally got resolved. I was missing the <file> tag in the workflow.xml:

<exec>/user/mrt1/oozie-oozi/craigTest.sh</exec>
<file>/user/mrt1/oozie-oozi/craigTest.sh#craigTest.sh</file>

A colleague suggested adding the file tag and trying it, and lo and behold it worked. I then went back to Hue, saw the "Files +" section, added the full path there, and it worked. The error message kind of makes sense, but I would think Hue wouldn't let me create the shell action without the <file> tag. I guess the lesson is: never create a workflow.xml with only the <exec> tag. A fuller shell-action sketch follows below. Craig
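For context, a minimal shell-action workflow showing the two tags together (the workflow name, action name, and the jobTracker/nameNode properties are illustrative; only the <exec>/<file> pairing is taken from the post above):

<workflow-app name="craigTest-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="shell-node"/>
  <action name="shell-node">
    <shell xmlns="uri:oozie:shell-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <exec>/user/mrt1/oozie-oozi/craigTest.sh</exec>
      <!-- the <file> tag ships the script into the action's working directory -->
      <file>/user/mrt1/oozie-oozi/craigTest.sh#craigTest.sh</file>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Shell action failed</message>
  </kill>
  <end name="end"/>
</workflow-app>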