Member since: 04-03-2019
Posts: 962
Kudos Received: 1743
Solutions: 146
My Accepted Solutions
Title | Views | Posted
---|---|---
| 10888 | 03-08-2019 06:33 PM
| 4747 | 02-15-2019 08:47 PM
| 4064 | 09-26-2018 06:02 PM
| 10361 | 09-07-2018 10:33 PM
| 5422 | 04-25-2018 01:55 AM
12-24-2015
03:38 AM
Thanks @Artem Ervits - will try this
12-24-2015
03:38 AM
Thanks @Ana Gillan
12-22-2015
07:52 AM
4 Kudos
Q1. I need more clarification on the variable substitution below. Is it the username of the user who submits the workflow? HADOOP_USER_NAME=${wf:user()}
Labels:
- Apache Oozie
12-22-2015
07:35 AM
5 Kudos
I have a shell action defined in my workflow which moves data from one HDFS location to another. The action is failing with the error below:
Permission denied: user=xyz, access=EXECUTE, inode="/user/yarn/.staging":yarn:supergroup:drwx------
Below are some useful configurations (taken from the application log):
yarn.app.mapreduce.am.staging-dir=/user
user.name=yarn
mapreduce.job.user.name=xyz
I have also added the variable below in my workflow.xml file under the shell action:
<env-var>HADOOP_USER_NAME=${wf:user()}</env-var>
I'm not sure why user "xyz" is trying to read/write at the /user/yarn/.staging location. Oozie experts, any idea what's going on here?
Labels:
- Apache Hadoop
- Apache Oozie
12-21-2015
09:27 AM
2 Kudos
@Shigeru Takehara Please refer to this - https://developer.ibm.com/hadoop/blog/2015/11/05/r...
12-21-2015
06:03 AM
8 Kudos
This article is an update of the one at http://hadooped.blogspot.com/2013/10/apache-oozie-part-13-oozie-ssh-action_30.html, authored by @Anagha Khanolkar. Below are the steps to set up an Oozie workflow using the ssh action:
Step 1. Create job.properties. Example:
#*************************************************
# job.properties
#*************************************************
nameNode=hdfs://<namenode-machine-fqdn>:8020
jobTracker=<resource-manager-fqdn>:8050
queueName=default
oozie.libpath=${nameNode}/user/oozie/share/lib
oozie.use.system.libpath=true
oozie.wf.rerun.failnodes=true
oozieProjectRoot=${nameNode}/user/${user.name}
appPath=${oozieProjectRoot}
oozie.wf.application.path=${appPath}
inputDir=${oozieProjectRoot}
focusNodeLogin=<username>@<remote-host-where-you-have-your-shell-script(s)>
shellScriptPath=~/uploadFile.sh
emailToAddress=<email-id>
Step 2. Write workflow.xml. Example:
<!--******************************************-->
<!--workflow.xml -->
<!--******************************************-->
<workflow-app name="WorkFlowForSshAction" xmlns="uri:oozie:workflow:0.1">
<start to="sshAction"/>
<action name="sshAction">
<ssh xmlns="uri:oozie:ssh-action:0.1">
<host>${focusNodeLogin}</host>
<command>${shellScriptPath}</command>
<capture-output/>
</ssh>
<ok to="sendEmail"/>
<error to="killAction"/>
</action>
<action name="sendEmail">
<email xmlns="uri:oozie:email-action:0.1">
<to>${emailToAddress}</to>
<subject>Output of workflow ${wf:id()}</subject>
<body>Status of the file move: ${wf:actionData('sshAction')['STATUS']}</body>
</email>
<ok to="end"/>
<error to="end"/>
</action>
<kill name="killAction">
<message>"Killed job due to error"</message>
</kill>
<end name="end"/>
</workflow-app>
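Optionally, the workflow definition can be checked for schema errors before submission with the Oozie client. A quick sanity check (assuming the Oozie CLI is installed where the file lives; exact behavior varies by Oozie version):
oozie validate workflow.xml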
Step 3. Write a sample uploadFile.sh script. Example:
#!/bin/bash
hadoop fs -put ~/test /user/<username>/uploadedbyoozie
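One detail worth noting (an addition, not part of the original article): since the ssh action executes the script directly on the focus node, make sure it is executable there:
chmod +x ~/uploadFile.sh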
Step 4. Upload workflow.xml to ${appPath} defined in job.properties.
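For example, assuming appPath resolves to /user/<username> on HDFS as defined in the job.properties above, the upload could look like:
hadoop fs -put workflow.xml /user/<username>/workflow.xml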
Step 5. Log in to the Oozie host as the "oozie" user.
Step 6. Generate a key pair (if you don't already have one) using the 'ssh-keygen' command.
Step 7. On the Oozie host, copy ~/.ssh/id_rsa.pub and append it to the ~/.ssh/authorized_keys file on <remote-host> (the focus node).
Step 8. Test password-less ssh from oozie@<oozie-host> to <username>@<remote-host>.
Step 9. If step 8 succeeds, go ahead and run the Oozie job; it should complete without error. Steps 6-9 are sketched as shell commands below.
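Here is a minimal sketch of steps 6-9, run as the oozie user on the Oozie host. The host names are placeholders, the Oozie URL assumes the default server port 11000, and ssh-copy-id is used for convenience (if it is not available on your system, append the public key to the remote authorized_keys file manually):
# Step 6: generate an RSA key pair (accept the defaults when prompted)
ssh-keygen -t rsa
# Step 7: install the public key on the focus node
ssh-copy-id <username>@<remote-host>
# Step 8: confirm password-less login works
ssh <username>@<remote-host> hostname
# Step 9: submit the workflow to the Oozie server
oozie job -oozie http://<oozie-host>:11000/oozie -config job.properties -run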
Note - In order to get password-less ssh working, please make sure that:
1. The ~/.ssh directory has 700 permissions.
2. The ~/.ssh/authorized_keys file on the remote host has 600 permissions.
3. ~/.ssh/id_rsa has 600 permissions.
4. ~/.ssh/id_rsa.pub has 644 permissions.
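If any of these are off, they can be fixed with chmod. A quick sketch (run the second command as <username> on the remote host, the rest as the oozie user on the Oozie host):
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys   # on the remote host
chmod 600 ~/.ssh/id_rsa
chmod 644 ~/.ssh/id_rsa.pub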
12-18-2015
09:22 AM
4 Kudos
@Rajesh Balamohan helped me to find the answer to this question. As the code below shows, Hive always converts the given database name to lowercase, so database names in Hive will always be lowercase. https://svn.apache.org/repos/asf/hive/trunk/metast...
@Override
public void createDatabase(Database db) throws InvalidObjectException, MetaException {
boolean commited = false;
MDatabase mdb = new MDatabase();
mdb.setName(db.getName().toLowerCase()); // the database name is lowercased before being persisted
mdb.setLocationUri(db.getLocationUri());
mdb.setDescription(db.getDescription());
mdb.setParameters(db.getParameters());
mdb.setOwnerName(db.getOwnerName());
PrincipalType ownerType = db.getOwnerType();
mdb.setOwnerType((null == ownerType ? PrincipalType.USER.name() : ownerType.name()));
try {
openTransaction();
pm.makePersistent(mdb);
commited = commitTransaction();
} finally {
if (!commited) {
rollbackTransaction();
}
}
}
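A quick way to confirm this behavior from the client side (a hypothetical session; the database name TestDB is just an example):
hive -e "CREATE DATABASE TestDB"
hive -e "SHOW DATABASES LIKE 'testdb'"   # the new database is listed as 'testdb', all lowercase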
12-17-2015
11:49 AM
1 Kudo
@bdurai - That's correct! Thank you.
12-17-2015
11:30 AM
9 Kudos
Go to the terminal of your sandbox in Oracle VirtualBox and press <Alt+F5>. Log in with username "root" and password "hadoop". You will then be asked to set a new password; set it, and try to log in via PuTTY using the new credentials (a rough sketch of the session follows below).
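A rough sketch of the session (the ssh port 2222 on localhost is an assumption based on the usual VirtualBox sandbox port forwarding; adjust to your own setup):
# on the sandbox console (Alt+F5), log in as root with the default password 'hadoop';
# the system then forces you to choose a new password.
# afterwards, connect from your host machine with PuTTY or plain ssh:
ssh root@127.0.0.1 -p 2222   # log in with the new password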