Member since: 05-31-2016
Posts: 89
Kudos Received: 14
Solutions: 8
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3970 | 03-10-2017 07:05 AM |
| | 5846 | 03-07-2017 09:58 AM |
| | 3431 | 06-30-2016 12:13 PM |
| | 5612 | 05-20-2016 09:15 AM |
| | 26874 | 05-17-2017 02:09 PM |
06-20-2017
06:13 PM
Perfect.. saved my life!!! For Cloudera users like me: if you are using the QuickStart VM, issue the commands below to get it working.

cd <zeppelin_dir>
./bin/zeppelin-daemon.sh stop
sudo chown -R cloudera:cloudera webapps
./bin/zeppelin-daemon.sh start
05-29-2017
11:58 AM
If I create a plain (non-bucketed) table, would I be able to alter it later? What I mean is: create a plain table and then alter it to add the buckets. If yes, how?
05-29-2017
07:42 AM
I am trying to create a bucketed table from another table using CREATE TABLE AS SELECT, but it fails. Here is my query:

create table tmp CLUSTERED BY (key) INTO 256 BUCKETS as select * from link_table limit 10;

And I get the below error:

FAILED: SemanticException [Error 10068]: CREATE-TABLE-AS-SELECT does not support partitioning in the target table

link_table is already bucketed, and I have also set the property to enforce bucketing. I am not sure if bucketing is supported with CTAS. Is there a way I can get this working?
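For what it's worth, a common workaround is to create the bucketed table first and then populate it with INSERT ... SELECT, since CTAS cannot target a bucketed table. The sketch below is untested against this cluster, and the column names (key, value) are hypothetical placeholders that would need to match link_table's actual schema:

```shell
# Sketch: write the two-step HiveQL (create bucketed table, then load it)
# to a script file. Column names (key, value) are hypothetical placeholders
# for link_table's real schema.
cat > bucketed_load.hql <<'EOF'
SET hive.enforce.bucketing = true;
CREATE TABLE tmp (key STRING, value STRING)
CLUSTERED BY (key) INTO 256 BUCKETS;
INSERT OVERWRITE TABLE tmp
SELECT * FROM link_table LIMIT 10;
EOF
```

The script could then be run with `hive -f bucketed_load.hql`.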
Labels:
- Apache Hive
03-10-2017
07:05 AM
"Issue Fixed" I talked with my DevOps team later and found that the Java classpath was not set on a few datanodes in the cluster. This was stopping the shell action from invoking the JVM on those datanodes. After fixing the classpath, the job ran successfully.
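A quick way to spot such nodes is a small helper run on each datanode. This is only a sketch; the function name `check_cmd` is mine, not part of any Hadoop tooling:

```shell
# check_cmd prints OK if the given command resolves on PATH, MISSING otherwise.
check_cmd() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "OK"
  else
    echo "MISSING"
  fi
}

# Run this on each datanode (e.g. over ssh) to find hosts missing java.
check_cmd java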
03-09-2017
03:36 PM
That is a carriage return from the log, introduced while copying the content. I have set the JAVA_PATH accordingly.
03-09-2017
02:52 PM
I got the below error after the changes you mentioned:

/opt/cloudera/parcels/CDH-5.5.2-1.cdh5.5.2.p0.4/bin/../lib/hadoop-hdfs/bin/hdfs: line 309: /usr/java/jdk1.7.0_67/bin/java: No such file or directory
/opt/cloudera/parcels/CDH-5.5.2-1.cdh5.5.2.p0.4/bin/../lib/hadoop-hdfs/bin/hdfs: line 309: exec: /usr/java/jdk1.7.0_67/bin/java: cannot execute: No such file or directory
./test.sh: line 30: /usr/java/jdk1.7.0_67/bin/java: No such file or directory
03-09-2017
01:52 PM
I am running a Java program from a shell script through Oozie and I get the below error:

java: command not found

When I run the shell script from the edge node I do not find any issues: the Java class runs without any error and I get the desired output. However, it is the Oozie job that fails to run the java command. All other actions in the workflow execute properly, but when it reaches the java line, it throws the aforesaid error. I understand that all the nodes in the Hadoop cluster have Java installed, so why do I get this error? Below is the java command that I have in my shell script:

...
java -cp $LOCAL_DIR/libs/integration-tools.jar com.audit.reporting.GenerateExcelReport $LOCAL_DIR/input.txt $LOCAL_DIR/
...

Please provide your thoughts.
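One thing worth checking, as a hedged guess: an Oozie shell action runs on an arbitrary datanode with a minimal environment, so the JDK may simply not be on PATH there. Exporting JAVA_HOME and PATH at the top of the script can work around that. The JDK path below is a placeholder and would need to point at the actual install location on the cluster:

```shell
# Sketch: make the JDK visible inside the Oozie launcher's environment.
# JAVA_HOME below is a placeholder path; point it at your cluster's JDK.
export JAVA_HOME=/usr/java/jdk1.7.0_67
export PATH="$JAVA_HOME/bin:$PATH"
```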
Labels:
- Apache Oozie
03-07-2017
09:58 AM
After tireless research on the internet, I was able to crack the solution for this issue.
I added a configuration to make the Hive job use the metastore server, and it worked.
Here is what I did to the Hive action. ....
<hive xmlns='uri:oozie:hive-action:0.2'>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<configuration>
<property>
<name>hive.metastore.uris</name>
<value>thrift://10.155.1.63:9083</value>
</property>
</configuration>
<script>${dir}/gsrlQery.hql</script>
<param>OutputDir=${jobOutput}</param>
</hive>
.... Note: replace the Hive metastore IP accordingly if you are trying to fix a similar problem. To get the metastore details, check the hive-site.xml file located in the /etc/hive/conf dir. Credit: MapR
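The lookup mentioned in the note can be scripted. The sketch below is mine (the function name `metastore_uri` is not standard tooling), and it assumes the usual hive-site.xml layout where the `<name>` and `<value>` elements sit on adjacent lines:

```shell
# metastore_uri FILE prints the value of hive.metastore.uris from a
# hive-site.xml-style file (assumes <name> and <value> on adjacent lines).
metastore_uri() {
  grep -A1 'hive.metastore.uris' "$1" | sed -n 's|.*<value>\(.*\)</value>.*|\1|p'
}
```

Usage would be `metastore_uri /etc/hive/conf/hive-site.xml`.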
03-07-2017
08:33 AM
I get an error while running an Oozie workflow with Hive queries. Here is the workflow:

<workflow-app xmlns='uri:oozie:workflow:0.5' name='reporting_W_errorAuditHiveQueryExe'>
<start to="hive_report_fork"/>
<fork name="hive_report_fork">
<path start="hiveGSRLfile"/>
<path start="hiveNGSRLfile"/>
<path start="hiveNGsrlRAfile"/>
</fork>
<action name="hiveGSRLfile">
<hive xmlns='uri:oozie:hive-action:0.2'>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<script>${dir}/gsrlQery.hql</script>
<param>OutputDir=${jobOutput}</param>
</hive>
<ok to="joining"/>
<error to="joining"/>
</action>
<action name="hiveNGSRLfile">
<hive xmlns='uri:oozie:hive-action:0.2'>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<script>${dir}/nongsrlQuery.hql</script>
<param>OutputDir=${jobOutput}</param>
</hive>
<ok to="joining"/>
<error to="joining"/>
</action>
<action name="hiveNGsrlRAfile">
<hive xmlns='uri:oozie:hive-action:0.2'>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<script>${dir}/nongsrlRAQuery.hql</script>
<param>OutputDir=${jobOutput}</param>
</hive>
<ok to="joining"/>
<error to="joining"/>
</action>
<join name="joining" to="Success"/>
<action name="Success">
<email xmlns="uri:oozie:email-action:0.1">
<to>${failureEmailToAddress}</to>
<subject>Success</subject>
<body>
The workflow ${wf:name()} with id ${wf:id()} failed
[${wf:errorMessage(wf:lastErrorNode())}].
</body>
</email>
<ok to="end" />
<error to="fail" />
</action>
<action name="failure">
<email xmlns="uri:oozie:email-action:0.1">
<to>${failureEmailToAddress}</to>
<subject>Failure</subject>
<body>
The workflow ${wf:name()} with id ${wf:id()} failed
[${wf:errorMessage(wf:lastErrorNode())}].
</body>
</email>
<ok to="end" />
<error to="fail" />
</action>
<kill name="fail">
<message>Workflow failed</message>
</kill>
<end name="end"/>
</workflow-app>
And here is the Oozie properties file:

oozie.wf.application.path=${deploymentPath}/workflows/errorAuditHiveQueryExe.xml
deploymentPath=/user/amin/deploy_178
jobTracker=localhost:8032
nameNode=hdfs://nameservice1
dir=${deploymentPath}/data-warehouse/temp
failureEmailToAddress=amin@dnb.com
jobOutput=${dir}
oozie.use.system.libpath=true

Here is the error I get:

FAILED: SemanticException [Error 10072]: Database does not exist: testnamespace
Intercepting System.exit(10072)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [10072]

However, the namespace exists and I can query the tables inside it. What could be wrong here? Please help.
Labels:
- Apache Oozie