Member since: 01-31-2015
Posts: 88
Kudos Received: 7
Solutions: 1

My Accepted Solutions
Title | Views | Posted
--- | --- | ---
 | 19783 | 02-09-2015 09:53 PM
01-24-2016
05:44 PM
Hi, thanks Harsh. The workaround works for the Hive action, but we noticed that our failure-notification email action has now started throwing a NullPointerException too.

<action name="killEmail">
    <email xmlns="uri:oozie:email-action:0.1">
        <to>${emailRecipients}</to>
        <subject>Oozie Workflow Run Error On abc-hist Workflow</subject>
        <body>Oozie workflow id: ${wf:id()}, run failed. Error Message: [ ${wf:errorMessage(wf:lastErrorNode())} ]</body>
    </email>
    <ok to="kill"/>
    <error to="kill"/>
</action>

Can you please help us with this one as well? Thanks!
01-24-2016
01:40 PM
Hi, after upgrading from CDH 5.4.2 to CDH 5.5.1, our Oozie workflows started failing; they worked fine on CDH 5.4.2, so we are confused about why this is happening. Please see the following error:

FAILED: IllegalArgumentException java.net.URISyntaxException: Relative path in absolute URI: file:./tmp/yarn/83926d9d-6271-45de-9d44-9c2b12a649bc/hive_2016-01-24_20-34-24_987_916201278402533005-1
Intercepting System.exit(40000)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [40000]

We need help urgently. Thanks!
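For anyone comparing notes while we wait for an answer: the relative file:./tmp/... URI looks like a scratch directory being resolved against the YARN container's working directory, so one thing we are experimenting with (an assumption on our part, not a confirmed fix for this CDH 5.5.1 issue) is forcing Hive's scratch directories to absolute paths at the top of the script the Hive action runs:

-- Hedged workaround sketch, not a confirmed fix.
-- Assumption: the relative URI comes from a scratch dir resolved
-- against the container's working directory.
set hive.exec.scratchdir=/tmp/hive;
set hive.exec.local.scratchdir=/tmp/hive-local;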
01-04-2016
11:30 AM
Hi, thanks. One more issue with SLA event monitoring is setting the nominal time in the workflow. Our workflow kicks off based on data-availability events: it starts only once the input data is available. So how do we set the workflow's nominal time to the current time at the moment the job actually kicks off? Please help!
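For context, here is a sketch of the SLA block we are trying to fill in (the sla:0.2 schema usage is our reading of the docs, and nominalTime is a hypothetical workflow property we would set to the current time when the data-availability event fires):

<workflow-app name="abc-hist" xmlns="uri:oozie:workflow:0.5" xmlns:sla="uri:oozie:sla:0.2">
    ...
    <sla:info>
        <!-- Assumption: nominalTime is passed in at submission time, set to
             the moment the data becomes available and the job kicks off. -->
        <sla:nominal-time>${nominalTime}</sla:nominal-time>
        <sla:should-end>${30 * MINUTES}</sla:should-end>
        <sla:alert-events>end_miss</sla:alert-events>
        <sla:alert-contact>${emailRecipients}</sla:alert-contact>
    </sla:info>
</workflow-app>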
12-22-2015
09:50 AM
Hi, I see the documentation says these SLA events will not work if the workflow uses decision nodes, and I am using decision actions in my workflow. Also, can you tell me how to configure SLA through Cloudera Manager? Thanks!
12-21-2015
02:13 PM
Hi, I am also facing a similar issue. Can you tell me where I need to add hive-contrib.jar? We are using CDH 5.4.2; please let me know the exact path. Thanks!
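For reference, the approach I am testing in the meantime (the parcel path is an assumption for a default CDH 5.4.2 parcel install, and the jar on disk may carry a version suffix such as hive-contrib-1.1.0-cdh5.4.2.jar):

-- Assumption: default CDH parcel layout; adjust if CDH is installed elsewhere.
ADD JAR /opt/cloudera/parcels/CDH/lib/hive/lib/hive-contrib.jar;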
12-21-2015
10:52 AM
Hi, I want to know whether there is a way to find an Oozie workflow's start and end time, compare the elapsed time against a specified number of minutes, and send an alert email saying the workflow took longer than that specified time. Please help!
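To make the question concrete, this sounds like what Oozie's SLA duration alerting is meant to cover; a sketch of what I think the block would look like (the schema version, the 60-minute threshold, and the contact address are all assumptions):

<sla:info>
    <sla:nominal-time>${nominalTime}</sla:nominal-time>
    <!-- duration_miss should trigger an alert email when the workflow
         runs longer than max-duration (assumed threshold: 60 minutes). -->
    <sla:max-duration>${60 * MINUTES}</sla:max-duration>
    <sla:alert-events>duration_miss</sla:alert-events>
    <sla:alert-contact>ops-team@example.com</sla:alert-contact>
</sla:info>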
Labels: Apache Oozie
12-09-2015
02:38 PM
Hi, we recently set up an HAProxy load balancer in front of our Impala daemons and have been running queries through our application over a JDBC connection. It mostly works fine, but it sometimes throws an error like the one below. When I run the same query individually against each Impala daemon it works with no issue and finds the table. The HAProxy config lists the hostnames of all the Impala daemon machines, so that side looks correct, but we do not understand what causes this error. Please help, this is urgent!

ERROR: 1449694771535:get_manager_teams.xaction
SQLBaseComponent.ERROR_0006 - Could not execute get_manager_teams.xaction
java.sql.SQLException: [Simba][ImpalaJDBCDriver](500051) ERROR processing query/statement. Error Code: [Simba][JSQLEngine](12010) The table "xxxxx" could not be found., SQL state: HY000, Query: (xxxxxxxx----query resides here which is simple select) .
    at com.cloudera.impala.hivecommon.dataengine.HiveJDBCDataEngine.prepare(Unknown Source)
    at com.cloudera.impala.jdbc.common.SStatement.executeNoParams(Unknown Source)
    at com.cloudera.impala.jdbc.common.SStatement.executeQuery(Unknown Source)
    at com.mchange.v2.c3p0.impl.NewProxyStatement.executeQuery(NewProxyStatement.java:35)
    at org.pentaho.platform.plugin.services.connections.sql.SQLConnection.executeQuery(SQLConnection.java:375)
    at org.pentaho.platform.plugin.services.connections.sql.SQLConnection.executeQuery(SQLConnection.java:332)
    at org.pentaho.platform.plugin.action.sql.SQLBaseComponent.doQuery(SQLBaseComponent.java:649)
    at org.pentaho.platform.plugin.action.sql.SQLBaseComponent.runQuery(SQLBaseComponent.java:559)
    at org.pentaho.platform.plugin.action.sql.SQLBaseComponent.executeAction(SQLBaseComponent.java:257)
    at org.pentaho.platform.engine.services.solution.ComponentBase.execute(ComponentBase.java:463)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeComponent(RuntimeContext.java:1293)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeAction(RuntimeContext.java:1262)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.performActions(RuntimeContext.java:1161)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeLoop(RuntimeContext.java:1105)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeSequence(RuntimeContext.java:987)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.performActions(RuntimeContext.java:1154)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeLoop(RuntimeContext.java:1105)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeSequence(RuntimeContext.java:987)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeSequence(RuntimeContext.java:897)
    at org.pentaho.platform.engine.services.solution.SolutionEngine.executeInternal(SolutionEngine.java:399)
    at org.pentaho.platform.engine.services.solution.SolutionEngine.execute(SolutionEngine.java:317)
    at org.pentaho.platform.engine.services.solution.SolutionEngine.execute(SolutionEngine.java:193)
    at org.pentaho.platform.engine.services.BaseRequestHandler.handleActionRequest(BaseRequestHandler.java:159)
    at org.pentaho.platform.engine.services.solution.SolutionHelper.doActionInternal(SolutionHelper.java:310)
    at org.pentaho.platform.engine.services.solution.SolutionHelper.doAction(SolutionHelper.java:295)
    at com.collectivei.rest.analyticaldataasservice.utils.CIServiceUtil.getXactionResultString(CIServiceUtil.java:284)
    at com.collectivei.rest.analyticaldataasservice.utils.CIServiceUtil.getXactionResultString(CIServiceUtil.java:269)
    at com.collectivei.rest.analyticaldataasservice.utils.CIServiceUtil.getManagerTeams(CIServiceUtil.java:1357)
    at com.collectivei.rest.analyticaldataasservice.utils.CIServiceUtil.getManagerTeams(CIServiceUtil.java:1352)
    at com.collectivei.rest.analyticaldataasservice.utils.ESUtils.loadDataToES(ESUtils.java:208)
    at com.collectivei.rest.analyticaldataasservice.utils.ESUtils.createESStoreAfterWinProbabilityNotification(ESUtils.java:122)
    at com.collectivei.rest.analyticaldataasservice.cache.CIWinProbabilityNotificationCacheBuilder.buildCache(CIWinProbabilityNotificationCacheBuilder.java:26)
Caused by: com.cloudera.impala.support.exceptions.GeneralException: [Simba][ImpalaJDBCDriver](500051) ERROR processing query/statement. Error Code: [Simba][JSQLEngine](12010) The table "xxxxx" could not be found., SQL state: HY000, Query: .
    ... 32 more
2015-12-09 20:59:31,582 ERROR v2 48f985d3-e8d1-4a27-90f7-8bc1092f90f7 10.110.1.98 adas-a9bc9c7c-762f-4e40-956c-0d85f258aa69 [org.pentaho.platform.plugin.action.sql.SQLLookupRule] Error end:
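Since each daemon answers the query fine when hit directly, our working theory (an assumption, not a confirmed diagnosis) is that one of the load-balanced daemons has stale catalog metadata, so we are trying a metadata refresh against every impalad the proxy routes to:

-- Run via impala-shell against each impalad behind the load balancer.
-- Assumption: the real table name replaces the redacted "xxxxx".
INVALIDATE METADATA xxxxx;
-- Lighter-weight alternative when only the data files have changed:
REFRESH xxxxx;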
Labels: Apache Impala
12-01-2015
11:43 AM
Hi, we are also facing the same invalid-file-footer issue. Two tables are created as follows:

CREATE EXTERNAL TABLE ABC_TEXT (
  NAME STRING,
  ID INT,
  PHONE INT)
PARTITIONED BY (Customer_id INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ';'
STORED AS TEXTFILE
LOCATION '/USER/ABC_TEXT';

CREATE EXTERNAL TABLE ABC_PARQUET (
  NAME STRING,
  ID INT,
  PHONE INT)
PARTITIONED BY (Customer_id INT)
STORED AS PARQUET
LOCATION '/USER/ABC_PARQUET';

We then run the insert script, which inserts the data without a problem, but querying the Parquet table gives the following error:

Error: Caused by: java.sql.SQLException: [Simba][ImpalaJDBCDriver](500312) Error in fetching data rows: Invalid file footer

Please let me know what I am doing wrong.
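For completeness, a sketch of the insert path we use (the dynamic-partition settings are assumptions about our script; we are also checking whether any non-Parquet files, e.g. stray text or staging files under /USER/ABC_PARQUET, could explain the invalid footer):

-- Assumption: dynamic partitioning is enabled before the insert.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

INSERT OVERWRITE TABLE ABC_PARQUET PARTITION (Customer_id)
SELECT NAME, ID, PHONE, Customer_id FROM ABC_TEXT;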
07-10-2015
03:08 PM
1 Kudo
Hi,
I am getting the following error in CDH 5.4.2:
FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.
I followed the documented steps and noted the limitations. My steps:
1. Set the new configuration parameters for transactions
2. Create a Hive table with ACID support
3. Load data into the Hive table
4. Run UPDATE, DELETE and INSERT
set hive.support.concurrency=true;
set hive.enforce.bucketing=true;
set hive.exec.dynamic.partition.mode=nonstrict;
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
set hive.compactor.initiator.on=true;
set hive.compactor.worker.threads=2;
CREATE TABLE abc1 (
  empwork_key int,
  empwork_id int,
  empwork_name string,
  empwork_email string,
  emp_wrk_phone string)
CLUSTERED BY (empwork_id) INTO 2 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional' = 'true');

-- the data is inserted from an external table stored as textfile
INSERT INTO TABLE abc1
SELECT empwork_key, empwork_id, empwork_name, empwork_email, emp_wrk_phone
FROM test.abc1;

UPDATE abc1 SET empwork_name = 'Raj' WHERE empwork_key = 70;
Please let me know if any suggestions or configuration changes are needed. I am setting all the properties from the Hive shell.
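In case the transaction manager is simply not supported on this build, here is the overwrite-based fallback I would try instead (the CASE-rewrite approach is my own workaround sketch, not something from the docs, and it assumes the table being overwritten is a plain non-transactional copy):

-- Rewrite-in-place fallback when ACID UPDATE is unavailable.
-- Assumption: abc1 here is a plain (non-transactional) table.
INSERT OVERWRITE TABLE abc1
SELECT empwork_key,
       empwork_id,
       CASE WHEN empwork_key = 70 THEN 'Raj' ELSE empwork_name END AS empwork_name,
       empwork_email,
       emp_wrk_phone
FROM abc1;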
Labels: Apache Hive
07-07-2015
12:05 PM
It works with the workaround, but it would be good if we could use tables and data from both Impala and Hive the way we did in previous versions; that would save us from creating multiple similar tables, one for Hive and another for Impala. If it has been fixed, that is great. Do we now need to reinstall CDH 5.4.2 to resolve this issue?