Member since: 02-16-2016
45 Posts | 24 Kudos Received | 2 Solutions

My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 3806 | 07-28-2016 03:37 PM |
 | 5739 | 02-20-2016 11:34 PM |
02-08-2019
02:34 PM
@Geoffrey Shelton Okot No, I still get the same error.
02-07-2019
10:47 PM
We are using SASL with Kerberos, not SSL. Do you have a working SASL and Kerberos config?
02-07-2019
09:52 PM
@Geoffrey Shelton Okot It is an HDP cluster, version 2.6.5.4-1. I have a Kafka cluster with 6 brokers.

listeners=SASL_PLAINTEXT://host.name:6667
advertised.listeners=SASL_PLAINTEXT://host.name:6667
sasl.enabled.mechanisms=GSSAPI

I do not see "sasl.kerberos.service.name" in server.properties. I do see it in kafka_jaas.conf and kafka_client_jaas.conf, set to "kafka".
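For reference, a hedged sketch of how that property could also be declared directly in server.properties (in our cluster it is currently only set via serviceName in the JAAS files; the hostname is a placeholder):

```properties
# server.properties (sketch): the broker-side SASL/Kerberos settings we already have,
# plus the sasl.kerberos.service.name property that is currently absent
listeners=SASL_PLAINTEXT://host.name:6667
advertised.listeners=SASL_PLAINTEXT://host.name:6667
sasl.enabled.mechanisms=GSSAPI
sasl.kerberos.service.name=kafka
```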
02-07-2019
09:14 PM
@Geoffrey Shelton Okot Yes, the Kafka cluster is secured with SASL and Kerberos. We just did this, so it is the first time we are testing it. We followed Hortonworks' documentation to secure the cluster.
02-07-2019
08:39 PM
@Geoffrey Shelton Okot In server.properties I see:

listeners=SASL_PLAINTEXT://host.name:6667
advertised.listeners=SASL_PLAINTEXT://host.name:6667

Do I need to change them? The cluster is secured and we are using SASL_PLAINTEXT, not PLAINTEXT.
02-07-2019
08:04 PM
I followed "Producing Events/Messages to Kafka on a Secured Cluster". I am setting:

export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/usr/hdp/current/kafka-broker/config/kafka_client_jaas.conf"

and passing --security-protocol SASL_PLAINTEXT. My command looks like:

./bin/kafka-console-producer.sh --broker-list <broker-hosts>:6667 --topic test --security-protocol SASL_PLAINTEXT

kafka_client_jaas.conf:

KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useTicketCache=true
renewTicket=true
serviceName="kafka";
};
kafka_jaas.conf:

KafkaServer {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/etc/security/keytabs/kafka.service.keytab"
storeKey=true
useTicketCache=false
serviceName="kafka"
principal="kafka/_host@EXAMPLE.COM";
};
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useTicketCache=true
renewTicket=true
serviceName="kafka";
};
Client {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/etc/security/keytabs/kafka.service.keytab"
storeKey=true
useTicketCache=false
serviceName="zookeeper"
principal="kafka/_host@EXAMPLE.COM";
};
com.sun.security.jgss.krb5.initiate {
com.sun.security.auth.module.Krb5LoginModule required
renewTGT=false
doNotPrompt=true
useKeyTab=true
keyTab="/etc/security/keytabs/kafka.service.keytab"
storeKey=true
useTicketCache=false
serviceName="kafka"
principal="kafka/_host@EXAMPLE.COM";
};
When I run this I get the prompt to type my message, but then I get:

19/02/07 13:35:52 WARN NetworkClient: Error while fetching metadata with correlation id 307 : {test=LEADER_NOT_AVAILABLE}
19/02/07 13:35:52 WARN NetworkClient: Error while fetching metadata with correlation id 308 : {test=LEADER_NOT_AVAILABLE}
19/02/07 13:35:52 WARN NetworkClient: Error while fetching metadata with correlation id 309 : {test=LEADER_NOT_AVAILABLE}
19/02/07 13:35:52 WARN NetworkClient: Error while fetching metadata with correlation id 310 : {test=LEADER_NOT_AVAILABLE}
19/02/07 13:35:52 WARN NetworkClient: Error while fetching metadata with correlation id 311 : {test=LEADER_NOT_AVAILABLE}
My Kafka version is 1.0.0. I made sure that the topic "test" exists, and I can get the leader IDs when I run describe. How can I resolve this issue?
Labels:
- Apache Kafka
07-28-2016
03:48 PM
I solved it by copying all the jar files from the /usr/hdp/2.3.2.0-2950/atlas/hook/hive/ directory into the lib folder at the job.properties level.
07-28-2016
03:37 PM
I figured it out. I am posting the answer for others with the same issue. The problem was missing Atlas jar files. Try copying all the jar files from the /usr/hdp/2.3.2.0-2950/atlas/hook/hive/ directory into the lib folder at the job.properties level.
07-28-2016
03:36 PM
@Thiago I did! It took me one week, but I finally figured it out! I will post it as an answer.
07-06-2016
05:01 PM
@mqureshi I added the jar file but I still get the same error. Do you have any other suggestions?
06-30-2016
08:47 PM
I do not have the jdbc jar file in /user/oozie/share/lib/lib_20151027144433. Where can I download the right jar file?
06-30-2016
08:23 PM
@mqureshi Thank you for your response. Adding hadoop.proxyuser.hive.groups=* solved that error. However, now I am getting a new one. I posted the new error and log here: https://community.hortonworks.com/questions/42720/main-class-orgapacheoozieactionhadoophivemain-exit.html Can you look at it and let me know if you can fix it?
06-30-2016
08:13 PM
@Manish Gupta The weird thing is it creates the table in Hive and then shows this error. I added the complete log to the question.
06-30-2016
06:36 PM
When I try to run a hive action I get error number E0729, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [12], with nothing more in the log file. How do I fix this? Log:

2016-06-30 18:33:37,930 INFO ActionStartXCommand:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@:start:] Start action [0000011-160630143249353-oozie-oozi-W@:start:] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2016-06-30 18:33:37,934 INFO ActionStartXCommand:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@:start:] [***0000011-160630143249353-oozie-oozi-W@:start:***]Action status=DONE
2016-06-30 18:33:37,934 INFO ActionStartXCommand:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@:start:] [***0000011-160630143249353-oozie-oozi-W@:start:***]Action updated in DB!
2016-06-30 18:33:38,013 INFO WorkflowNotificationXCommand:520 - SERVER[sandbox.hortonworks.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[] No Notification URL is defined. Therefore nothing to notify for job 0000011-160630143249353-oozie-oozi-W
2016-06-30 18:33:38,014 INFO WorkflowNotificationXCommand:520 - SERVER[sandbox.hortonworks.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@:start:] No Notification URL is defined. Therefore nothing to notify for job 0000011-160630143249353-oozie-oozi-W@:start:
2016-06-30 18:33:38,042 INFO ActionStartXCommand:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@hive-node] Start action [0000011-160630143249353-oozie-oozi-W@hive-node] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2016-06-30 18:33:49,999 INFO HiveActionExecutor:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@hive-node] checking action, hadoop job ID [job_1467297249897_0020] status [RUNNING]
2016-06-30 18:33:50,002 INFO ActionStartXCommand:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@hive-node] [***0000011-160630143249353-oozie-oozi-W@hive-node***]Action status=RUNNING
2016-06-30 18:33:50,002 INFO ActionStartXCommand:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@hive-node] [***0000011-160630143249353-oozie-oozi-W@hive-node***]Action updated in DB!
2016-06-30 18:33:50,017 INFO WorkflowNotificationXCommand:520 - SERVER[sandbox.hortonworks.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@hive-node] No Notification URL is defined. Therefore nothing to notify for job 0000011-160630143249353-oozie-oozi-W@hive-node
2016-06-30 18:38:09,496 INFO HiveActionExecutor:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@hive-node] action completed, external ID [job_1467297249897_0020]
2016-06-30 18:38:09,783 WARN HiveActionExecutor:523 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@hive-node] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [12]
2016-06-30 18:38:10,580 INFO ActionEndXCommand:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@hive-node] ERROR is considered as FAILED for SLA
2016-06-30 18:38:10,984 INFO ActionStartXCommand:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@fail] Start action [0000011-160630143249353-oozie-oozi-W@fail] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2016-06-30 18:38:11,002 INFO ActionStartXCommand:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@fail] [***0000011-160630143249353-oozie-oozi-W@fail***]Action status=DONE
2016-06-30 18:38:11,003 INFO ActionStartXCommand:520 - SERVER[sandbox.hortonworks.com] USER[ambari-qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@fail] [***0000011-160630143249353-oozie-oozi-W@fail***]Action updated in DB!
2016-06-30 18:38:11,506 INFO WorkflowNotificationXCommand:520 - SERVER[sandbox.hortonworks.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@fail] No Notification URL is defined. Therefore nothing to notify for job 0000011-160630143249353-oozie-oozi-W@fail
2016-06-30 18:38:11,506 INFO WorkflowNotificationXCommand:520 - SERVER[sandbox.hortonworks.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[] No Notification URL is defined. Therefore nothing to notify for job 0000011-160630143249353-oozie-oozi-W
2016-06-30 18:38:11,506 INFO WorkflowNotificationXCommand:520 - SERVER[sandbox.hortonworks.com] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000011-160630143249353-oozie-oozi-W] ACTION[0000011-160630143249353-oozie-oozi-W@hive-node] No Notification URL is defined. Therefore nothing to notify for job 0000011-160630143249353-oozie-oozi-W@hive-node

job.properties:

nameNode=hdfs://sandbox.hortonworks.com:8020
jobTracker=sandbox.hortonworks.com:8050
queueName=default
examplesRoot=examples
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/hive

My workflow is:

<workflow-app xmlns="uri:oozie:workflow:0.2" name="hive-wf">
<start to="hive-node"/>
<action name="hive-node">
<hive xmlns="uri:oozie:hive-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<prepare>
<delete path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data/hive"/>
<mkdir path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data"/>
</prepare>
<job-xml>hive-site.xml</job-xml>
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>${queueName}</value>
</property>
</configuration>
<script>script.q</script>
<param>INPUT=/user/${wf:user()}/${examplesRoot}/input-data/table</param>
<param>OUTPUT=/user/${wf:user()}/${examplesRoot}/output-data/hive</param>
</hive>
<ok to="end"/>
<error to="fail"/>
</action>
<kill name="fail">
<message>Hive failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>
Labels:
- Apache Oozie
06-30-2016
05:45 PM
@Kuldeep Kulkarni Thank you. This solved that error, but now I get a different error: Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [12]. Do you have any idea how to fix this? I have already added <job-xml>$path/hive-site.xml</job-xml> for my hive-site.xml.
06-29-2016
08:19 PM
I started working with Oozie. I am trying to run the workflow examples in the examples folder. However, when I try to run a hive action I get error number E0729. In the log file the error is:

org.apache.hive.service.cli.HiveSQLException: Failed to open new session: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hive is not allowed to impersonate anonymous
at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:266)
at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:202)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:402)
at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:297)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1253)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1238)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hive is not allowed to impersonate anonymous
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:83)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
at com.sun.proxy.$Proxy28.open(Unknown Source)
at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:258)
... 12 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hive is not allowed to impersonate anonymous
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:137)
at sun.reflect.GeneratedMethodAccessor12.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
... 20 more

My job.properties is:

nameNode=hdfs://sandbox.hortonworks.com:8020
jobTracker=sandbox.hortonworks.com:8050
queueName=default
examplesRoot=examples
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/hive
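For context, the fix that eventually worked for me (mentioned in a later reply) was widening the Hive proxyuser settings. A minimal sketch of the relevant core-site.xml properties, assuming default HDP property names:

```xml
<!-- core-site.xml: allow the hive service user to impersonate other users.
     "*" is deliberately permissive; this is a sketch, not a hardened
     production setting. -->
<property>
    <name>hadoop.proxyuser.hive.groups</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.hive.hosts</name>
    <value>*</value>
</property>
```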
Labels:
- Apache Oozie
06-29-2016
03:57 PM
@mqureshi Thank you for your response. The History Server was not running; when I turned it on from Ambari it fixed that error. Now I have a different error. When I submit a hive action it gives me error E0729. In the log file the error is below. Do you know how to fix this?

org.apache.hive.service.cli.HiveSQLException: Failed to open new session: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hive is not allowed to impersonate anonymous
at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:266)
at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:202)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:402)
at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:297)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1253)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1238)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hive is not allowed to impersonate anonymous
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:83)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
at com.sun.proxy.$Proxy28.open(Unknown Source)
at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:258)
... 12 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hive is not allowed to impersonate anonymous
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:137)
at sun.reflect.GeneratedMethodAccessor12.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
... 20 more
06-28-2016
05:01 PM
@mqureshi I started it, but how do I check that it is running? I started it with this command: mr-jobhistory-daemon.sh start historyserver
06-28-2016
04:57 PM
@mqureshi I think I get connection refused. How do I fix it?

org.apache.oozie.servlet.XServletException: E0501: Could not perform authorization operation, Call From sandbox.hortonworks.com/192.168.60.128 to localhost:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at org.apache.oozie.servlet.BaseJobServlet.checkAuthorizationForApp(BaseJobServlet.java:271)
at org.apache.oozie.servlet.BaseJobsServlet.doPost(BaseJobsServlet.java:99)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at org.apache.oozie.servlet.JsonRestServlet.service(JsonRestServlet.java:304)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.oozie.servlet.AuthFilter$2.doFilter(AuthFilter.java:171)
at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:595)
at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:554)
at org.apache.oozie.servlet.AuthFilter.doFilter(AuthFilter.java:176)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.oozie.servlet.HostnameFilter.doFilter(HostnameFilter.java:86)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:620)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Thread.java:745)
06-28-2016
04:04 PM
I followed this link http://lecluster.delaurent.com/oozie-101-your-first-workflow-in-5-minutes/ to start with Oozie. My job stays in the Running state with the JA006 error code. I started my history server, but I still get the same error. How can I resolve this? I am using Oozie 4.2.0.2.3. This is my job.properties:

nameNode=hdfs://sandbox.hortonworks.com:8020
jobTracker=sandbox.hortonworks.com:8050
queueName=default
examplesRoot=examples
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/shell
Labels:
- Apache Oozie
04-07-2016
07:59 PM
1 Kudo
I decided to use multiple streams instead. Life is easier that way. Thank you.
04-07-2016
05:21 PM
Yes, the data is in the same stream. For example, one string will have 6 columns and the second one will have 8. Thank you, I will try this and see if it works.
04-07-2016
05:07 PM
Thanks, it is really helpful for beginners like me.
04-07-2016
05:00 PM
In my data I have 8 different schemas. I want to create 8 different DataFrames for them and save them into 8 different Hive tables. So far I have created a super bean class which holds the shared attributes, and each bean class extends it. Based on the type attribute I create different objects. The problem is that I am unable to save them into different DataFrames. Is there any way I can do that? Here is my code so far, which works fine for one schema:

xmlData.foreachRDD(new Function2<JavaRDD<String>, Time, Void>() {
    public Void call(JavaRDD<String> rdd, Time time) {
        HiveContext hiveContext = JavaHiveContext.getInstance(rdd.context());
        // Convert RDD[String] to RDD[bean] to DataFrame
        JavaRDD<JavaRow> rowRDD = rdd.map(new Function<String, JavaRow>() {
            public JavaRow call(String line) throws Exception {
                String[] fields = line.split("\\|");
                // JavaRow is my super class
                JavaRow record = null;
                if (fields[2].trim().equalsIgnoreCase("CDR")) {
                    record = new GPRSClass(fields[0], fields[1]);
                }
                if (fields[2].trim().equalsIgnoreCase("Activation")) {
                    record = new GbPdpContextActivation(fields[0], fields[1], fields[2], fields[3]);
                }
                return record;
            }
        });
        DataFrame df;
        df = hiveContext.createDataFrame(rowRDD, JavaRow.class);
        df.toDF().registerTempTable("Consumer");
        System.out.println(df.count() + " ************Records Received************");
        df = hiveContext.createDataFrame(rowRDD, GPRSClass.class);
        hiveContext.sql("CREATE TABLE if not exists gprs_data ( processor string, fileName string, type string, version string, id string ) STORED AS ORC");
        df.save("/apps/hive/warehouse/data", "org.apache.spark.sql.hive.orc", SaveMode.Append);
        return null;
    }
});
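Not from the original post, but the splitting step can be sketched in plain Java (all names here are illustrative): group the raw pipe-delimited lines by their type field first, then build one DataFrame per group. In Spark this would correspond to one rdd.filter(...) per schema type before each createDataFrame call.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class SchemaSplit {
    // Group raw pipe-delimited lines by the type field (index 2),
    // mirroring a per-schema rdd.filter(...) in the Spark job.
    static Map<String, List<String>> splitByType(List<String> lines) {
        return lines.stream()
                .collect(Collectors.groupingBy(l -> l.split("\\|")[2].trim()));
    }

    public static void main(String[] args) {
        List<String> lines = Arrays.asList(
                "p1|f1|CDR", "p2|f2|Activation|extra", "p3|f3|CDR");
        Map<String, List<String>> byType = splitByType(lines);
        // Each entry could now feed its own DataFrame / Hive table
        System.out.println(byType.get("CDR").size() + " CDR, "
                + byType.get("Activation").size() + " Activation");
    }
}
```

Each resulting group has a uniform schema, so the per-type bean class can be applied safely.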
Labels:
- Apache Spark
04-01-2016
08:48 PM
So a newer Hue version has spark-submit. Is there no way to do it in Hue 2.6? @Divakar Annapureddy
04-01-2016
08:25 PM
1 Kudo
I am new to Oozie. I am using Hue 2.6.1-2950 and Oozie 4.2. I developed a Spark program in Java which gets data from a Kafka topic and saves it into a Hive table. I pass my arguments to my .ksh script to submit the job. It works perfectly; however, I have no idea how to schedule it with Oozie and Hue to run every 5 minutes. I have a jar file with my Java code, and a consumer.ksh which reads the arguments from my configuration file and runs the jar using the spark-submit command. Please give me suggestions on how to do this.
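A hedged sketch of what the scheduling piece could look like (the app name, paths, and property names are placeholders, not taken from the original post): an Oozie coordinator that triggers a workflow every 5 minutes, where the workflow would wrap the existing spark-submit call, for example via a shell action running consumer.ksh.

```xml
<coordinator-app name="kafka-to-hive-coord" frequency="${coord:minutes(5)}"
                 start="${startTime}" end="${endTime}" timezone="UTC"
                 xmlns="uri:oozie:coordinator:0.4">
    <action>
        <workflow>
            <!-- hypothetical workflow path; the workflow itself would
                 invoke consumer.ksh / spark-submit -->
            <app-path>${nameNode}/user/${user.name}/apps/kafka-to-hive</app-path>
        </workflow>
    </action>
</coordinator-app>
```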
Labels:
- Apache Oozie
- Apache Spark
04-01-2016
06:13 PM
2 Kudos
I am new to Oozie. I have a Java program which produces data into a Kafka topic (it is not a MapReduce job). I am trying to schedule it with Oozie. However, I am getting this error:

JA009: Could not load history file hdfs://sandbox.hortonworks.com:8020/mr-history/tmp/hue/job_1459358290769_0012-1459533575025-hue-oozie%3Alauncher%3AT%3Djava%3AW%3DData+Producer%3AA%3DproduceDat-1459533591693-1-0-SUCCEEDED-default-1459533581542.jhist
at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:349)
at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.<init>(CompletedJob.java:101)
at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager$HistoryFileInfo.

I read that it can be a permission or ownership problem, so I changed the owner to mapred and gave 777 permissions, but I still get the same error. I am using a java action to schedule my jar file.
Labels:
- Apache Oozie
- Apache Spark
03-25-2016
03:45 PM
@Brandon Wilson I tried your suggestion; it creates the Hive table, but I get this error: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table. And it does not load data into my table. Do you have any idea how to solve this?
03-23-2016
02:30 PM
@Benjamin Leonhardi Thank you for your response. Based on your suggestion, I have to apply the mapPartitions method on my JavaDStream. That method will return another JavaDStream. I cannot use saveAsTextFile() on the JavaDStream, so I have to use foreachRDD to be able to call saveAsTextFile. Therefore, I will have the same problem again. Correct me if I am wrong, because I am new to Spark.
03-21-2016
07:59 PM
2 Kudos
@Benjamin Leonhardi Do you have any sample code for Java which uses mapPartitions instead of foreachRDD?