Member since
01-09-2016
70
Posts
30
Kudos Received
1
Solution
My Accepted Solutions
Title | Views | Posted
--- | --- | ---
 | 1865 | 01-27-2016 04:47 PM
07-07-2016
07:21 PM
Is there any way to migrate the running (scheduled) state of hundreds of Falcon feeds/processes to another cluster? For example, by copying the Falcon store, embedded MQ, and/or graph database.
Labels:
- Apache Falcon
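A rough sketch of what copying the store could look like, assuming the default HDP locations for the config store and embedded MQ data (the `/hadoop/falcon/...` paths are taken from the process listing later in this thread; this is not an officially supported migration path):

```shell
# On the source cluster: stop Falcon, then archive its on-disk state.
# config.store.uri (startup.properties) typically points at the store dir.
tar czf falcon-state.tgz /hadoop/falcon/store /hadoop/falcon/embeddedmq/data
scp falcon-state.tgz target-falcon-host:/tmp/

# On the target cluster: stop Falcon, unpack into the same locations,
# then restart. Scheduled state would still have to be rebuilt, since
# the Oozie coordinators backing each entity live on the old cluster.
```

Even with the entity definitions copied over, each process/feed would likely need to be re-scheduled (e.g. via `falcon entity -schedule`) so that new Oozie coordinators are created on the target cluster.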
05-06-2016
06:06 PM
@Jonas Straub This seems to be a security risk on HDFS. Any user, without having sudo access, can become the superuser simply by running: export HADOOP_USER_NAME=hdfs
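To make the risk concrete, this is what the impersonation looks like on a non-Kerberized (simple-authentication) cluster; the target path is illustrative only:

```shell
# With hadoop.security.authentication=simple, the client-side
# HADOOP_USER_NAME variable overrides the login identity:
export HADOOP_USER_NAME=hdfs
hdfs dfs -ls /user                 # now runs as the hdfs superuser
hdfs dfs -rm -r /user/someuser/x   # hypothetical path: would succeed as superuser

# Enabling Kerberos closes this hole: with
# hadoop.security.authentication=kerberos, HADOOP_USER_NAME is ignored
# and the identity comes from the Kerberos ticket instead.
```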
04-21-2016
10:10 PM
@Benjamin Leonhardi I set up HiveServer2 authentication with PAM following all of steps 1-5, but unfortunately I am still getting an invalid-login error when using Beeline. I used the AMD64 JPAM library. My HiveServer2 Java library path is:
-Djava.library.path=:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.3.4.0-3485/hadoop/lib/native
In HDP 2.3 the path /usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64 does not exist, so I copied the .so files into /usr/hdp/2.3.4.0-3485/hadoop/lib/native.
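One quick sanity check for a setup like this is to confirm the JPAM shared library is actually on a path the running HiveServer2 JVM was started with (paths below are the ones from the post; the library file name is the usual JPAM one and is an assumption):

```shell
# Is the native library where we copied it?
ls -l /usr/hdp/2.3.4.0-3485/hadoop/lib/native/libjpam.so

# Did the running HiveServer2 process pick up that java.library.path?
ps -ef | grep -i hiveserver2 | tr ' ' '\n' | grep java.library.path
```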
04-15-2016
05:07 PM
I have the following:
hive-shims-0.20S-1.2.1.2.3.4.0-3485.jar
hive-shims-0.23-1.2.1.2.3.4.0-3485.jar
hive-shims-1.2.1.2.3.4.0-3485.jar
04-04-2016
08:21 PM
1 Kudo
HDP 2.3. I am getting this error intermittently; sometimes the Hive action in Oozie works fine.
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], main() threw exception, org.apache.hadoop.hive.metastore.IMetaStoreClient.isLocalMetaStore()Z
java.lang.NoSuchMethodError: org.apache.hadoop.hive.metastore.IMetaStoreClient.isLocalMetaStore()Z
at org.apache.hadoop.hive.ql.session.SessionState.unCacheDataNucleusClassLoaders(SessionState.java:1474)
at org.apache.hadoop.hive.ql.session.SessionState.close(SessionState.java:1468)
at org.apache.hadoop.hive.cli.CliSessionState.close(CliSessionState.java:66)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:683)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:306)
at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:290)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:68)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:241)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Oozie Launcher failed, finishing Hadoop job gracefully
Labels:
- Apache Hive
- Apache Oozie
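A `NoSuchMethodError` on a metastore interface like this usually points to mixed Hive jar versions on the launcher classpath rather than a code bug. One way to investigate is to list which Hive jars the Oozie sharelib actually ships (the sharelib path below is the usual HDP 2.3 layout and is an assumption):

```shell
# Look for hive jars (and their versions) in the active Oozie sharelib:
hdfs dfs -ls /user/oozie/share/lib/lib_*/hive | grep -i metastore
hdfs dfs -ls /user/oozie/share/lib/lib_*/pig  | grep -i hive
```

If two different `hive-metastore-*.jar` versions show up across the sharelib directories, the intermittent failures would depend on which one the classloader picks first.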
02-27-2016
05:01 AM
@Junichi Oda @Ali Bajwa @spolavarap Did you find a solution? I am struggling a lot and am not able to search users within a group. Here are my settings. Only groups are being fetched, no users. If I remove the User Search Filter, I can fetch all users, including users from other groups.
Username Attribute = uid
User Object Class = inetOrgPerson
User Search Base = zz.com
User Search Filter = (memberof=cn=TEAM_EDL_Dev,ou=Groups,o=zz.com)
User Search Scope = sub
User Group Name Attribute = memberof,ismemberof
Group Member Attribute = member
Group Name Attribute = cn
Group Object Class = groupOfNames
Group Search Base = zz.com
Group Search Filter = (|(cn=edl*)(cn=TEAM_EDL_Dev)
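One way to debug a filter like this is to run it directly against the directory, outside of the sync tool (the host is a placeholder and the bind options are omitted; base and filter values are taken from the settings above):

```shell
# Does the user search filter match anything at all?
ldapsearch -x -H ldap://<ldap-host> -b "zz.com" -s sub \
  "(&(objectClass=inetOrgPerson)(memberof=cn=TEAM_EDL_Dev,ou=Groups,o=zz.com))" uid
```

If this returns nothing, the directory may simply not maintain a `memberof` attribute on user entries (common with plain `groupOfNames` groups unless the server has a memberOf overlay enabled), which would explain why groups sync but a memberof-based user filter matches no users.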
02-08-2016
04:53 PM
It would be more helpful if I knew what causes this error, so that I can pinpoint the problem. Otherwise, please give me some steps to debug it, and a solution.
02-07-2016
08:42 PM
1 Kudo
HDP-2.3.2.0-2950, Ambari 2.1, Hive 1.2.1.2.3. I am facing a problem connecting Beeline to a secured LDAP server. Here are my tests, setup, and errors.

Secured LDAP connectivity testing (working):
ping -c1 xxxx.net
telnet xxxx.net 636

Setup in /etc/openldap/ldap.conf:
TLS_CACERTDIR /usr/jdk64/jdk1.7.0_67/jre/lib/security

Certificate type: CA certificate
keytool -import -trustcacerts -alias xxxx -storepass changeit -noprompt -file 6a386909.0 -keystore /usr/jdk64/jdk1.7.0_67/jre/lib/security/cacerts
(Certificate imported)

Works fine:
ldapsearch -x -W -D 'uid=abc@xx.com,ou=People,o=xx.com' -H ldaps://xxxx.net:636 -b o=xx.com "(uid=abc@xx.com)"

HDFS env setup:
export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Djavax.net.ssl.trustStore=/usr/jdk64/jdk1.7.0_67/jre/lib/security/cacerts -Djavax.net.ssl.trustStorePassword=changeit ${HADOOP_OPTS}"

Hive advanced setup:
hive.server2.authentication.ldap.baseDN : CN=%s,uid=%s,OU=People,O=xx.com
hive.server2.authentication.ldap.url : ldaps://xxxx.net

beeline> !connect jdbc:hive2://<myhiveserver2-host>:10000
Connecting to jdbc:hive2://<myhiveserver2-host>:10000
Enter username for jdbc:hive2://<myhiveserver2-host>:10000: abc@xx.com (tried with just abc also)
Enter password for jdbc:hive2://<myhiveserver2-host>:10000: *********
Error: Could not open client transport with JDBC Uri: jdbc:hive2://<myhiveserver2-host>:10000: Peer indicated failure: Error validating the login (state=08S01,code=0)
0: jdbc:hive2://<myhiveserver2-host>:100 (closed)>

hiveserver2.log:
2016-02-07 20:06:07,764 ERROR [HiveServer2-Handler-Pool: Thread-47]: transport.TSaslTransport (TSaslTransport.java:open(315)) - SASL negotiation failure
javax.security.sasl.SaslException: Error validating the login [Caused by javax.security.sasl.AuthenticationException: Error validating LDAP user [Caused by javax.naming.AuthenticationException: [LDAP: error code 49 - Invalid Credentials]]]
at org.apache.hive.service.auth.PlainSaslServer.evaluateResponse(PlainSaslServer.java:109)
at org.apache.thrift.transport.TSaslTransport$SaslParticipant.evaluateChallengeOrResponse(TSaslTransport.java:539)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:283)
at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: javax.security.sasl.AuthenticationException: Error validating LDAP user [Caused by javax.naming.AuthenticationException: [LDAP: error code 49 - Invalid Credentials]]
at org.apache.hive.service.auth.LdapAuthenticationProviderImpl.Authenticate(LdapAuthenticationProviderImpl.java:77)
at org.apache.hive.service.auth.PlainSaslHelper$PlainServerCallbackHandler.handle(PlainSaslHelper.java:106)
at org.apache.hive.service.auth.PlainSaslServer.evaluateResponse(PlainSaslServer.java:102)
... 8 more
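For an "LDAP: error code 49" failure behind TLS, it can help to separate certificate problems from bind-DN problems. A sketch, reusing the host, truststore path, and keytool alias from the post above:

```shell
# 1. Inspect the certificate chain the LDAPS endpoint presents:
openssl s_client -connect xxxx.net:636 -showcerts </dev/null

# 2. Confirm the CA actually landed in the JDK truststore HiveServer2 uses:
keytool -list -keystore /usr/jdk64/jdk1.7.0_67/jre/lib/security/cacerts \
        -storepass changeit | grep -i xxxx
```

Error 49 (invalid credentials) normally means TLS succeeded and the directory rejected the bind DN/password, so the constructed bind DN from `hive.server2.authentication.ldap.baseDN` is worth comparing against the DN that works in the manual `ldapsearch`.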
Labels:
- Apache Hive
02-05-2016
01:40 AM
1 Kudo
HDP 2.3, Ambari 2.1. Adding 'misfire_grace_time': 10 to APS_CONFIG in /usr/lib/python2.6/site-packages/ambari_agent/AlertSchedulerHandler.py on every node and restarting the Ambari server and the agents on all nodes did not work for me.
01-27-2016
06:11 PM
In HDP 2.3 with Ambari 2.1, I didn't find hive.server2.authentication.ldap.Domain.
01-27-2016
04:47 PM
The issue has been resolved. The entity was corrupt, and we restored the Falcon store and webapp.
01-27-2016
01:21 AM
Please see the log in my previous response. The output of ps -ef | grep falcon:

falcon 13996 1 3 01:00 ? 00:00:31 /usr/jdk64/jdk1.7.0_67/bin/java -Xmx1024m -noverify -Dfalcon.embeddedmq=True -Dfalcon.emeddedmq.port=61616 -Dfalcon.log.dir=/var/log/falcon -Dfalcon.embeddedmq.data=/hadoop/falcon/embeddedmq/data -Dfalcon.home=/usr/hdp/current/falcon-server -Dconfig.location=/usr/hdp/current/falcon-server/conf -Dfalcon.app.type=falcon -Dfalcon.catalog.service.enabled= -cp /usr/hdp/current/falcon-server/conf:/usr/hdp/2.3.2.0-2950/hadoop/conf:/usr/hdp/2.3.2.0-2950/hadoop/lib/*:/usr/hdp/2.3.2.0-2950/hadoop/.//*:/usr/hdp/2.3.2.0-2950/hadoop-hdfs/./:/usr/hdp/2.3.2.0-2950/hadoop-hdfs/lib/*:/usr/hdp/2.3.2.0-2950/hadoop-hdfs/.//*:/usr/hdp/2.3.2.0-2950/hadoop-yarn/lib/*:/usr/hdp/2.3.2.0-2950/hadoop-yarn/.//*:/usr/hdp/2.3.2.0-2950/hadoop-mapreduce/lib/*:/usr/hdp/2.3.2.0-2950/hadoop-mapreduce/.//*:::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/classes:/usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/*:/usr/hdp/current/falcon-server/libext/* org.apache.falcon.Main -app /usr/hdp/current/falcon-server/server/webapp/falcon -port 15000

In Ambari there is no error on the Falcon server, but when I try to open the Falcon UI I get:
HTTP ERROR: 503
Problem accessing /index.html. Reason: SERVICE_UNAVAILABLE

Please note: I needed to update pig-action.xml to include hive in the sharelib config, repackage falcon-oozie-adaptor-0.6.1.2.3.2.0-2950.jar, replace the jar at "/usr/hdp/current/falcon-server/webapp/falcon/WEB-INF/lib", and restart Falcon.
01-27-2016
01:04 AM
Falcon 0.6.1.2.3, HDP 2.3.2.0-2950, Ambari 2.1.2.1. Which log do I need to provide? Extract from the application log:
2016-01-27 01:00:50,873 ERROR - [main:] ~ Failed to initialize service org.apache.falcon.entity.store.ConfigurationStore (ServiceInitializer:49)
org.apache.falcon.FalconException: Unable to restore configurations for entity type PROCESS
at org.apache.falcon.entity.store.ConfigurationStore.loadEntity(ConfigurationStore.java:189)
at org.apache.falcon.entity.store.ConfigurationStore.init(ConfigurationStore.java:152)
at org.apache.falcon.service.ServiceInitializer.initialize(ServiceInitializer.java:47)
at org.apache.falcon.listener.ContextStartupListener.contextInitialized(ContextStartupListener.java:56)
at org.mortbay.jetty.handler.ContextHandler.startContext(ContextHandler.java:550)
at org.mortbay.jetty.servlet.Context.startContext(Context.java:136)
at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1282)
at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:519)
at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:499)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
at org.mortbay.jetty.Server.doStart(Server.java:224)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
at org.apache.falcon.util.EmbeddedServer.start(EmbeddedServer.java:57)
at org.apache.falcon.Main.main(Main.java:83)
2016-01-27 01:00:50,875 ERROR - [main:] ~ Failed startup of context org.mortbay.jetty.webapp.WebAppContext@4759d881{/,/usr/hdp/current/falcon-server/server/webapp/falcon} (log:87)
java.lang.RuntimeException: org.apache.falcon.FalconException: org.apache.falcon.FalconException: Unable to restore configurations for entity type PROCESS
at org.apache.falcon.listener.ContextStartupListener.contextInitialized(ContextStartupListener.java:59)
at org.mortbay.jetty.handler.ContextHandler.startContext(ContextHandler.java:550)
at org.mortbay.jetty.servlet.Context.startContext(Context.java:136)
at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1282)
at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:519)
at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:499)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
at org.mortbay.jetty.Server.doStart(Server.java:224)
at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
at org.apache.falcon.util.EmbeddedServer.start(EmbeddedServer.java:57)
at org.apache.falcon.Main.main(Main.java:83)
Caused by: org.apache.falcon.FalconException: org.apache.falcon.FalconException: Unable to restore configurations for entity type PROCESS
at org.apache.falcon.service.ServiceInitializer.initialize(ServiceInitializer.java:50)
at org.apache.falcon.listener.ContextStartupListener.contextInitialized(ContextStartupListener.java:56)
... 11 more
Caused by: org.apache.falcon.FalconException: Unable to restore configurations for entity type PROCESS
at org.apache.falcon.entity.store.ConfigurationStore.loadEntity(ConfigurationStore.java:189)
at org.apache.falcon.entity.store.ConfigurationStore.init(ConfigurationStore.java:152)
at org.apache.falcon.service.ServiceInitializer.initialize(ServiceInitializer.java:47)
... 12 more
2016-01-27 01:00:50,879 INFO - [main:] ~ Started SocketConnector@0.0.0.0:15000 (log:67)
01-27-2016
12:25 AM
I restarted the Falcon server and am getting this error:
HTTP ERROR: 503
Problem accessing /index.html. Reason: SERVICE_UNAVAILABLE
Labels:
- Apache Falcon
01-22-2016
05:01 PM
It works fine when we make the change in workflow.xml for Oozie 4.2. But when we run the Falcon process, the system creates the Oozie workflow itself, and at the Pig action it fails with the ShimsLoader error. Falcon is not picking up the change made in oozie-site.xml:
oozie.action.sharelib.for.pig=pig,hcatalog,hive
I also tried restarting Falcon, Oozie, and Hive, but it is still not picking up the above property.
01-22-2016
01:17 AM
Please let me know which one I need to set:
oozie.action.sharelib.for.pig=pig,hcatalog
oozie.action.sharelib.for.pig=hive,pig,hcatalog
oozie.action.sharelib.for.hive=hive,hcatalog,sqoop
Also, where do I need to set it: in the job configuration, or in hive-site.xml as a custom property?
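These `oozie.action.sharelib.for.*` properties can also be set per workflow in the job's properties file, which avoids editing the server-side site config; the Oozie host below is a placeholder:

```shell
# job.properties for the workflow (per-job override of the sharelib list):
#   oozie.action.sharelib.for.pig=pig,hive,hcatalog

# Verify which sharelibs the Oozie server actually has loaded:
oozie admin -oozie http://<oozie-host>:11000/oozie -shareliblist
oozie admin -oozie http://<oozie-host>:11000/oozie -shareliblist pig
```

The ShimLoader error in the post below is a Hive class missing from the Pig action's classpath, so the variant that adds `hive` (and `hcatalog`) to the pig sharelib list is the relevant one for HCatLoader-based scripts.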
01-21-2016
11:27 PM
Hi, I am trying to run a Pig action in an Oozie workflow and getting this error:

Pig Stack Trace
---------------
ERROR 2998: Unhandled internal error. org/apache/hadoop/hive/shims/ShimLoader
java.lang.NoClassDefFoundError: org/apache/hadoop/hive/shims/ShimLoader
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:368)
at org.apache.hive.hcatalog.pig.PigHCatUtil.getHCatServerUri(PigHCatUtil.java:134)
at org.apache.hive.hcatalog.pig.HCatLoader.getSchema(HCatLoader.java:217)
at org.apache.pig.newplan.logical.relational.LOLoad.getSchemaFromMetaData(LOLoad.java:175)
at org.apache.pig.newplan.logical.relational.LOLoad.<init>(LOLoad.java:89)
at org.apache.pig.parser.LogicalPlanBuilder.buildLoadOp(LogicalPlanBuilder.java:901)
at org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3568)
at org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1625)
at org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:1102)
at org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:560)
at org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:421)
at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:191)
at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1735)
at org.apache.pig.PigServer$Graph.access$000(PigServer.java:1443)
at org.apache.pig.PigServer.parseAndBuild(PigServer.java:387)
at org.apache.pig.PigServer.executeBatch(PigServer.java:412)
at org.apache.pig.PigServer.executeBatch(PigServer.java:398)
at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:171)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:234)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:81)
at org.apache.pig.Main.run(Main.java:502)
at org.apache.pig.PigRunner.run(PigRunner.java:49)
at org.apache.oozie.action.hadoop.PigMain.runPigJob(PigMain.java:288)
at org.apache.oozie.action.hadoop.PigMain.run(PigMain.java:231)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
at org.apache.oozie.action.hadoop.PigMain.main(PigMain.java:76)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.shims.ShimLoader
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 40 more
================================================================================
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.PigMain], exit code [2]
01-20-2016
06:19 PM
1 Kudo
I am facing a problem authenticating an LDAP user. The following command results in "ldap_bind: Invalid credentials (49)":
ldapsearch -x -H ldaps://my-ldap-server.net -b "ou=People,o=xx.com" "(uid=xx.xxx@xx.com)" -W
But without -W (without a password), it works fine and finds the record.
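The behavior described is likely a bind-mode difference rather than a directory problem: without `-D`/`-W`, `ldapsearch -x` performs an anonymous bind (which the server allows for searching), while `-W` without a bind DN sends a password for an empty DN, which servers typically reject with error 49. Binding explicitly should work; the DN below is a guess based on the directory layout in the post:

```shell
# Explicit simple bind as the user being searched for:
ldapsearch -x -H ldaps://my-ldap-server.net \
  -D "uid=xx.xxx@xx.com,ou=People,o=xx.com" -W \
  -b "ou=People,o=xx.com" "(uid=xx.xxx@xx.com)"
```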
Labels:
- Apache Ambari
01-11-2016
04:24 PM
1 Kudo
The audit source type is set to 'solr' by default. I changed it to 'db' and also removed the filter, and it works! Solr is the preferred and recommended audit source type, but unfortunately it is not included in the Ambari 2.1 stack. Thanks for the support.
01-10-2016
07:36 PM
1 Kudo
The HDFS Ranger plugin audit is getting logged in the Ranger_Audit DB. Yes, HDFS resource access/denied events are being audited and recorded in the MySQL ranger_audit DB; we have verified this in the xa_access.. table. Yes, ranger.audit.source.type is set to db in advanced-ranger-admin-site. Still, the Audit page is not displaying any transactions.
01-09-2016
10:27 PM
3 Kudos
We have HDP 2.3 with Ranger, with a MySQL DB as the audit data store. The audit transactions are getting stored in the ranger_audit DB, but the Ranger Audit page on the Ranger Admin portal does not show any records. We are also trying to write the audit logs to the /ranger/audit folder on HDFS, but those are not getting recorded either.
Labels:
- Apache Ranger