CDH 5 fails to upload Oozie ShareLib on Ubuntu 12.04
Created on ‎08-27-2014 12:36 AM - edited ‎09-16-2022 02:06 AM
I'm trying to install CDH 5 on a cluster I have, but the installation fails in Cloudera Manager.
Specifically, it's the installation of Oozie that is causing issues. The installer stops at the following step:
Failed to upload Oozie ShareLib.
Stderr
Failed to upload Oozie ShareLib.
Program: oozie/oozie.sh ["install-sharelib","oozie-sharelib-yarn.tar.gz","hdfs://SKP-Cluster-1L:8020"]
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1986)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1982)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1980)
at org.apache.hadoop.ipc.Client.call(Client.java:1409)
at org.apache.hadoop.ipc.Client.call(Client.java:1362)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:362)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1438)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1260)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:525)
Stdout
Failed to upload Oozie ShareLib.
Program: oozie/oozie.sh ["install-sharelib","oozie-sharelib-yarn.tar.gz","hdfs://SKP-Cluster-1L:8020"]
Wed Aug 27 09:11:57 CEST 2014
JAVA_HOME=/usr/lib/jvm/java-7-oracle-cloudera
using 5 as CDH_VERSION
using /var/lib/oozie/tomcat-deployment as CATALINA_BASE
the destination path for sharelib is: /user/oozie/share/lib/lib_20140827091200
What could be causing this? And, more importantly, how can I fix it?
Created ‎09-04-2014 12:23 AM
I ended up ditching Ubuntu and moved to CentOS instead. It solved my problem.
Created ‎08-27-2014 01:55 AM
Cloudera Manager view. Toward the end of those logs you will see the reasons logged; please paste them here and we can work to fix the problem.
Gautam Gopalakrishnan
Created on ‎08-27-2014 02:18 AM - edited ‎08-27-2014 02:27 AM
So, when I use "Links to full logs", it gives me this:
HTTP ERROR 502
Problem accessing /cmf/process/73/logs. Reason:
Connection refused Could not connect to host.
Powered by Jetty://
So I went in manually and ran cat /var/log/oozie-cmf-oozie-OOZIE_SERVER-SKP-Cluster-1L.log.out, which gave me this:
Created ‎09-04-2014 03:49 AM
Hi,
Instead of creating a new thread, I'll continue on this one because I'm facing the same issue, although I am on CentOS 6.
I performed a brand-new installation with the CDH-5.1.2-1.cdh5.1.2.p0.3 parcels and an installation of the Cloudera Core packages.
I also received the same error as Steen P. in the oozie_server log:
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/oozie/share/lib/lib_20140904113544/oozie/json-simple-1.1.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
Although the error is clear about the issue, I'm receiving it while performing a clean install.
Are you able to provide a fix or workaround for this?
Thanks,
Patrick
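For anyone triaging the same message, the decisive part is "There are 0 datanode(s) running": the upload fails because no DataNode has registered with the NameNode, not because of anything Oozie-specific. A minimal sketch of that check (the msg variable below stands in for a line from the log, not a real log read; on a live cluster you would instead run hdfs dfsadmin -report as the hdfs user):

```shell
# 'msg' is a sample of the logged error message, used for illustration.
msg='There are 0 datanode(s) running and no node(s) are excluded in this operation.'

# Pull the live-DataNode count out of the message.
live=$(printf '%s\n' "$msg" | sed -n 's/.*There are \([0-9][0-9]*\) datanode(s).*/\1/p')

if [ "$live" -eq 0 ]; then
  echo "no live DataNodes: fix DataNode startup before retrying the ShareLib upload"
fi
```

If hdfs dfsadmin -report also shows zero live DataNodes, the DataNode logs, not the Oozie logs, are the place to look next.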
Created ‎09-04-2014 06:11 AM
This weird error turned out to be caused by the fact that the FQDN of the hosts was resolving to 127.0.0.1 as well, which prevented Hadoop from starting properly and left the Hadoop client unable to connect and upload the lib.
After rechecking all settings, I was able to properly set up the parcel on the cluster.
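A quick way to check each host for that misconfiguration (a minimal sketch; the IP and hostnames in the comment are illustrative placeholders, not values from this cluster):

```shell
# If the FQDN resolves to 127.0.0.1, HDFS daemons advertise the
# loopback address and remote clients (such as the Oozie ShareLib
# upload) cannot reach them. Check what the host's name resolves to:
fqdn=$(hostname -f 2>/dev/null || hostname)
getent hosts "$fqdn" || echo "warning: $fqdn does not resolve"

# A working /etc/hosts keeps loopback and the real address on
# separate lines, e.g. (illustrative values):
#   127.0.0.1   localhost localhost.localdomain
#   10.0.0.11   skp-cluster-1l.example.com skp-cluster-1l
```

The FQDN must map to the host's real IP address only; it must not also appear on the 127.0.0.1 (or ::1) line.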
Created ‎11-10-2014 07:33 AM
Could you please provide a few details about the 127.0.0.1 issue? I'm seeing a similar problem in 5.2.0 and 5.1.3.
Thanks,
Will Hanson
