Created on 12-13-2014 01:28 PM - edited 09-16-2022 02:15 AM
I am trying to install Accumulo 1.6 on the current QuickStart CDH 5.2.x distribution (downloaded 12/13/2014) to train folks.
I am surprised it is not pre-installed.
I want to be able to compile and run Accumulo client programs in Java and get to the appropriate Javadocs for Accumulo.
I tried the instructions at http://www.cloudera.com/content/cloudera/en/documentation/Accumulo/latest/PDF/Apache-Accumulo-Instal...
Install Apache Accumulo from Distribution Packages
This section describes how to install Cloudera's packaging of Accumulo from packages (RPM or DEB)
instead of using Cloudera Manager.
I am not sure whether to use the RPMs, the yum repo, or the .tar.gz.
I find the instructions somewhat confusing and would appreciate your assistance.
Thanks in advance.
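As best I can tell from that guide, the yum route on the QuickStart VM would look roughly like the sketch below. The repo file location and the exact package list are my assumptions rather than commands copied from the guide, so please check them against the current documentation.
# Add Cloudera's Accumulo yum repository (the .repo URL is a placeholder; use the one from the install guide)
[cloudera@quickstart ~]$ sudo wget <cloudera-accumulo.repo URL from the guide> -O /etc/yum.repos.d/cloudera-accumulo.repo
[cloudera@quickstart ~]$ sudo yum clean all
# Install the Accumulo packages (package names assumed from the guide)
[cloudera@quickstart ~]$ sudo yum install accumulo accumulo-master accumulo-tserver accumulo-gc accumulo-monitor accumulo-tracer
# With HDFS and ZooKeeper running, initialize the instance
[cloudera@quickstart ~]$ sudo service accumulo-master init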
Created 12-16-2014 09:09 AM
Created 12-20-2014 04:37 PM
If you could set it up so that yum could install 1.6 on both the Cloudera and Spark QuickStart images, that would be awesome. I spend so much time getting things installed rather than concentrating on solving algorithmic issues. You would so rock!
Thanks!
Created 12-20-2014 04:38 PM
P.S. Hortonworks already supports installing Accumulo via yum, but I use Cloudera products.
Created 12-20-2014 06:19 PM
Created 06-03-2016 08:27 AM
I tried the instructions above; however, it looks like I am running into classpath issues.
[cloudera@quickstart ~]$ sudo service accumulo-master init
NOTE: it is strongly recommended that you override the following defaults in /etc/accumulo/conf/accumulo-site.xml:
Set logger.dir.walog to a directory on a partition with sufficient space for write-ahead logs
Set tracer.user and tracer.password to values of your choosing
Initializing Accumulo: [ OK ]
Thread "init" died org/apache/htrace/core/Tracer$Builder
java.lang.NoClassDefFoundError: org/apache/htrace/core/Tracer$Builder
at org.apache.hadoop.fs.FsTracer.get(FsTracer.java:42)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2683)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2733)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2715)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:382)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:181)
at org.apache.accumulo.core.volume.VolumeConfiguration.getConfiguredBaseDir(VolumeConfiguration.java:74)
at org.apache.accumulo.core.volume.VolumeConfiguration.getVolumeUris(VolumeConfiguration.java:96)
at org.apache.accumulo.server.fs.VolumeManagerImpl.get(VolumeManagerImpl.java:407)
at org.apache.accumulo.server.init.Initialize.main(Initialize.java:614)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.accumulo.start.Main$1.run(Main.java:141)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.htrace.core.Tracer$Builder
I'm not sure why I am getting class errors for htrace; I didn't think that was required for Accumulo. Or is it because the CDH repos build Accumulo with htrace on the classpath?
Any ideas?
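The missing class, org.apache.htrace.core.Tracer$Builder, appears to come from the htrace-core4 jar that the CDH Hadoop libraries depend on, while Accumulo 1.6 only ships the older htrace-core 3.x jar, so init cannot resolve it. My best guess at a workaround (unverified; the paths below are assumptions about the QuickStart layout) is to put Hadoop's htrace-core4 jar on Accumulo's classpath:
# Find the htrace-core4 jar bundled with the CDH Hadoop client (path is an assumption)
[cloudera@quickstart ~]$ find /usr/lib/hadoop* -name 'htrace-core4*.jar'
# Symlink it into Accumulo's lib directory so the service classpath picks it up
[cloudera@quickstart ~]$ sudo ln -s /usr/lib/hadoop/lib/htrace-core4-*.jar /usr/lib/accumulo/lib/
# Retry the initialization
[cloudera@quickstart ~]$ sudo service accumulo-master init
An alternative might be to add that jar's directory to the general.classpaths property in /etc/accumulo/conf/accumulo-site.xml, the same file the init output above already suggests editing.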
Created 07-08-2016 09:53 AM
Hi, did you ever get past this error? We're getting the exact same thing.
Created 11-14-2016 12:52 AM
Worked perfectly
Created 03-13-2017 12:15 PM
yum install accumulo hasn't worked well for me.
It prompts: "You need to be root to perform this action".
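That message just means yum is being run as the unprivileged cloudera user; package installation needs root. Assuming the cloudera account has sudo rights (it normally does on the QuickStart VM), prefixing the command with sudo should get past that particular error:
[cloudera@quickstart ~]$ sudo yum install accumulo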