02-14-2014
11:42 AM
Hi Chris! Looks like you may have missed some steps from the installation instructions. Can you re-run these commands and paste the results in code blocks?

[cloudera@localhost ~]$ ls -lah /usr/lib/accumulo
total 184K
drwxr-xr-x 12 accumulo accumulo 4.0K Feb 12 19:51 .
dr-xr-xr-x. 54 root root 4.0K Feb 7 06:54 ..
drwxr-xr-x 2 accumulo accumulo 4.0K Feb 7 06:55 bin
-rw-r--r-- 1 accumulo accumulo 26K Dec 18 11:32 CHANGES
drwxr-xr-x 3 accumulo accumulo 4.0K Feb 7 06:55 cloudera
drwxr-xr-x 3 accumulo accumulo 4.0K Feb 12 19:50 conf
drwxr-xr-x 2 accumulo accumulo 4.0K Feb 7 06:55 contrib
drwxr-xr-x 4 accumulo accumulo 4.0K Feb 7 06:55 docs
drwxr-xr-x 4 accumulo accumulo 4.0K Feb 12 19:50 lib
-rw-r--r-- 1 accumulo accumulo 56K Dec 18 11:32 LICENSE
drwx------ 2 accumulo accumulo 4.0K Feb 7 07:04 logs
-rw-r--r-- 1 accumulo accumulo 2.1K Dec 18 11:32 NOTICE
-rw-r--r-- 1 accumulo accumulo 31K Dec 18 11:32 pom.xml
-rw-r--r-- 1 accumulo accumulo 13K Dec 18 11:32 README
drwxr-xr-x 10 accumulo accumulo 4.0K Feb 7 06:55 src
drwxrwxr-x 2 accumulo accumulo 4.0K Feb 12 19:51 target
drwxr-xr-x 5 accumulo accumulo 4.0K Feb 7 06:55 test
lrwxrwxrwx 1 root root 11 Feb 7 08:42 walogs -> /dfs/walogs
[cloudera@localhost ~]$ ls -lah /usr/lib/accumulo/src/trace/
total 24K
drwxr-xr-x 4 accumulo accumulo 4.0K Feb 12 19:38 .
drwxr-xr-x 10 accumulo accumulo 4.0K Feb 7 06:55 ..
-rw-r--r-- 1 accumulo accumulo 1.8K Dec 18 11:32 pom.xml
drwxr-xr-x 4 accumulo accumulo 4.0K Feb 7 06:55 src
drwxrwxr-x 8 accumulo accumulo 4.0K Feb 12 19:38 target
-rwxr-xr-x 1 accumulo accumulo 1.8K Dec 18 11:32 thrift.sh
[cloudera@localhost ~]$ ls -lah /var/lib/accumulo
total 40K
drwxr-xr-x 5 accumulo accumulo 4.0K Feb 12 19:34 .
drwxr-xr-x. 45 root root 4.0K Feb 7 06:49 ..
drwxrwxr-x 2 accumulo accumulo 4.0K Feb 7 09:05 .accumulo
-rw------- 1 accumulo accumulo 824 Feb 7 08:54 .bash_history
-rw-r--r-- 1 accumulo accumulo 18 Feb 7 06:50 .bash_logout
-rw-r--r-- 1 accumulo accumulo 176 Feb 7 06:50 .bash_profile
-rw-r--r-- 1 accumulo accumulo 288 Feb 7 08:31 .bashrc
drwxrwxr-x 3 accumulo accumulo 4.0K Feb 12 19:34 .m2
drwx------ 2 accumulo accumulo 4.0K Feb 7 06:51 .ssh
-rw------- 1 accumulo accumulo 1.5K Feb 12 19:34 .viminfo
[cloudera@localhost ~]$
02-14-2014
09:18 AM
You should use the same flags I used in my example.
02-14-2014
06:39 AM
@crigano wrote: I tried mvn -Dhadoop.profile=2.0 package on the command line and in Eclipse and NetBeans builds, and I still get the following errors; it seems it can't find org.apache.hadoop:hadoop-core:jar:2.0.0-cdh4.3.0. When building examples-simple I got the following output:

========================
Some dependency artifacts are not in your local repository: your project has dependencies that are not resolved locally. Code completion in the IDE will not include classes from these dependencies or their transitive dependencies (unless they are among the open projects). Please download the dependencies, or install them manually, if not available remotely. The artifacts are: org.apache.hadoop:hadoop-core:jar:2.0.0-cdh4.3.0
===================================================

The build, however, indicated success.

This particular issue sounds like an IDE problem. Specifically, it looks like the IDE isn't reading the repository settings out of the pom, so it doesn't know how to find the CDH jars. You should be able to copy the relevant repository settings out of the pom and add them to the Maven settings for the IDE to fix that part of the problem.
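A minimal sketch of what that could look like in ~/.m2/settings.xml, assuming the IDE honors the user settings file; the profile id, repository id, and URL below are placeholders, so copy the real <repository> entries from the pom's <repositories> section:

<settings>
  <profiles>
    <profile>
      <id>cdh-repos</id>
      <repositories>
        <repository>
          <!-- the id and url here are examples; use the values declared in the Accumulo pom -->
          <id>cloudera-releases</id>
          <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
          <releases><enabled>true</enabled></releases>
          <snapshots><enabled>false</enabled></snapshots>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>cdh-repos</activeProfile>
  </activeProfiles>
</settings>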
02-14-2014
06:23 AM
Hi Chris! Let's try to keep these threads on a single problem. It sounds like the compatibility issue that started this thread has been addressed. Could you mark it resolved?

Your current problem sounds like the same one we started working on when ChunkInputFormatTest failed. Could you follow up on that question with a link to the output of running the Maven command line for building? If you use the rich text editor, there's a button for pasting in code that will help make the output more readable.

Regarding the newer release, I've had no issues using it on the QuickStart VM. However, until we figure out the root cause of your problem there's little reason to change the Accumulo version.

-Sean
02-12-2014
07:58 PM
Let's step back and confirm that your VM can properly see the necessary Maven repositories. I'm using a QuickStart VM with CDH 4.4.0, and I have the Accumulo 1.4.3-cdh4.3.0 release installed from tarballs in /usr/lib/accumulo. If I open up a terminal and navigate to the install directory, I can successfully rebuild everything, including the examples.

[cloudera@localhost ~]$ sudo su - accumulo
[accumulo@localhost ~]$ cd /usr/lib/accumulo
[accumulo@localhost accumulo]$ mvn -Dhadoop.profile=2.0 package
<...snip maven downloading dependencies from the internet...>
-------------------------------------------------------
T E S T S
-------------------------------------------------------
Running org.apache.accumulo.examples.simple.dirlist.CountTest
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Max depth : 3
Time to find max depth : 7 ms
Time to compute counts : 7 ms
Entries scanned : 30
Counts inserted : 4
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.302 sec
Running org.apache.accumulo.examples.simple.filedata.KeyUtilTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec
Running org.apache.accumulo.examples.simple.filedata.ChunkInputStreamTest
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.162 sec
Running org.apache.accumulo.examples.simple.filedata.ChunkInputFormatTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.373 sec
Running org.apache.accumulo.examples.simple.filedata.ChunkCombinerTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.037 sec
Results :
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ examples-simple ---
[INFO] Building jar: /usr/lib/accumulo/lib/examples-simple-1.4.3-cdh4.3.0.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] accumulo .......................................... SUCCESS [2.171s]
[INFO] cloudtrace ........................................ SUCCESS [3.827s]
[INFO] accumulo-start .................................... SUCCESS [20.561s]
[INFO] accumulo-core ..................................... SUCCESS [59.326s]
[INFO] accumulo-server ................................... SUCCESS [1:50.049s]
[INFO] accumulo-examples ................................. SUCCESS [0.016s]
[INFO] examples-simple ................................... SUCCESS [7.369s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3:24.868s
[INFO] Finished at: Wed Feb 12 19:53:11 PST 2014
[INFO] Final Memory: 66M/168M
[INFO] ------------------------------------------------------------------------
[accumulo@localhost accumulo]$

Can you try doing the same via the command line in your VM? If we see a failure there, it should be easier to chase down the cause.
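If it's handy for sharing the result, a minimal sketch for capturing the whole build log to a file you can paste or link (the log path is just an example):

[accumulo@localhost accumulo]$ # /tmp/accumulo-build.log is only an example path
[accumulo@localhost accumulo]$ mvn -Dhadoop.profile=2.0 package 2>&1 | tee /tmp/accumulo-build.log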
02-12-2014
06:29 PM
Hi Chris! That definitely sounds like a problem with getting NetBeans to play nicely with Maven's dependency resolution. To avoid the drag of these kinds of tooling issues, I generally recommend people stick with whatever workflow they're used to for Java development. Is there another IDE you are more familiar with?

-Sean
02-12-2014
11:52 AM
Hi Chris! This particular problem is caused by attempting to build Accumulo in Hadoop 1 mode against Hadoop 2 (specifically CDH4). In the initial Cloudera-integrated release of the Accumulo 1.4.x line, you still had to specify the Hadoop 2 profile via a Java system property. Command-line Maven would look like this:

$> mvn -Dhadoop.profile=2.0 package

I presume there is some way to specify this in the build options for NetBeans, but I am not familiar with that particular IDE. Alternatively, you could install our latest release, Accumulo 1.4.4-cdh4.5.0 (available from the archive), which defaults to building against the Hadoop 2 profile.

It might also help to know why you're rebuilding the examples. From your other outstanding question, my guess is that you're using them as a starting point to learn about general development?
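If you want to double-check that the flag is taking effect, a quick check (assuming Maven can resolve the standard maven-help-plugin) is to list the active profiles; the Hadoop 2 profile should show up for the modules that define it:

$> # help:active-profiles is a goal of the standard maven-help-plugin
$> mvn -Dhadoop.profile=2.0 help:active-profiles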
02-10-2014
10:08 PM
In this case, the warning is benign. It's caused by the inclusion of a symlink from a versionless file name to a versioned file name in the CDH4 hadoop client directory. Since they point to the same jar, you can safely ignore it. You can confirm this by doing a long listing on the directory mentioned in the error message:

[accumulo@localhost accumulo]$ ls -lah /usr/lib/hadoop/client-0.20/slf4j-log4j*
lrwxrwxrwx. 1 root root 43 Oct 7 08:33 /usr/lib/hadoop/client-0.20/slf4j-log4j12-1.6.1.jar -> /usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar
lrwxrwxrwx. 1 root root 43 Oct 7 08:33 /usr/lib/hadoop/client-0.20/slf4j-log4j12.jar -> /usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar
[accumulo@localhost accumulo]$

You can clear up the warning by modifying your accumulo-site.xml to use a regex in general.classpaths that will only match files with version numbers in the name:

<property>
<name>general.classpaths</name>
<value>
$ACCUMULO_HOME/lib/[^.].$ACCUMULO_VERSION.jar,
$ACCUMULO_HOME/lib/[^.].*.jar,
$ZOOKEEPER_HOME/zookeeper[^.].*-[0-9].*.jar,
$HADOOP_CONF_DIR,
$HADOOP_CLIENT_HOME/[^.].*-[0-9].*.jar,
$HADOOP_MAPRED_HOME/[^.].*-[0-9].*.jar,
$HADOOP_MAPRED_HOME/lib/[^.].*.jar,
</value>
</property>

Note in particular the addition of "-[0-9].*" to the regex for matching jars in the hadoop client directory. It may help to view this change in the larger context of the configuration files for working with the QuickStart VM.
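As a rough sanity check (the shell glob below only approximates the "-[0-9].*" regex, and the path assumes a packaged CDH4 install), you can list what a versioned-jars-only pattern matches in the hadoop client directory; the versionless symlinks should not appear:

[accumulo@localhost accumulo]$ # the glob *-[0-9]*.jar approximates the regex fragment -[0-9].*
[accumulo@localhost accumulo]$ ls /usr/lib/hadoop/client-0.20/*-[0-9]*.jar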
02-10-2014
09:57 PM
Hey Chris! Any luck getting the output of those two commands somewhere I can see it?
02-07-2014
01:03 PM
Hi Chris! It looks like you have some Accumulo jars in your path that don't work with Hadoop 2 (and thus CDH4). First, to make sure we're on the same page, earlier I walked through the following:

1. Get the Cloudera QuickStart VM with CDH 4.4.0
2. Download the Cloudera release for Apache Accumulo 1.4.3-cdh4.3.0
3. Follow the instructions above for installing via tarball
4. Walk through the example found in docs/README.helloworld

I was able to do all of the above successfully (after adjusting the installation classpath for a CDH installation done with packages). Does this match what you are trying to do?

Presuming it does, let's step through what could be wrong with your install. Can you open a terminal on your QuickStart VM and run the following:

$ACCUMULO_HOME/bin/accumulo version
$ACCUMULO_HOME/bin/accumulo classpath

Please get the output of both of these commands and paste it into a gist/pastebin somewhere. For comparison, here is the output of those commands on my QuickStart VM.
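If it's easier, a minimal sketch for collecting both outputs into one file for the gist (the file name is only an example):

# accumulo-diag.txt is only an example file name
$ACCUMULO_HOME/bin/accumulo version > accumulo-diag.txt 2>&1
$ACCUMULO_HOME/bin/accumulo classpath >> accumulo-diag.txt 2>&1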