Member since: 03-19-2019
Posts: 11
Kudos Received: 1
Solutions: 0
04-02-2019 08:49 AM
Thank you, indeed it was an internet issue: changing the DNS resolver in /etc/resolv.conf worked. The default was 127.0.0.11, but this kept spitting errors; changing it to 8.8.8.8 worked, though I'm not sure why.
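For reference, the fix was a one-line edit; a minimal sketch of the change, assuming Google's public resolver is acceptable in your environment:

# /etc/resolv.conf
# the default entry that kept failing:
#   nameserver 127.0.0.11
# replacing it with a public resolver fixed the lookups:
nameserver 8.8.8.8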
04-01-2019 08:43 PM
1 Kudo
I found a workaround by installing it locally: I downloaded the JAR from Maven (https://mvnrepository.com/artifact/org.apache.zeppelin/zeppelin-shell/0.8.0) and passed it via the --artifact argument. This worked:

# /usr/hdp/current/zeppelin-server/bin/install-interpreter.sh --name shell --artifact /var/tmp/zeppelin-jar/zeppelin-shell-0.8.0.jar
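For completeness, the whole local install boils down to fetching the JAR and pointing the installer at it. A rough sketch, assuming Maven Central's standard directory layout and /var/tmp/zeppelin-jar as a staging directory (both just my choices, adjust as needed):

# download the interpreter JAR from Maven Central (URL follows the standard repo layout)
mkdir -p /var/tmp/zeppelin-jar
wget -P /var/tmp/zeppelin-jar https://repo1.maven.org/maven2/org/apache/zeppelin/zeppelin-shell/0.8.0/zeppelin-shell-0.8.0.jar

# install from the local file instead of resolving from a remote repository
/usr/hdp/current/zeppelin-server/bin/install-interpreter.sh --name shell --artifact /var/tmp/zeppelin-jar/zeppelin-shell-0.8.0.jar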
04-01-2019 08:43 PM
While trying to install the shell interpreter for Zeppelin (v0.8.0), it fails with this message:

# sudo /usr/hdp/current/zeppelin-server/bin/install-interpreter.sh --name shell
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/zeppelin/lib/interpreter/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/zeppelin/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/zeppelin/lib/slf4j-simple-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Install shell(org.apache.zeppelin:zeppelin-shell:0.8.0) to /usr/hdp/current/zeppelin-server/interpreter/shell ...
org.sonatype.aether.RepositoryException: Cannot fetch dependencies for org.apache.zeppelin:zeppelin-shell:0.8.0
at org.apache.zeppelin.dep.DependencyResolver.getArtifactsWithDep(DependencyResolver.java:179)
at org.apache.zeppelin.dep.DependencyResolver.loadFromMvn(DependencyResolver.java:128)
at org.apache.zeppelin.dep.DependencyResolver.load(DependencyResolver.java:76)
at org.apache.zeppelin.dep.DependencyResolver.load(DependencyResolver.java:93)
at org.apache.zeppelin.dep.DependencyResolver.load(DependencyResolver.java:85)
at org.apache.zeppelin.interpreter.install.InstallInterpreter.install(InstallInterpreter.java:170)
at org.apache.zeppelin.interpreter.install.InstallInterpreter.install(InstallInterpreter.java:134)
at org.apache.zeppelin.interpreter.install.InstallInterpreter.install(InstallInterpreter.java:126)
at org.apache.zeppelin.interpreter.install.InstallInterpreter.main(InstallInterpreter.java:278)
Caused by: java.lang.NullPointerException
at org.sonatype.aether.impl.internal.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:352)
at org.apache.zeppelin.dep.DependencyResolver.getArtifactsWithDep(DependencyResolver.java:176)
... 8 more

I am using HDP 3.0.1.
Labels: Apache Zeppelin
04-01-2019 08:40 AM
I had this error too. The only solution I found was to shut down and reboot. Apparently, even after you shut down the Spark context in Zeppelin, it sometimes persists and hangs. The workaround for me was to stop all services in Ambari, reboot the system, and start them again; the session is then flushed. (Found it here: https://stackoverflow.com/questions/35515120/why-does-sparkcontext-randomly-close-and-how-do-you-restart-it-from-zeppelin )
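If you would rather script the cycle than click through the UI, the stop/reboot/start sequence can also be driven through the Ambari REST API. A sketch only, assuming Ambari on localhost:8080, a cluster named "mycluster", and default admin credentials (all placeholders):

# stop all services (state INSTALLED means "stopped" in Ambari terms)
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT -d '{"RequestInfo":{"context":"Stop All Services"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' http://localhost:8080/api/v1/clusters/mycluster/services

sudo reboot

# after the reboot, start everything back up
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT -d '{"RequestInfo":{"context":"Start All Services"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' http://localhost:8080/api/v1/clusters/mycluster/services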
04-01-2019 08:37 AM
I ran into this error too; the problem is that the tutorial is wrong. It assumes you can get visualizations for temporary tables without using the Spark SQL interpreter. You need to run the query against the table with the Spark SQL interpreter; that is what gives you the visualization options.
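To illustrate the pattern, a minimal sketch of two Zeppelin paragraphs, assuming a DataFrame df was already loaded and "bank" is just a placeholder view name:

%spark
// register the DataFrame as a temporary view so the SQL interpreter can see it
df.createOrReplaceTempView("bank")

%sql
-- running the query through the Spark SQL interpreter is what enables the chart options
SELECT age, count(*) AS cnt FROM bank GROUP BY age ORDER BY age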
03-27-2019 01:59 PM
It is a single-node cluster environment; how can there be network issues? I did not change any configs, that is the weird thing.
03-26-2019 11:21 PM
In a single-node test cluster environment I suddenly cannot run any MR jobs or write to HDFS. I keep getting this error:

$ hdfs dfs -put war-and-peace.txt /user/hands-on/
19/03/25 18:28:29 WARN hdfs.DataStreamer: Exception for BP-1098838250-127.0.0.1-1516469292616:blk_1073742374_1550
java.io.EOFException: Unexpected EOF while trying to read response from server
at org.apache.hadoop.hdfs.protocolPB.PBHelperClient.vintPrefixed(PBHelperClient.java:399)
at org.apache.hadoop.hdfs.protocol.datatransfer.PipelineAck.readFields(PipelineAck.java:213)
at org.apache.hadoop.hdfs.DataStreamer$ResponseProcessor.run(DataStreamer.java:1020)
put: All datanodes [DatanodeInfoWithStorage[127.0.0.1:50010,DS-b90326de-a499-4a43-a66a-cc3da83ea966,DISK]] are bad. Aborting...

"hdfs dfsadmin -report" shows me everything is fine:

$ hdfs dfsadmin -report
Configured Capacity: 52710469632 (49.09 GB)
Present Capacity: 43335585007 (40.36 GB)
DFS Remaining: 43334025216 (40.36 GB)
DFS Used: 1559791 (1.49 MB)
DFS Used%: 0.00%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0
Missing blocks (with replication factor 1): 0
Pending deletion blocks: 0
-------------------------------------------------
Live datanodes (1):
Name: 127.0.0.1:50010 (localhost)
Hostname: localhost
Decommission Status : Normal
Configured Capacity: 52710469632 (49.09 GB)
DFS Used: 1559791 (1.49 MB)
Non DFS Used: 6690530065 (6.23 GB)
DFS Remaining: 43334025216 (40.36 GB)
DFS Used%: 0.00%
DFS Remaining%: 82.21%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Xceivers: 2
Last contact: Mon Mar 25 18:30:45 EDT 2019

Any suggestions are appreciated.
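In case it helps others debugging the same thing, the DataNode log is the first place I was pointed to; a sketch, assuming HDP's default log location, which may differ on your install:

# look for write-pipeline errors around the time of the failed put
tail -n 200 /var/log/hadoop/hdfs/hadoop-hdfs-datanode-*.log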
Labels: Apache Hadoop
03-24-2019 12:19 PM
Thank you, indeed it worked with:

javac -cp `hadoop classpath` -d wordcount_classes WordCount.java
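For anyone landing here later, the full compile-package-run cycle with that fix looks roughly like this; a sketch, assuming the class name WordCount from the tutorial and placeholder input/output paths:

# compile against everything `hadoop classpath` reports, then package and run
mkdir -p wordcount_classes
javac -cp `hadoop classpath` -d wordcount_classes WordCount.java
jar -cvf wordcount.jar -C wordcount_classes/ .
hadoop jar wordcount.jar WordCount /user/hands-on/war-and-peace.txt /user/hands-on/wordcount-output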
03-23-2019 07:50 PM
Hi, I'm following Douglas Eadline's tutorial http://www.informit.com/store/hadoop-and-spark-fundamentals-livelessons-9780134770864 and am trying to compile the WordCount example, but it fails. Eadline suggests compiling it against hadoop-core.jar, which as I understand was renamed in newer versions. I tried

javac -classpath /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core.jar -d wordcount_classes WordCount.java

but I still get a load of errors. Any suggestions? Where can I find the correct jar?

WordCount.java:27: error: package org.apache.hadoop.conf does not exist
import org.apache.hadoop.conf.Configuration;
^
WordCount.java:28: error: package org.apache.hadoop.conf does not exist
import org.apache.hadoop.conf.Configured;
^
WordCount.java:29: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.Path;
^
WordCount.java:30: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.IntWritable;
^
WordCount.java:31: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.LongWritable;
^
WordCount.java:32: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.Text;
^
WordCount.java:42: error: package org.apache.hadoop.util does not exist
import org.apache.hadoop.util.Tool;
^
WordCount.java:43: error: package org.apache.hadoop.util does not exist
import org.apache.hadoop.util.ToolRunner;
^
WordCount.java:54: error: cannot find symbol
public class WordCount extends Configured implements Tool {
^
  symbol: class Configured
WordCount.java:54: error: cannot find symbol
public class WordCount extends Configured implements Tool {
^
  symbol: class Tool
WordCount.java:61: error: cannot access Closeable
public static class MapClass extends MapReduceBase
^
  class file for org.apache.hadoop.io.Closeable not found
WordCount.java:64: error: cannot find symbol
private final static IntWritable one = new IntWritable(1);
^
  symbol: class IntWritable
  location: class MapClass
WordCount.java:65: error: cannot find symbol
private Text word = new Text();
^
  symbol: class Text
  location: class MapClass
WordCount.java:67: error: cannot find symbol
public void map(LongWritable key, Text value,
^
  symbol: class LongWritable
  location: class MapClass
WordCount.java:67: error: cannot find symbol
public void map(LongWritable key, Text value,
^
  symbol: class Text
  location: class MapClass
WordCount.java:68: error: cannot find symbol
OutputCollector<Text, IntWritable> output,
^
  symbol: class Text
  location: class MapClass
WordCount.java:68: error: cannot find symbol
OutputCollector<Text, IntWritable> output,
^
  symbol: class IntWritable
  location: class MapClass
WordCount.java:83: error: cannot find symbol
implements Reducer<Text, IntWritable, Text, IntWritable> {
^
  symbol: class Text
  location: class WordCount
WordCount.java:83: error: cannot find symbol
implements Reducer<Text, IntWritable, Text, IntWritable> {
^
  symbol: class IntWritable
  location: class WordCount
WordCount.java:83: error: cannot find symbol
implements Reducer<Text, IntWritable, Text, IntWritable> {
^
  symbol: class Text
  location: class WordCount
WordCount.java:83: error: cannot find symbol
implements Reducer<Text, IntWritable, Text, IntWritable> {
^
  symbol: class IntWritable
  location: class WordCount
WordCount.java:85: error: cannot find symbol
public void reduce(Text key, Iterator<IntWritable> values,
^
  symbol: class Text
  location: class Reduce
WordCount.java:85: error: cannot find symbol
public void reduce(Text key, Iterator<IntWritable> values,
^
  symbol: class IntWritable
  location: class Reduce
WordCount.java:86: error: cannot find symbol
OutputCollector<Text, IntWritable> output,
^
  symbol: class Text
  location: class Reduce
WordCount.java:86: error: cannot find symbol
OutputCollector<Text, IntWritable> output,
^
  symbol: class IntWritable
  location: class Reduce
Labels: