Member since: 03-25-2016
Posts: 142
Kudos Received: 48
Solutions: 7

My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 5774 | 06-13-2017 05:15 AM |
 | 1903 | 05-16-2017 05:20 AM |
 | 1341 | 03-06-2017 11:20 AM |
 | 7855 | 02-23-2017 06:59 AM |
 | 2223 | 02-20-2017 02:19 PM |
05-16-2017
05:22 AM
Hi @bhagan, thanks for your comment. That was also my assumption, but I was looking for confirmation.
10-19-2017
06:59 PM
Worked for me once I installed the correct version of EPEL (check that the EPEL package you installed is for CentOS 7 and not CentOS 6). If it isn't, you will get errors like:
Error: Package: R-core-3.4.1-1.el6.x86_64 (epel)
Requires: libicui18n.so.42()(64bit)
Error: Package: R-core-3.4.1-1.el6.x86_64 (epel)
Requires: libicuuc.so.42()(64bit)
Error: Package: qpdf-libs-5.1.1-5.el6.x86_64 (epel)
Requires: libpcre.so.0()(64bit)
FYI, to fix this you can just edit the repo file to point to release 7 instead of 6 (and fix the GPG check); see the sketch below.
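A minimal sketch of that edit, assuming the repo file is /etc/yum.repos.d/epel.repo and that it hard-codes the el6 release and GPG key (inspect your file first; the exact strings may differ):
# point the repo and its GPG key at release 7 instead of 6
sed -i -e 's/epel-6/epel-7/g' -e 's/EPEL-6/EPEL-7/g' /etc/yum.repos.d/epel.repo
# drop cached metadata so yum picks up the corrected repo
yum clean all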
04-26-2017
05:45 AM
PROBLEM: When running a job in a notebook through Zeppelin, I sometimes get a 500 error. The following can be seen in the livy-livy-server.log file: ...
17/04/25 14:04:30 ERROR SessionServlet$: internal error
com.fasterxml.jackson.core.JsonParseException: Illegal unquoted character ((CTRL-CHAR, code 13)): has to be escaped using backslash to be included in string value
at [Source: HttpInputOverHTTP@62c493f2; line: 1, column: 76]
at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1419)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:508)
at com.fasterxml.jackson.core.base.ParserMinimalBase._throwUnquotedSpace(ParserMinimalBase.java:472)
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._finishString2(UTF8StreamJsonParser.java:2235)
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._finishString(UTF8StreamJsonParser.java:2165)
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.getText(UTF8StreamJsonParser.java:279)
at com.fasterxml.jackson.databind.deser.std.StringDeserializer.deserialize(StringDeserializer.java:29)
at com.fasterxml.jackson.databind.deser.std.StringDeserializer.deserialize(StringDeserializer.java:12)
at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:538)
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:344)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1064)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:264)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:124)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3066)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2207)
at com.cloudera.livy.server.JsonServlet.bodyAs(JsonServlet.scala:102)
...
SOLUTION: This problem happens when the notebook code is pasted in from an external editor, e.g. Notepad or TextPad. Such editors end lines with carriage-return characters, which is exactly the CTRL-CHAR code 13 the parser is complaining about, and a raw carriage return is illegal inside an unescaped JSON string value.
To fix it, type the code in manually, or strip the carriage returns first, as in the sketch below.
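A minimal sketch of the cleanup, assuming the pasted code was saved to a hypothetical file pasted_code.txt:
# delete every carriage-return byte (\r, ASCII 13), leaving plain \n line endings
tr -d '\r' < pasted_code.txt > clean_code.txt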
04-25-2017
08:08 AM
PROBLEM:
I have just started with Livy through Zeppelin and have made some Livy interpreter config changes, e.g. Spark memory settings. When running
%livy.spark
sc.version
I am getting:
Cannot start spark
SOLUTION: When checking the application log from the RM UI, I noticed the following ...
End of LogType:launch_container.sh
LogType:stderr
Log Upload Time:Tue Apr 18 15:13:02 +0200 2017
LogLength:142
Log Contents:
Invalid maximum heap size: -Xmx0m
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
... As you can see, the Xmx value is set to 0m. Digging further, I noticed the following in the Livy interpreter settings:
livy.spark.executor.memory 512
livy.spark.driver.memory 512
The values are missing a unit suffix, which is what ends up translated into -Xmx0m above. Setting the values to
livy.spark.executor.memory 512m
livy.spark.driver.memory 512m
saving the changes, and restarting the Livy interpreter fixed the problem.
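If you want to confirm you are hitting the same thing, you can grep the container logs for the heap flag; a sketch, with application_1234567890_0001 standing in for your application id:
# pull the aggregated YARN logs and look for the -Xmx value handed to the JVMs
yarn logs -applicationId application_1234567890_0001 | grep -i xmx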
04-24-2017
01:24 PM
Problem
This problem happened on HDP 2.5.3 when running Spark on HBase. Here is the error seen in the application log: ...
17/04/11 10:12:04 WARN RecoverableZooKeeper: Possibly transient ZooKeeper, quorum=localhost:2181, exception=org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
17/04/11 10:12:05 INFO ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
17/04/11 10:12:05 WARN ClientCnxn: Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.zookeeper.ClientCnxnSocketNIO.doTransport(ClientCnxnSocketNIO.java:361)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1125)
...
Solution
To fix the problem, ensure that the hbase-site.xml file exists in /etc/spark/conf on each of the NodeManager nodes.
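A minimal sketch of that copy, assuming the HBase client config lives in /etc/hbase/conf and that nm1, nm2, nm3 stand in for your NodeManager hosts:
# push the HBase client config into the Spark conf dir on every NodeManager
for host in nm1 nm2 nm3; do
  scp /etc/hbase/conf/hbase-site.xml ${host}:/etc/spark/conf/hbase-site.xml
done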
10-09-2018
01:35 PM
It is recommended to disable cron in Zeppelin, as there is no validation of whether the user is allowed to run as hive/hdfs or any other user specified in the cron field, meaning any user can set a notebook cron to run as any other user. Disabling it is possible from HDP 2.6.5; for previous versions, please engage Hortonworks support.
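A sketch of the switch itself, assuming your Zeppelin build (HDP 2.6.5 and later) exposes the zeppelin.notebook.cron.enable property in zeppelin-site.xml:
<!-- keep notebook cron scheduling disabled unless the run-as user can be vetted -->
<property>
  <name>zeppelin.notebook.cron.enable</name>
  <value>false</value>
</property>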
07-16-2018
06:22 AM
@Daniel Kozlowski: the kill solution will work in "client" mode. In cluster mode, the driver could be running on any node of the cluster. Assuming we don't have SSH access to that node, how can one kill the driver?
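For reference, one option that does not need SSH access to the driver node is killing the whole YARN application, which takes the cluster-mode driver down with it; a sketch, with application_1234567890_0001 standing in for your application id:
# locate the application id, then kill the application (and with it the AM/driver container)
yarn application -list
yarn application -kill application_1234567890_0001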
02-06-2018
06:35 AM
@Lekya Goriparti Have a look at this: https://community.hortonworks.com/questions/26622/the-node-hbase-is-not-in-zookeeper-it-should-have.html
05-10-2017
04:58 AM
@azelmad zakaria As this is an article, please raise a separate question in HCC, refer to this one, and provide the full stack trace from your console.
03-14-2017
01:54 PM
Environment
- HDP 2.5.3
- Kerberos disabled
Problem
I have a problem using HiveContext with Zeppelin. For example, this code does not work:
%pyspark
from pyspark.sql import HiveContext
sqlContext = HiveContext(sc)
sample07 = sqlContext.table("default.sample_07")
sample07.show()
Here is the error displayed: You must build Spark with Hive. Export 'SPARK_HIVE=true' and run build/sbt assembly
Py4JJavaError: An error occurred while calling None.org.apache.spark.sql.hive.HiveContext.
: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: Permission denied
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:204)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:225)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:215)
at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:480)
at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:479)
at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40)
at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
at py4j.Gateway.invoke(Gateway.java:214)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
at py4j.GatewayConnection.run(GatewayConnection.java:209)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.io.IOException: Permission denied
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:515)
... 21 more
Caused by: java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.createTempFile(File.java:2001)
at org.apache.hadoop.hive.ql.session.SessionState.createTempFile(SessionState.java:818)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513)
... 21 more
(<class 'py4j.protocol.Py4JJavaError'>, Py4JJavaError(u'An error occurred while calling None.org.apache.spark.sql.hive.HiveContext.\n', JavaObject id=o125), <traceback object at 0x17682d8>)
Solution
Even though you are logged in to the Zeppelin UI as a user from AD/LDAP/local, the query gets executed as the zeppelin user. Hence, the zeppelin user needs write permission on the directory the hive.exec.local.scratchdir parameter points to. By default it is set to /tmp/<userName>, so the following needs to exist on the Zeppelin node: [root@dan2 ~]# ls -lrt /tmp
drwxr-xr-x. 20 zeppelin zeppelin 4096 Mar 10 16:46 zeppelin
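A minimal sketch of the setup, assuming the default /tmp/zeppelin scratch location:
# create the local scratch dir and hand ownership to the zeppelin user
mkdir -p /tmp/zeppelin
chown zeppelin:zeppelin /tmp/zeppelin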