Member since: 09-21-2016
Posts: 14
Kudos Received: 1
Solutions: 1

My Accepted Solutions

Title | Views | Posted
---|---|---
| 26654 | 09-27-2016 08:18 AM
01-03-2018
11:49 PM
Hi @Namit Maheshwari, in the case of distcp between a source HA cluster and a destination HA cluster where the secure zones are identical and use the same EZ key, does the case of files having different EDEKs still come into the picture? In other words, given the setup above, does distcp of a file from the source secure zone to the destination secure zone give the checksum mismatch error or not? If yes, is the following statement correct: when distcp copies the file to the destination secure zone, the file is first decrypted on the source, then transferred over the wire, and then re-encrypted on the destination with a different EDEK?
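(For context, a hedged sketch, with nameservice names and paths as placeholders: in HDFS transparent encryption each file gets its own DEK, so even with the same EZ key the per-file EDEKs, and hence the file checksums, differ between clusters. The usual workaround is to skip the CRC comparison:)

```shell
# Copy between two encryption zones on HA clusters.
# -skipcrccheck avoids the checksum-mismatch failure caused by the
# destination re-encrypting the data with its own per-file EDEK;
# note that -skipcrccheck only takes effect together with -update.
hadoop distcp -update -skipcrccheck \
  hdfs://nn-src/secure/zone/data \
  hdfs://nn-dst/secure/zone/data
```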
09-12-2017
08:10 AM
1 Kudo
Thanks @Kshitij Badani, you understood my query correctly. I need to check whether I can build an Angular front end (let's say a button) that a user can click to run an angularBind paragraph in report mode, where the user has only read access. I will post my answer if I find that working.
09-07-2017
04:09 PM
Hi All, I am trying to find out whether it is possible to share a Zeppelin notebook with a user in report mode and allow him/her to run the paragraphs, while not allowing the user to view or change the code. Note:
1. When I give the user only read permission and share the notebook in report mode, the user can view the notebook but is not allowed to run the paragraphs.
2. When I give both read and write permissions, the user is not only allowed to run the code but is also able to view the code and change the mode as well.
Let me know if further clarification is needed around my query. @Timothy Spann @Paul Hargis @Kshitij Badani
Labels:
- Apache Zeppelin
09-27-2016
08:18 AM
Hi @Sindhu, I think I have the solution now. I was missing the "package" phase while creating the jar with Maven. Earlier I was using this command:

mvn clean compile assembly:single

Now I have changed the command to:

mvn clean package assembly:single

After building the jar with the new command, the Spark job runs fine. Thanks a lot @Sindhu for your help; I got to the solution because of it.
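(For anyone hitting the same thing: assembly:single only bundles classes that have already been built, so running it after compile alone can leave your main class out of the fat jar. A minimal maven-assembly-plugin configuration that works with "mvn clean package assembly:single" might look like the sketch below; the plugin version is an assumption, and the main class is the one used later in this thread.)

```xml
<!-- pom.xml fragment: builds a fat jar containing all dependencies. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.6</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <mainClass>com.training.bigdata.SparkPhoenixHbase</mainClass>
      </manifest>
    </archive>
  </configuration>
</plugin>
```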
09-27-2016
08:01 AM
@Sindhu Here is the output of jar tvf sparkPhoenixHbase-1.0-SNAPSHOT-job.jar. output.txt
09-27-2016
07:44 AM
Hi @Sindhu, I tried what you said, but I am still getting the error below:
java.lang.ClassNotFoundException: SparkPhoenixHbase
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:173)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:652)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
09-27-2016
07:23 AM
I am trying to launch a Spark job using spark-submit, but I am getting the error "Cannot load main class from JAR file":

spark-submit \
  --verbose \
  --master local[4] \
  --class com.training.bigdata.SparkPhoenixHbase \
  sparkPhoenixHbase-1.0-SNAPSHOT-job.jar
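(When spark-submit cannot load the main class, one quick sanity check, assuming the jar name from the command above, is to confirm the compiled class actually made it into the jar:)

```shell
# List the jar contents and look for the compiled main class.
# If this prints nothing, the class was never packaged, e.g. because
# the Maven "package" phase was skipped before assembly:single.
jar tvf sparkPhoenixHbase-1.0-SNAPSHOT-job.jar | grep SparkPhoenixHbase
```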
Labels:
- Apache Spark
- Apache Sqoop
09-24-2016
09:06 AM
I am getting this error while appending a local file to an HDFS file.
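(The original error text did not survive in this archive. For reference, a typical append invocation, with placeholder paths, looks like the sketch below; append also requires dfs.support.append to be enabled on the cluster, which is the default in recent Hadoop releases.)

```shell
# Append the contents of a local file to an existing HDFS file.
hdfs dfs -appendToFile /tmp/local.txt /user/hdfs/target.txt
```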
Labels:
- Apache Hadoop
09-22-2016
09:45 AM
Hi @Timothy Spann and @Jasper, I found the cause of the issue. I was not putting a colon (:) between the port (2181) and the HBase znode (hbase-unsecure) in the JDBC URL while loading the table in spark-shell.

Earlier I was loading the table as below, which gave me a "no table found" error:

val jdbcDF = sqlContext.read.format("jdbc").options(Map(
  "driver" -> "org.apache.phoenix.jdbc.PhoenixDriver",
  "url" -> "jdbc:phoenix:<host>:2181/hbase-unsecure",
  "dbtable" -> "TEST_TABLE2")).load()

But now, after putting the colon (:) between the port (2181) and the HBase znode (hbase-unsecure), I am able to load the table:

val jdbcDF = sqlContext.read.format("jdbc").options(Map(
  "driver" -> "org.apache.phoenix.jdbc.PhoenixDriver",
  "url" -> "jdbc:phoenix:<host>:2181:/hbase-unsecure",
  "dbtable" -> "TEST_TABLE2")).load()
09-22-2016
09:42 AM
Thanks a lot @Timothy Spann for your help. I am now able to load the Phoenix table in Spark. I was missing a colon (:) between the port (2181) and the HBase znode (hbase-unsecure) while loading the table in Spark. After correcting the URL, Spark loads the Phoenix table.