Member since: 12-22-2016
Posts: 11
Kudos Received: 6
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 15134 | 01-04-2017 08:43 PM
05-24-2017 08:52 PM
I am trying to access a Phoenix table through the interpreter, like this:
%jdbc(phoenix)
select * from xxxxxx
The error I am getting is:
java.lang.ClassLoader.loadClass(ClassLoader.java:357)
java.lang.Class.forName0(Native Method)
java.lang.Class.forName(Class.java:264)
org.apache.zeppelin.jdbc.JDBCInterpreter.getConnection(JDBCInterpreter.java:214)
org.apache.zeppelin.jdbc.JDBCInterpreter.getStatement(JDBCInterpreter.java:275)
org.apache.zeppelin.jdbc.JDBCInterpreter.executeSql(JDBCInterpreter.java:336)
org.apache.zeppelin.jdbc.JDBCInterpreter.interpret(JDBCInterpreter.java:442)
org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
org.apache.zeppelin.scheduler.Job.run(Job.java:176)
org.apache.zeppelin.scheduler.ParallelScheduler$JobRunner.run(ParallelScheduler.java:162)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
java.util.concurrent.FutureTask.run(FutureTask.java:266)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
I have added org.apache.phoenix.jdbc.PhoenixDriver as an artifact in the Zeppelin JDBC interpreter.
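For reference, a minimal sketch of the JDBC interpreter settings this setup needs, assuming a Zeppelin-style prefixed configuration; the jar path is an assumption (a typical HDP location) and <hbase_server> is a placeholder:

```
# Interpreter properties for the 'phoenix' prefix (a sketch; values depend on the cluster)
phoenix.driver     org.apache.phoenix.jdbc.PhoenixDriver
phoenix.url        jdbc:phoenix:<hbase_server>:2181:/hbase-unsecure
phoenix.user       phoenixuser
phoenix.password

# Dependencies: the artifact must resolve to the full Phoenix client jar,
# not just the driver class name - a ClassNotFound on the driver usually
# means the jar itself is not on the interpreter classpath. Assumed path:
/usr/hdp/current/phoenix-client/phoenix-client.jar
```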
Labels:
- Apache Zeppelin
05-12-2017 06:23 AM
I am using the %spark interpreter and executing JDBC code to access a Phoenix table, registering the DataFrame as the temp table "cool". I am able to print the schema of the DataFrame. In a second paragraph, using the %sql interpreter, I try to read the table and display its contents with a select statement, and I get the following error. The code is as follows:

1st paragraph:
%spark
val table = sqlContext.read.format("jdbc").options(Map(
  "driver" -> "org.apache.phoenix.jdbc.PhoenixDriver",
  "url" -> "jdbc:phoenix:<hbase_server>:2181:/hbase-unsecure",
  "dbtable" -> "FBC_DEV_CORR")).load()
table.registerTempTable("cool")

2nd paragraph:
%sql
select * from cool

Error:
java.lang.IllegalStateException: SparkContext has been shutdown
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1848)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1869)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1882)
at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:212)
at org.apache.spark.sql.execution.Limit.executeCollect(basicOperators.scala:165)
at org.apache.spark.sql.execution.SparkPlan.executeCollectPublic(SparkPlan.scala:174)
at org.apache.spark.sql.DataFrame$anonfun$org$apache$spark$sql$DataFrame$execute$1$1.apply(DataFrame.scala:1499)
at org.apache.spark.sql.DataFrame$anonfun$org$apache$spark$sql$DataFrame$execute$1$1.apply(DataFrame.scala:1499)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:56)
at org.apache.spark.sql.DataFrame.withNewExecutionId(DataFrame.scala:2086)
at org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$execute$1(DataFrame.scala:1498)
at org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$collect(DataFrame.scala:1505)
at org.apache.spark.sql.DataFrame$anonfun$head$1.apply(DataFrame.scala:1375)
at org.apache.spark.sql.DataFrame$anonfun$head$1.apply(DataFrame.scala:1374)
at org.apache.spark.sql.DataFrame.withCallback(DataFrame.scala:2099)
at org.apache.spark.sql.DataFrame.head(DataFrame.scala:1374)
at org.apache.spark.sql.DataFrame.take(DataFrame.scala:1456)
at org.apache.spark.sql.DataFrame.showString(DataFrame.scala:170)
at org.apache.spark.sql.DataFrame.show(DataFrame.scala:350)
at org.apache.spark.sql.DataFrame.show(DataFrame.scala:311)
at org.apache.spark.sql.DataFrame.show(DataFrame.scala:319)
at $iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC.<init>(<console>:40)
at $iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC.<init>(<console>:45)
at $iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC.<init>(<console>:47)
at $iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC.<init>(<console>:49)
at $iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC.<init>(<console>:51)
at $iwC$iwC$iwC$iwC$iwC$iwC$iwC.<init>(<console>:53)
at $iwC$iwC$iwC$iwC$iwC$iwC.<init>(<console>:55)
at $iwC$iwC$iwC$iwC$iwC.<init>(<console>:57)
at $iwC$iwC$iwC$iwC.<init>(<console>:59)
at $iwC$iwC$iwC.<init>(<console>:61)
at $iwC$iwC.<init>(<console>:63)
at $iwC.<init>(<console>:65)
at <init>(<console>:67)
at .<init>(<console>:71)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:717)
at org.apache.zeppelin.spark.SparkInterpreter.interpretInput(SparkInterpreter.java:928)
at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:871)
at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:864)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Note: I tried accessing the table in paragraph 1 itself with the following code, but I get the same error.
val roger = sqlContext.sql("select * from cool limit 10")
roger.show()
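For reference, here is the whole flow as a single, self-contained %spark paragraph (a sketch, not a fix: it assumes the Phoenix client jar is already on the Spark interpreter classpath, and keeps the placeholder host and table name from above):

```scala
// Same flow in one %spark paragraph, so the %sql interpreter is out of the picture.
// <hbase_server> is a placeholder; FBC_DEV_CORR is the table from the post.
val table = sqlContext.read
  .format("jdbc")
  .options(Map(
    "driver"  -> "org.apache.phoenix.jdbc.PhoenixDriver",
    "url"     -> "jdbc:phoenix:<hbase_server>:2181:/hbase-unsecure",
    "dbtable" -> "FBC_DEV_CORR"))
  .load()

table.printSchema()              // metadata only - succeeds without running a Spark job
table.registerTempTable("cool")  // register for SQL access

val roger = sqlContext.sql("select * from cool limit 10")
roger.show()                     // first action that actually launches executors
```

This also explains why printSchema succeeds while show() fails: the schema is fetched from JDBC metadata on the driver at load time, whereas show() is the first action that needs a live SparkContext and running executors.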
Labels:
- Apache Spark
- Apache Zeppelin
05-09-2017 10:12 PM
I started using the %jdbc(phoenix) interpreter, which is provided by default. I am trying to see the list of tables using Phoenix's !tables command, and I get the following error:
ERROR 601 (42P00): Syntax error. Unexpected char: '!'
class org.apache.phoenix.exception.PhoenixParserException
org.apache.phoenix.exception.PhoenixParserException.newException(PhoenixParserException.java:33)
org.apache.phoenix.parse.SQLParser.parseStatement(SQLParser.java:118)
org.apache.phoenix.jdbc.PhoenixStatement$PhoenixStatementParser.parseStatement(PhoenixStatement.java:1185)
org.apache.phoenix.jdbc.PhoenixStatement.parseStatement(PhoenixStatement.java:1268)
org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1339)
org.apache.zeppelin.jdbc.JDBCInterpreter.executeSql(JDBCInterpreter.java:356)
org.apache.zeppelin.jdbc.JDBCInterpreter.interpret(JDBCInterpreter.java:442)
org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
org.apache.zeppelin.scheduler.Job.run(Job.java:176)
org.apache.zeppelin.scheduler.ParallelScheduler$JobRunner.run(ParallelScheduler.java:162)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
java.util.concurrent.FutureTask.run(FutureTask.java:266)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)
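A hedged note, not part of the original post: !tables is a sqlline shell directive rather than SQL, so the JDBC interpreter hands it to the Phoenix parser verbatim, which rejects the '!'. One SQL-only way to list tables is to query the Phoenix system catalog, sketched below:

```sql
-- List user tables via the Phoenix system catalog (a sketch;
-- in SYSTEM.CATALOG, TABLE_TYPE 'u' = user table, 'v' = view)
SELECT DISTINCT TABLE_SCHEM, TABLE_NAME
FROM SYSTEM.CATALOG
WHERE TABLE_TYPE = 'u'
```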
Labels:
- Apache Phoenix
- Apache Zeppelin
01-04-2017 08:43 PM
2 Kudos
Yes, I figured out how to run a jar in NiFi using the ExecuteStreamCommand processor. I have a piece of Java code along with a library jar; by creating a manifest file, I built a single master jar, which then needs to be run in the NiFi environment to get the data.
Refer to:
http://stackoverflow.com/questions/8890747/creating-a-jar-file-which-contains-other-library-files
http://stackoverflow.com/questions/1082580/how-to-build-jars-from-intellij-properly
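For reference, a minimal sketch of the manifest those links describe. Manifest files do not support comments, so the hedging lives here: com.example.EngagementMain and the lib/*.jar entries are illustrative placeholders, not from the original post.

```
Main-Class: com.example.EngagementMain
Class-Path: lib/httpclient.jar lib/commons-io.jar
```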
Once the jar is ready, it is conventionally run with java -jar <jarfilename>.jar.
In NiFi, we need to use the ExecuteStreamCommand processor.
To trigger the action:
1) Create a GenerateFlowFile processor.
2) In the ExecuteStreamCommand processor, update the properties as follows (here my jar name is engagement_3.jar). The command to run a jar is java -jar <jarfilename>.jar, but we need to omit "java" and give only the rest of the command, using ";" as the delimiter.
So the Command Arguments value is -jar;engagement_3.jar (in general, -jar;<jarfilename>.jar).
In Command Path, specify the path to the Java executable, <JAVA_HOME>\bin\java, but NOT <JAVA_HOME>\bin\java.exe. Working Directory is the local directory where your jar is hosted. The rest of the ExecuteStreamCommand properties are summarized in the sketch below.
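A sketch of the key property values reconstructed from the description above; the Working Directory path is an assumption (wherever the jar lives locally), everything else comes from the post:

```
# ExecuteStreamCommand properties (a sketch)
Command Arguments    -jar;engagement_3.jar
Command Path         <JAVA_HOME>\bin\java     # no .exe extension
Working Directory    C:\nifi\jars             # assumption: local folder holding the jar
Argument Delimiter   ;
```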
01-03-2017 08:25 PM
1 Kudo
Greetings community, I am aware of the ExecuteScript and InvokeScriptedProcessor processors, where we can run Python, Ruby, and Groovy code, but I am looking for a processor where I can run Java code, with the additional ability to add external jars as well. I found that the ExecuteStreamCommand processor can be used, but did not find any examples or ways to run Java with it. Can anyone direct me to examples or templates where Java code or a jar is executed?
Labels:
- Apache NiFi
01-03-2017 07:00 PM
Thanks for the reply, Matt. I carefully read your reply, but I only find a python execution engine, not a jython option, in both ExecuteScript and InvokeScriptedProcessor.
01-03-2017 04:49 PM
1 Kudo
Greetings community, I am working on a Python script which uses an HTTP POST request to get data. I used the InvokeHTTP processor and worked hard to get data with it, but later figured out that the API developer provides some authentication libraries, so I need to execute the specific Python code supplied by the API developer to get the data. Now I am trying to run that Python code with some additional libraries. I took the reference below, where in Groovy the additional libraries are hosted in a local directory and the path is given in the Module Directory property of the ExecuteScript processor:
https://community.hortonworks.com/questions/47493/nifi-executescript-using-external-libarries-with-g.html
I am doing the same by specifying the Python library locations (folder locations), comma separated. I tried forward and back slashes while specifying the folder locations, and even tried specifying the main Python file (__init__.py), but for all these attempts the error is: No Module named xxx NOT FOUND in line XXX.
Here is what I specified in Module Directory in the ExecuteScript processor:
C:\Users\pjalla\AppData\Local\Programs\Python\Python35-32\Lib\site-packages\requests\__init__.py,C:\Users\pjalla\AppData\Local\Programs\Python\Python35-32\Lib\json\__init__.py,C:\Users\pjalla\AppData\Local\Programs\Python\Python35-32\Lib\site-packages\requests_oauthlib\oauth1_auth.py,C:\Users\pjalla\AppData\Local\Programs\Python\Python35-32\Lib\site-packages\requests_oauthlib\oauth1_session.py
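A hedged aside, not from the original post: Module Directory expects directories (or jars) whose contents are added to the script engine's module path, not individual __init__.py files, and NiFi's "python" engine is actually Jython, so CPython-only packages may still fail even with correct paths. A directory-level value built from the same install would look something like this (an assumption, reusing the paths above):

```
C:\Users\pjalla\AppData\Local\Programs\Python\Python35-32\Lib\site-packages,C:\Users\pjalla\AppData\Local\Programs\Python\Python35-32\Lib
```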
Labels:
- Apache NiFi