Member since: 06-28-2017
Posts: 279
Kudos Received: 43
Solutions: 24
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1957 | 12-24-2018 08:34 AM
 | 5304 | 12-24-2018 08:21 AM
 | 2159 | 08-23-2018 07:09 AM
 | 9439 | 08-21-2018 05:50 PM
 | 5054 | 08-20-2018 10:59 AM
08-27-2018
08:39 AM
1 Kudo
It's described at the link, but it's just a few steps (actually for a test setup):

    docker pull registry
    docker run -d -p 5000:5000 --restart always --name registry registry:2

Now a Docker registry is available on port 5000 of the machine where you executed the above commands. The parameters have the following meaning:
-d: run the container in the background and print the container ID
-p: publish a container's port(s) to the host
--restart: restart policy to apply when the container exits
--name: assign a name to the container
For further options refer to: https://docs.docker.com/engine/reference/commandline/run/
For information on how to set it up for real use (not testing or demonstration): https://docs.docker.com/registry/deploying/
The last link provides important information, such as how to set up keys and how to use the registry in an orchestrated environment.
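To verify the registry works, you can retag an image against it and push it; a minimal sketch for this test setup (the hello-world image is just a placeholder, and localhost assumes you push from the same machine):

    # Pull a small test image, retag it for the local registry and push it
    docker pull hello-world
    docker tag hello-world localhost:5000/hello-world
    docker push localhost:5000/hello-world
    # Pull it back through the registry to confirm it is being served
    docker pull localhost:5000/hello-world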
08-23-2018
10:58 AM
It would be great if you could 'accept' the answer if you consider it helpful.
08-23-2018
07:09 AM
1 Kudo
There might be an application already storing the data in HBase, while other people would like to query this data in a SQL manner or combine it with data from other Hive tables. It is also possible that the amount of data being inserted or updated is an argument for using HBase. In principle, HBase has features to handle large amounts of data very quickly with memory-based processing, while Hive itself is a SQL layer that uses other storage engines, so that the data ends up stored one way or another in HDFS (or whatever your storage system is). HBase also uses HDFS as its persistence layer, but inserted data is available for queries even before the write to disk takes place. So a typical use case is that data is inserted and updated online in HBase, while someone needs to combine that data with other data in SQL queries. I think it is much less common to insert and update HBase tables only via Hive, but the reasons could be very different anyway, e.g. the policies of the ops team, the know-how of the people involved, a cluster that has evolved using different tools, established dev or ops procedures, etc.
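For illustration, a minimal HiveQL sketch that exposes an existing HBase table to Hive via the HBaseStorageHandler (the table, column family and column names here are made up):

    CREATE EXTERNAL TABLE hbase_events (rowkey STRING, payload STRING)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:payload")
    TBLPROPERTIES ("hbase.table.name" = "events");

Rows written to the events table in HBase then show up in Hive queries and can be joined with ordinary Hive tables.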
08-23-2018
06:43 AM
How do you initialize/load the Azure libs? Maybe you will have to configure the workdir in your processor as well? Do you have any error log? Anyway, it might be a good idea to close this issue (by accepting one answer) and create a new post for the next issue you are facing (your script seems not to connect correctly).
08-22-2018
12:59 PM
OK, then let's try the following:
Command Path: full path of python.exe
Command Arguments: full path of the script file; arguments
This way you would actually be calling an exe, as needed, and the arguments should let your Python interpreter start your Python script.
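For example, the two properties could look roughly like this (the paths are placeholders, and how the arguments are split depends on the processor's argument delimiter setting):

    Command Path:      C:\Python27\python.exe
    Command Arguments: C:\scripts\my_script.py arg1 arg2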
08-22-2018
12:09 PM
Is this what you are basically following: https://community.hortonworks.com/articles/35568/python-script-in-nifi.html , or are you more or less following this: https://community.hortonworks.com/questions/106802/execute-python-script-with-nifi.html ? Are you defining the script inside the ExecuteScript processor, or are you calling an existing script with the ExecuteProcess or ExecuteStreamCommand processor? In the latter case, the answer from Matt includes a hint to your issue: if you are using "pure" Python modules (meaning their code and dependencies are all Python and do not use natively compiled modules -- no CPython or .so/.dll, etc.), you can use ExecuteScript or InvokeScriptedProcessor. These use a Jython (not Python) engine to execute your script.
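If you go the ExecuteScript route, a minimal script body that runs on the Jython engine might look like this (it only uses the standard session and REL_SUCCESS bindings the processor provides; the attribute name is made up):

    # Minimal ExecuteScript (Jython) sketch: take a flow file, tag it, pass it on
    flowFile = session.get()
    if flowFile is not None:
        flowFile = session.putAttribute(flowFile, 'processed.by', 'jython-example')
        session.transfer(flowFile, REL_SUCCESS)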
08-22-2018
11:52 AM
This depends on the call you actually make. If you end up calling CreateProcess of the Win32 API, this is true (you can only start an exe application with that API call), and your error message makes this quite likely, as that is the exact error you get from the Win32 API. Do you start the Python script locally? On a Windows machine? And is your local Python script making the call? Or is it the NiFi flow calling a Python script on the server that is failing? Or does the server-side Python script perhaps make a call that fails?
08-22-2018
07:50 AM
It would be important to see the command itself and how it is called to be sure. But if you are using the Win32 API call CreateProcess, you can only call programs in the EXE format. In particular, you are not able to call any shell script or jar with it.
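To illustrate this from the Python side, a small sketch (the paths are placeholders): starting a process only works if the first element is an actual executable, with the script passed as an argument.

    import subprocess

    # Works: python.exe is a real EXE, the script is only an argument to it
    subprocess.call([r"C:\Python27\python.exe", r"C:\scripts\my_script.py"])

    # Likely fails on Windows, because a .py file is not an EXE that
    # CreateProcess can start directly:
    # subprocess.call([r"C:\scripts\my_script.py"])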
08-21-2018
05:50 PM
The answer saying that environment variables are not possible is from a year ago, while the latest Docker image of Apache NiFi on Docker Hub is one month old. I might be wrong, but the start script in the container seems to set the variables in the config file, so both might be true: nifi.properties itself still doesn't allow environment variables, but the Docker image handles them. Anyway, my assumption is that your error isn't related to the environment variables but to something more basic, as the error says "0: illegal option -", and we need to figure that out. Does your start-script screenshot show the $NIFI_HOME/bin/nifi-env.sh from the Docker image?
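If the image does support it, passing a setting would look roughly like this (the exact variable names the start script picks up depend on the image version, so treat NIFI_WEB_HTTP_PORT as an assumption to verify against the image's documentation):

    # Pass a config value into the container via an environment variable
    docker run -d -p 8080:8080 -e NIFI_WEB_HTTP_PORT=8080 apache/nifi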
08-21-2018
03:57 PM
Glad it is working. It would be great if you could then also accept the answer. This indicates to other viewers that your issue is solved, and it increases both your and my reputation points.