Member since: 05-02-2016
Posts: 154
Kudos Received: 54
Solutions: 14

My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 2127 | 07-24-2018 06:34 PM |
 | 3327 | 09-28-2017 01:53 PM |
 | 790 | 02-22-2017 05:18 PM |
 | 7684 | 01-13-2017 10:07 PM |
 | 1923 | 12-15-2016 06:00 AM |
01-13-2017
07:18 PM
If you have some other process running on port 8080, you can change NiFi to use a different port by changing the value in conf/nifi.properties. Replace 8080 with whatever port you want, as in the example below.
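For example, to move the web UI to port 9090 (9090 is just an illustrative choice), the property in conf/nifi.properties would look like this; restart NiFi afterwards for the change to take effect:

nifi.web.http.port=9090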
... View more
01-13-2017
07:13 PM
@Ranjit S can you confirm nothing else is running on port 8080? This is the port used by NiFi for the web UI. You can look into logs/nifi-bootstrap.log and logs/nifi-app.log and see if there are any errors; a couple of commands that help with both checks are below.
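Assuming a Linux host, something like the following will show whether another process already owns the port and what the recent log entries say:

netstat -tlnp | grep 8080
tail -n 100 logs/nifi-bootstrap.log
tail -n 100 logs/nifi-app.log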
... View more
01-13-2017
02:20 PM
Do you have any logs for the internal server error? It may be in the nifi-app.log file.
... View more
01-06-2017
04:21 PM
Is it possible that the timezone was changed after you started NiFi? If possible, try restarting NiFi, and it should pick up the correct time zone.
... View more
01-06-2017
03:54 PM
Do you have NiFi installed on your machine locally, or is it on a server? It is possible that NiFi is reflecting the server's time in the UI.
... View more
12-19-2016
03:00 PM
You may be missing something in the S3 policy. It looks like the code is trying to read the data after uploading and then failing. Maybe you should grant permission to read what is being uploaded to the bucket; a sketch of such a policy is below.
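A rough sketch of an IAM policy statement that allows both writing objects and reading them back (the bucket name is a placeholder; your policy may need other actions as well):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::your-bucket/*"
    }
  ]
}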
... View more
12-15-2016
07:08 PM
You could use a simple Python script with ExecuteScript to achieve that: http://funnifi.blogspot.com/2016/03/executescript-json-to-json-revisited_14.html
... View more
12-15-2016
05:20 PM
Is port 16020 open on the nodes, especially hostname-007.localdomain.local? You can check it with the command below.
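A quick connectivity check from the client machine, assuming nc (netcat) is installed (telnet to the same host and port works too):

nc -zv hostname-007.localdomain.local 16020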
... View more
12-15-2016
03:42 PM
@regie canada no, you don't have to, it should work out of the box.
... View more
12-15-2016
03:38 PM
1 Kudo
https://community.hortonworks.com/articles/71719/using-snappy-and-other-compressions-with-nifi-hdfs.... I posted the above link on how you can use compression; try it and let me know how it goes. I don't think you can use LZO, as it is not shipped as part of NiFi. You can try doing a yum install of LZO, then use ExecuteStreamCommand to compress the file, and then do PutHDFS with None for the compression codec. A rough outline of that approach is below.
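A rough outline, assuming the lzop package provides the LZO command-line compressor on your NiFi host (package name and processor settings here are illustrative, not tested):

yum install -y lzop

Then, in the flow, an ExecuteStreamCommand processor in front of PutHDFS could run the compressor:
- Command Path: /usr/bin/lzop
- Command Arguments: -c (so the compressed stream is written to stdout and becomes the new flowfile content)
- PutHDFS Compression codec: NONE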
... View more
12-15-2016
03:11 PM
5 Kudos
A common question that comes up is how to use Snappy and other compression codecs when loading data into HDFS with the NiFi PutHDFS processor. A common error that users come across is "java.lang.UnsatisfiedLinkError". This error occurs because Snappy and the other compression codecs are implemented in native Linux binaries, which the Java codec libraries call to do the actual compression, and those binaries are not on the JVM's library path. Since the JVM cannot find them, the binding fails and you get a java.lang.UnsatisfiedLinkError. Follow these steps to resolve the issue.

1. Copy the native folder containing the compression libraries from one of your Hadoop nodes:

cd /usr/hdp/x.x.x.x-xx/hadoop/lib/
tar -cf ~/native.tar native/

2. scp the native.tar from your Hadoop node to your NiFi node and untar it to a location of your choice. In my case I use /home/myuser/hadoop/:

cd ~
mkdir hadoop
cd hadoop
tar -xf /path/to/native.tar

3. Go to your NiFi folder, open conf/bootstrap.conf, and add a JVM argument setting java.library.path to the folder containing the native Hadoop binaries (/home/myuser/hadoop/native in my case):

java.arg.15=-Djava.library.path=/home/myuser/hadoop/native/

I used 15 because that was the number of the last JVM argument in my bootstrap.conf. Alternatively, you can edit bin/nifi-env.sh and add:

export LD_LIBRARY_PATH=/home/myuser/hadoop/native/

4. Restart NiFi.
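As a sanity check after step 2, the untarred folder should contain the Hadoop native libraries; on a typical HDP install you would expect to see something along these lines (exact file names vary by version):

ls /home/myuser/hadoop/native/
libhadoop.so  libhadoop.so.1.0.0  libsnappy.so.1  ...

If libsnappy is not in the folder you copied, the UnsatisfiedLinkError will persist even after the JVM argument is added.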
... View more
Labels:
- gethdfs
- HDFS
- How-ToTutorial
- NiFi
- nifi-processor
- Sandbox & Learning
- snappy
12-15-2016
06:36 AM
You may need header information to convert this to JSON; technically, even the xsl output you have is CSV, just delimited with pipes. I think you can directly use the ExecuteScript processor to call a Python script and go from xsl to JSON.

GetFile (read the file with the xsl data) -> SplitText (split the data into lines) -> ExecuteScript (with the script below to convert to JSON) -> MergeContent (merge contents based on the fragment.identifier attribute set by SplitText) -> PutFile (gives you JSON files)

Example script for converting to JSON:

import json
import java.io
from org.apache.commons.io import IOUtils
from java.nio.charset import StandardCharsets
from org.apache.nifi.processor.io import StreamCallback

class PyStreamCallback(StreamCallback):
    def __init__(self):
        pass

    def process(self, inputStream, outputStream):
        # the header for the xsl columns; these names become the JSON field names
        header = ["column1", "column2", "column3"]  # extend this list to match your columns
        text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
        output = {}
        index = 0  # counter to keep track of the column, so we can assign a name to each value
        for column in text.split("|"):
            output[header[index]] = column
            index += 1
        outputStream.write(bytearray(json.dumps(output, indent=4).encode('utf-8')))

flowFile = session.get()
if flowFile != None:
    flowFile = session.write(flowFile, PyStreamCallback())
    flowFile = session.putAttribute(flowFile, "filename", flowFile.getAttribute('filename').split('.')[0] + '_translated.json')
    session.transfer(flowFile, REL_SUCCESS)
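To illustrate what the script produces, with the three example header names above a split line such as a|b|c would come out of ExecuteScript roughly as the following (key order may differ, since it is built from a plain dict):

{
    "column1": "a",
    "column2": "b",
    "column3": "c"
}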
... View more
12-15-2016
06:16 AM
I guess, based on your comment, you can use the ReplaceText processor to replace all occurrences of | with ,. A possible configuration is sketched below.
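One possible ReplaceText configuration (the backslash escaping assumes the Regex Replace strategy, since | is a regex metacharacter):

- Replacement Strategy: Regex Replace
- Search Value: \|
- Replacement Value: ,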
... View more
12-15-2016
06:00 AM
1 Kudo
@regie canada you want to convert XSL to CSV? just wanting to confirm you didn't mean XLS.
... View more
12-14-2016
12:35 AM
any chance you are on OS X?
... View more
12-13-2016
02:56 PM
In your error logs, it says that com.mysql.jdbc.Driver could not be located. Can you check whether the MySQL JDBC jar is in the location used by the service check? As per the logs it should be at /usr/hdp/current/hive-server2/lib/mysql-connector-java.jar.
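A quick way to check, and one way to put the jar in place if it is missing (the source path /usr/share/java/mysql-connector-java.jar is only a common location; adjust it to wherever your connector jar actually lives):

ls -l /usr/hdp/current/hive-server2/lib/mysql-connector-java.jar
cp /usr/share/java/mysql-connector-java.jar /usr/hdp/current/hive-server2/lib/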
... View more
12-12-2016
07:08 PM
1 Kudo
You will have to include the path to the LZO codec binaries in the NiFi bootstrap configuration. Add an entry like this to the bootstrap.conf file, pointing java.library.path at the directory that contains your lzocodec .so file: java.arg.15=-Djava.library.path=/path/to/your/lzo/native/
... View more
12-07-2016
05:25 PM
1 Kudo
You can use Expression Language. In your SelectHiveQL query you put your query as select * from tmp where last_name=${name}. name will be replaced by the attribute value from the flow file produced by your previous processors. So add an UpdateAttribute before the SelectHiveQL processor and add an attribute called name with whatever value you want to set; a sketch of the two processors is below.
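A sketch of the two processor configurations (the value smith is just an example; note that a string literal in HiveQL needs quotes around the substituted value):

UpdateAttribute:
- name = smith

SelectHiveQL:
- HiveQL Select Query: select * from tmp where last_name = '${name}'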
... View more
12-07-2016
05:03 PM
try setting install.mvn to false and see if that helps.
... View more
12-07-2016
04:19 PM
Not sure what the issue is. Are you following this guide: https://github.com/hortonworks-gallery/ambari-vnc-service? The only difference I see between the guide and your screenshot is the IntelliJ location: you have https, while the guide has http. Try that.
... View more
12-07-2016
04:03 PM
In the curl where you get the error you are using https, while in the curl command from ssh you are using http.
... View more
12-07-2016
03:13 PM
To be clear: modify /etc/sudoers on each node to allow ambari2 to sudo without a password:

ambari2 ALL=(ALL) NOPASSWD: ALL

then change the ssh command to:

ssh -t ambari2@host "sudo systemctl restart ambari-agent"
... View more
12-07-2016
03:05 PM
In your ssh command, ssh ambari2@$host, you have ambari2, but in your question you say you installed as ambari, so I just wanted to confirm that ambari2 is correct. If you are able to ssh correctly, there are two issues. First, I think ambari-agent runs as root, so you need to do a sudo, and you have to pass -t to your ssh command: ssh -t ambari2@host (alternatively you can disable requiretty in /etc/sudoers). Second, in /etc/sudoers of each remote machine, you have to allow ambari2 to run commands without a password, like this: ambari2 ALL=(ALL) NOPASSWD: ALL
... View more
12-07-2016
02:43 PM
You have ambari2 as the username; maybe you need to use ambari. Also, has the ambari user been set up with passwordless ssh to the other nodes?
... View more
12-07-2016
03:39 AM
Try this: !connect jdbc:hive2://sandbox.hortonworks.com:10001/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM Are you sure that is the correct principal? When you use Kerberos, you are using a Kerberos ticket (obtained with kinit or from a keytab) to provide credentials; you do not need a username and password. See the illustration below.
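For illustration, a typical sequence on the sandbox looks something like this (the keytab path and principal are placeholders; use whatever principal you actually have a ticket for):

kinit -kt /etc/security/keytabs/hive.service.keytab hive/sandbox.hortonworks.com@EXAMPLE.COM
beeline
!connect jdbc:hive2://sandbox.hortonworks.com:10001/default;principal=hive/sandbox.hortonworks.com@EXAMPLE.COM

When beeline prompts for a username and password, they can be left blank; the Kerberos ticket carries the credentials.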
... View more
12-06-2016
05:14 AM
Send 2, don't send 3. It has to be the same version; it will become 3 when it gets modified.
... View more
12-06-2016
04:46 AM
Make sure the version is correct by first doing a GET; it will give you the current version. You have to send the version that the GET returns. A rough sketch of the sequence is below.
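Roughly, against the NiFi REST API the pattern looks like this (the processor id, host, and component fields are placeholders; the exact body depends on what you are updating):

curl http://localhost:8080/nifi-api/processors/<processor-id>
# note the "revision": {"version": 2} field in the response, then send that same version back:
curl -X PUT -H 'Content-Type: application/json' \
  -d '{"revision":{"version":2},"component":{"id":"<processor-id>","name":"newName"}}' \
  http://localhost:8080/nifi-api/processors/<processor-id>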
... View more
12-01-2016
04:23 AM
I have made the needed code changes. Unfortunately, I do not have a way to test this. If I sent you the NAR with the code change, would you be able to test it? Basically, just swap out the nifi-standard-nar-x.x.x from lib with the one I send you. Please test it on a local, non-prod instance.
... View more