Member since: 04-29-2021
Posts: 35
Kudos Received: 0
Solutions: 0
09-02-2021
02:28 AM
Hi @MattWho, what would the regular expression be if I have to put the selection condition on the third field of the data (the field I put in bold)? I want to select only the lines whose third field is 1995.
|226789|23-Feb-1996|1995|0|1|1|0|0|0|1|0|0|0|0|0|0|0|0|0|0|0|0
|226780|08-Mar-1996|1996|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0
|222507|01-Jan-1995|1995|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0
|22308|01-Jan-1995|1995|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0
|222707|01-Jan-1995|1995|0|1|0|0|0|0|0|0|1|0|0|0|0|0|0|0|1|0|0
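For what it's worth, a minimal sketch of a regular expression that anchors on the third pipe-delimited field, offered as an assumption about the intent (whole-line match, lines beginning with a pipe) rather than a confirmed answer from the thread:

  ^\|[^|]+\|[^|]+\|1995\|.*$

The two [^|]+ groups skip the first two fields, so only lines whose third field is exactly 1995 match; in a processor that does partial matching (for example a "Contains Regular Expression" strategy) the trailing .*$ can be dropped.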
09-01-2021
08:50 AM
Hi Everyone, I use ListSFTP and FetchSFTP to collect files whose lines look like the sample below. I want to filter the files based on the third field: I only want to keep the lines whose third field is 1995. A configuration sketch follows the sample lines.
|226789|23-Feb-1996|1995|0|1|1|0|0|0|1|0|0|0|0|0|0|0|0|0|0|0|0
|226780|08-Mar-1996|1996|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0
|222507|01-Jan-1995|1995|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0
|22308|01-Jan-1995|1995|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0
|222707|01-Jan-1995|1995|0|1|0|0|0|0|0|0|1|0|0|0|0|0|0|0|1|0|0
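A minimal sketch of how this could be wired after FetchSFTP, assuming RouteText is acceptable for the line-level filtering; the property values are from memory of RouteText and should be checked against the processor documentation for your NiFi version:

  RouteText
    Routing Strategy      Route to 'matched' if line matches all conditions
    Matching Strategy     Matches Regular Expression
    year1995 (dynamic)    ^\|[^|]+\|[^|]+\|1995\|.*$

Lines routed to 'matched' are the 1995 rows; everything else goes to 'unmatched'.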
Labels:
- Apache NiFi
08-17-2021
12:52 AM
Hi Everyone, I have collected data in ASN.1 format and I want to convert it using Apache NiFi. Which NiFi processors can help me?
Tags:
- NiFi
Labels:
- Apache NiFi
07-16-2021
07:12 AM
Hi @stevenmatison, I don't see any data in the Cassandra database. I don't see any errors either. Here are the screenshots.
07-15-2021
09:20 AM
Hi @stevenmatison, The goal is to be able to extend to several servers, in order to collect the files in parallel and to count the number of lines and the number of files to store in a database each day. This is the type of data contained in the data files:
|226789|23-Feb-1996|0|1|1|0|0|0|1|0|3|0|0|6|0|2|0|0|6|9|7
|226780|08-Mar-1996|4|0|2|0|0|1|0|0|0|3|0|0|0|0|0|0|0|0|8
|222507|01-Jan-1995|0|0|5|0|0|1|0|0|0|0|6|0|0|0|0|0|0|0|5
|22308|01-Jan-1995|0|1|8|0|0|6|0|0|0|2|0|4|0|0|0|6|0|0|4
|222707|01-Jan-1995|0|1|0|0|5|0|0|0|1|0|0|6|0|0|7|0|1|0|2
I collect the files, then I count the number of files and the number of lines in the files, and I want to store these values in a database. I installed Apache Cassandra and created a keyspace and a table, but when I insert the number of lines and files and then check my table, I don't see any data. The flow I set up is GetFile --> MergeContent --> CountText --> ReplaceText --> PutCassandraRecord (screenshots: the full flow and the PutCassandraRecord configuration). I want to store the number of files and the number of lines in Cassandra if possible, so could you make me a proposal for the configuration of PutCassandraRecord?
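A minimal sketch of what the flowfile could look like just before PutCassandraRecord, under several assumptions: that ReplaceText turns the counts into a one-line CSV, that MergeContent's merge.count and CountText's text.line.count attributes carry the file and line counts at that point, and that the column names load_date, file_count and line_count are purely illustrative:

  ReplaceText
    Replacement Strategy   Always Replace
    Replacement Value      ${now():format('yyyy-MM-dd')},${merge.count},${text.line.count}

  Resulting flowfile content (the CSVReader schema supplies the column names):
    2021-07-15,12,5432

  PutCassandraRecord
    Record Reader          CSVReader (schema: load_date string, file_count int, line_count int)
    Table name             mykeyspace.daily_counts

The table and column names in Cassandra must match the record schema exactly; a mismatch typically routes the flowfile to failure rather than writing a partial row.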
07-14-2021
11:38 AM
Hi Everyone, The goal is to be able to extend to several servers, in order to collect the files in parallel and to count the number of lines and the number of files to store in a database each day. I collect the files, I count the number of files and the number of lines in the files, and I want to store these values in a database. I installed Apache Cassandra and created a keyspace and a table, but when I insert the number of lines and files and then check the table, I don't see any data. The flow is GetFile --> MergeContent --> CountText --> ReplaceText --> PutCassandraRecord. Here is the configuration of the PutCassandraRecord processor (screenshot: PutCassandraRecord).
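For reference, a minimal sketch of the kind of Cassandra table such a flow could write into; the keyspace, table and column names here are illustrative assumptions, not taken from the post:

  CREATE KEYSPACE IF NOT EXISTS mykeyspace
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};

  CREATE TABLE IF NOT EXISTS mykeyspace.daily_counts (
    load_date  text PRIMARY KEY,  -- day of the load
    file_count int,               -- number of files collected that day
    line_count int                -- total number of lines counted
  );

After a run, SELECT * FROM mykeyspace.daily_counts; in cqlsh should show one row per day; if it stays empty, checking whether PutCassandraRecord routes flowfiles to its failure relationship is a good first step.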
Tags:
- NiFi
Labels:
- Apache MiNiFi
- Apache NiFi
06-21-2021
11:03 PM
Hi @ChethanYM, I just use Hadoop. I followed this tutorial, but I only did the Hadoop installation steps: https://www.youtube.com/watch?v=71EQblrUPRM&t=1375s
06-21-2021
10:23 AM
Hi @ChethanYM, I installed hadoop-2.7.1 and it's during the installation that I created the hdfs user, but the hdfs command only started working once I finished installing Hadoop. Here is the output of the command # ls -lrth | grep hdfs; it does not return anything. Yes, I can reinstall Hadoop.
[root@MASTER alternatives]# ls -lrth | grep hdfs
[root@MASTER alternatives]#
[root@MASTER alternatives]# hadoop version
Hadoop 2.7.1
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 20fe5304904fc2f5a18053c389e43cd26f7a70fe
Compiled by vinodkv on 2017-06-02T06:14Z
Compiled with protoc 2.5.0
From source with checksum 60125541c2b3e266cbf3becc5bda666
This command was run using /usr/local/hadoop-2.7.1/share/hadoop/common/hadoop-common-2.7.1.jar
[root@MASTER alternatives]#
06-21-2021
09:05 AM
Hi @ChethanYM, I set the PATH like below and tried again; when I run the command, I still get the same error.
## PATH=$PATH:$HADOOP_HOME/bin
## source ~/.bash_profile
[root@MASTER ~]# which hdfs
/usr/local/hadoop-2.7.1/bin/hdfs
[root@MASTER ~]# sudo -u hdfs hdfs dfsadmin -safemode leave
sudo: hdfs: command not found
[root@MASTER ~]# id hdfs
uid=30008(hdfs) gid=30009(hdfs) groups=30009(hdfs)
[root@MASTER ~]#
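A possible explanation, offered as an assumption rather than something confirmed here: sudo resolves commands against its own secure_path instead of the caller's PATH, so hdfs is found by which but not by sudo -u hdfs. A minimal workaround sketch using the absolute path shown above:

  sudo -u hdfs /usr/local/hadoop-2.7.1/bin/hdfs dfsadmin -safemode leave

Alternatively, /usr/local/hadoop-2.7.1/bin can be appended to the secure_path line in /etc/sudoers (edited with visudo).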
06-21-2021
01:59 AM
Hi @RohitPathak, I tried with the root/sudo user but I still get the error:
[root@MASTER ~]# sudo -u hdfs hdfs dfsadmin -safemode leave
sudo: hdfs: command not found
[root@MASTER ~]#
06-21-2021
12:26 AM
Hi @ChethanYM, I tried the following command again and got another error:
[root@MASTER ~]# su - hdfs
Last login: Sun Jun 20 17:27:39 UTC 2021 on pts/5
[hdfs@MASTER ~]$ sudo -u hdfs hdfs dfsadmin -safemode leave
sudo: hdfs: command not found
[hdfs@MASTER ~]$ id hdfs
uid=30008(hdfs) gid=30009(hdfs) groups=30009(hdfs)
[hdfs@MASTER ~]$
06-20-2021
10:48 AM
Hi @ChethanYM, when I created the hdfs user I didn't give it a password, but when I execute the command this is what I'm asked. I used the root password but that doesn't work either.
[hdfs@MASTER ~]$ hdfs dfsadmin -safemode leave
21/06/20 17:30:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
safemode: Access denied for user hdfs. Superuser privilege is required
[hdfs@MASTER ~]$ sudo hdfs dfsadmin -safemode leave
[sudo] password for hdfs:
sorry, try again.
[sudo] password for hdfs:
sorry, try again.
[sudo] password for hdfs:
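For context, the "Superuser privilege is required" message usually means the command has to run as the HDFS superuser, which is the OS user that started the NameNode. Assuming the NameNode was started as root in this setup (a guess based on the root prompts earlier in the thread), a minimal sketch would be:

  su - root
  /usr/local/hadoop-2.7.1/bin/hdfs dfsadmin -safemode leave

If the NameNode was started by some other user, the command should be run as that user instead; the hdfs account's missing password is unrelated to this check.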
06-20-2021
07:04 AM
When I execute this command, this is the output:
[hdfs@MASTER ~]$ hdfs dfs -mkdir /user
21/06/20 13:51:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: Cannot create directory /user. Name node is in safe mode
06-20-2021
04:26 AM
I installed hadoop-2.7.1 on a CentOS 7 operating system. When I execute the command # hdfs dfs -mkdir /user/Juste, I get an error:
[hdfs@MASTER ~]$ hdfs dfs -mkdir /user/Juste
21/06/20 11:04:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `/user/Juste': No such file or directory
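A minimal sketch of the usual fix, assuming the error is simply that the parent directory /user does not exist yet (hdfs dfs -mkdir does not create parent directories by default):

  hdfs dfs -mkdir -p /user/Juste   # -p creates /user and /user/Juste in one step
  hdfs dfs -ls /user               # verify the directory is there

The NativeCodeLoader warning is harmless and unrelated to the error.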
Labels:
- Apache Hadoop
- HDFS
06-17-2021
12:24 PM
Hi Everyone, the virtualization system I use is VMware; I use the HDP 2.3 Sandbox, CentOS 7 as the OS, with hadoop-2.7.1. I installed Hadoop on the Sandbox and the installation went fine. When I run the command # ./start-all.sh, I get an ssh connection-refused problem.
[root@MASTER sbin]# ./start-all.sh
This script is deprecated. Instead use start-dfs.sh and start-yarn.sh
21/06/17 19:06:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [MASTER]
root@master's password:
MASTER: starting namenode, logging to /usr/local/hadoop/logs/hadoop-root-namenode-MASTER.out
localhost: ssh: connect to host localhost port 22: Connection refused
Starting secondary namenodes [0.0.0.0]
0.0.0.0: ssh: connect to host 0.0.0.0 port 22: Connection refused
21/06/17 19:06:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-root-resourcemanager-MASTER.out
localhost: ssh: connect to host localhost port 22: Connection refused
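A minimal sketch of the usual remedy, assuming the problem is only that the ssh daemon is not running (the start scripts ssh to localhost even on a single node) and that passwordless ssh has not been set up yet:

  systemctl start sshd && systemctl enable sshd   # CentOS 7: start the ssh daemon
  ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa        # create a key (skip if one already exists)
  ssh-copy-id root@localhost                      # authorize it for localhost
  ssh localhost 'echo ok'                         # should print ok with no password prompt

After that, running start-dfs.sh and start-yarn.sh (the replacements for the deprecated start-all.sh) should not show the connection-refused errors.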
Labels:
- HDFS
- Hortonworks Data Platform (HDP)
06-14-2021
09:01 AM
Hi @MattWho, by combine I meant dragging a connection from GetFile to ExecuteProcess or ExecuteStreamCommand. I have a flow composed of a GetFile processor that collects the data and passes it to a MergeContent processor to get a single file of several lines. I want to filter its lines and then classify them, and I was thinking of using ExecuteProcess or ExecuteStreamCommand; you can also give me a suggestion.
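A minimal sketch of an ExecuteStreamCommand configuration for filtering lines, assuming a grep-based filter is acceptable; the regular expression here is only an illustration, not something taken from the thread:

  ExecuteStreamCommand
    Command Path         /bin/grep
    Command Arguments    -E;^\|[^|]+\|[^|]+\|1995\|
    Argument Delimiter   ;
    Ignore STDIN         false

The incoming flowfile content is fed to the command's stdin and the command's stdout becomes the outgoing flowfile content on the output stream relationship. Note that ExecuteStreamCommand accepts an incoming connection while ExecuteProcess does not, so GetFile --> ExecuteStreamCommand is the combination that can work here.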
06-14-2021
02:52 AM
Hi Everyone, can we combine the NiFi processors in this sense: GetFile --> ExecuteProcess, or GetFile --> ExecuteStreamCommand? If yes, how can we do it?
Labels:
- Apache NiFi
06-10-2021
05:16 AM
If I have several data sources on the same server, can I configure ListSFTP and FetchSFTP to collect the data at the same time? If yes, how can we do it? And if I have several servers from which I have to collect data, can I configure ListSFTP and FetchSFTP to collect the data at the same time? If yes, how can we do it?
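A minimal sketch of one common layout, offered as an assumption about what fits here rather than the only option: run one ListSFTP per server or per source directory and funnel all of them into a single FetchSFTP whose properties are driven by the attributes ListSFTP writes:

  ListSFTP (one per server or source directory)
    Hostname        server1.example.com        # illustrative host name
    Remote Path     /home/Data

  FetchSFTP (shared)
    Hostname        ${sftp.remote.host}
    Port            ${sftp.remote.port}
    Username        ${sftp.listing.user}
    Remote File     ${path}/${filename}

The attribute names sftp.remote.host, sftp.remote.port and sftp.listing.user are what I believe ListSFTP writes; verify them against the processor documentation for your NiFi version.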
Labels:
- Apache MiNiFi
- Apache NiFi
06-10-2021
05:05 AM
Try this: open your cmd and change directory with cd C:\kafka_2.11-2.0.0. If you change directory, you will do this:
C:\kafka_2.11-2.0.0> bin\zookeeper-server-start.bat config\zookeeper.properties
C:\kafka_2.11-2.0.0> bin\kafka-server-start.bat config\server.properties
If you don't change directory, do this:
> C:\kafka_2.11-2.0.0\bin\zookeeper-server-start.bat C:\kafka_2.11-2.0.0\config\zookeeper.properties
> C:\kafka_2.11-2.0.0\bin\kafka-server-start.bat C:\kafka_2.11-2.0.0\config\server.properties
06-04-2021
01:00 AM
Hi Everyone, I want to have the total sum of the line counts of several files that I calculated. I used the processors GetFTP --> CountText --> ReplaceText --> MergeContent --> QueryRecord, and now I have an error in the configuration of QueryRecord. How can I configure QueryRecord? If you can give me a suggestion to simplify the pipeline it would be great. Here is the error (screenshots: QueryRecord, AdCSVREADER, AdCSVRECORDSETWRITER):
07:26:14 WEST ERROR QueryRecord[id=d2555423-0179-1000-01c5-69175e4b54cc] Unable to query StandardFlowFileRecord[uuid=96594164-f103-4d34-949a-2c5a50b9aa00,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1622787564383-4, container=default, section=4], offset=27694, length=2],offset=0,name=location.csv,size=2] due to java.sql.SQLException: Error while preparing statement [SELECT SUM(cnt) as cnt FROM FLOWFILE]
org.apache.nifi.processor.exception.ProcessException: java.sql.SQLException: Error while preparing statement [SELECT SUM(cnt) as cnt FROM FLOWFILE]
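A minimal sketch of a QueryRecord setup that sums per-file counts, assuming the merged content is a CSV with one count per line under a column named cnt; my guess at the cause of the SQLException is that the reader's schema does not declare cnt as a numeric column, so the statement cannot be prepared:

  CSVReader
    Schema Access Strategy   Use 'Schema Text' Property
    Schema Text              {"type":"record","name":"counts","fields":[{"name":"cnt","type":"int"}]}

  QueryRecord dynamic property
    total                    SELECT SUM(cnt) AS total FROM FLOWFILE

If cnt comes through as a string instead, casting it may help: SELECT SUM(CAST(cnt AS INT)) AS total FROM FLOWFILE.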
Tags:
- NiFi
- QueryRecord
Labels:
- Apache NiFi
06-01-2021
02:19 PM
Hi @TimothySpann, what is the difference between CountText and QueryRecord? Can you detail the use of QueryRecord? Thanks.
06-01-2021
08:22 AM
Hi @ckumar, thanks for your help. I had used CountText, but the problem is that I have CSV files and I only see the number of files instead of the number of lines in the files.
06-01-2021
03:46 AM
Hi Everyone, I collect data files with NiFi using the ListSFTP and FetchSFTP processors. These data files contain several different lines. I want to know if there is a processor that can count the number of lines in the files as I collect them, so that I get a total number of lines. Can someone help me?
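For what it's worth, a short sketch of the approach the replies above converge on, stated as my understanding rather than a verified configuration: the CountText processor counts the lines of each flowfile and writes the result to attributes (text.line.count, as I recall), so a flow along the lines of

  ListSFTP --> FetchSFTP --> CountText --> ReplaceText (attribute to content) --> MergeContent --> QueryRecord (SUM)

carries a per-file count that can then be summed downstream, as discussed in the later QueryRecord posts.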
Labels:
- Apache NiFi
05-12-2021
01:18 PM
Hi Everyone, I collect data with NiFi to inject into Kafka, and then I want to send it to Spark for processing. The data arrives fine in Kafka, but the problem is from Kafka to Spark. I ran the ZooKeeper server and then the Kafka server. I need help.
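One frequent cause when Spark cannot read from Kafka is a missing Kafka connector package on the Spark side; this is only a guess given the limited detail here, and the versions and file names below are illustrative assumptions:

  spark-submit \
    --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.8 \
    my_streaming_job.py

  # quick broker-side check that the topic really has data:
  bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic mytopic --from-beginning

Here my_streaming_job.py and mytopic are placeholder names.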
Labels:
- Apache Kafka
- Apache NiFi
- Apache Spark
05-06-2021
05:10 PM
Hi @MattWho, I have a folder of files. Here is the configuration I tried to set up, but the problem is on the remote filename. I tried remote filename = ${/home/Data/} and remote filename = /home/Data/, but neither works (screenshots: ListSFTP settings, FetchSFTP settings).
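A minimal sketch of the FetchSFTP side when it is fed by ListSFTP, assuming ListSFTP's Remote Path points at the folder; the property values are illustrative:

  ListSFTP
    Remote Path    /home/Data

  FetchSFTP
    Hostname       ${sftp.remote.host}       # or the same host as ListSFTP
    Remote File    ${path}/${filename}

Remote File should reference the per-flowfile path and filename attributes written by ListSFTP, not a literal directory; a value like ${/home/Data/} is an Expression Language lookup of an attribute literally named "/home/Data/", which does not exist.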
05-06-2021
04:31 AM
I am trying to collect data with NiFi using ListSFTP + FetchSFTP. ListSFTP works correctly, but I have problems with the configuration of FetchSFTP. I tried ${path}/${filename} and the problem persists; I also used only the data path, but that doesn't work either. Another difficulty is that I have a folder of several files.
Tags:
- configuration
- NiFi
Labels:
- Apache NiFi
05-03-2021
01:15 PM
Hi @MattWho, thanks for the explanation of MiNiFi. When I execute the command (./nifi.sh start) and then the command (tail -f nifi-app.log), I get these two lines in the output:
2021-05-03 20:30:07,021 INFO [main] org.apache.nifi.web.server.JettyServer NiFi has started. The UI is available at the following URLs:
2021-05-03 20:30:07,021 INFO [main] org.apache.nifi.web.server.JettyServer https://<hostname/IP>:<port>/nifi
When I open http://ip:8080, the message is: This site is inaccessible.
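A quick check worth doing, offered as a guess based on the https:// URL in the log above: the UI may be listening on HTTPS rather than http://ip:8080, so the scheme and port in the browser must match what conf/nifi.properties says:

  grep 'nifi.web' conf/nifi.properties | grep -E 'http'
  # nifi.web.http.host / nifi.web.http.port   -> use http://<host>:<port>/nifi
  # nifi.web.https.host / nifi.web.https.port -> use https://<host>:<port>/nifi

If the host property is set to localhost or 127.0.0.1, the UI is only reachable from the server itself; pointing it at the server's address and opening the port in the firewall are the usual next steps.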
05-02-2021
02:53 AM
I have installed NiFi and MiNiFi on a remote Unix server. When I run the command tail -f nifi-app.log, I see all the repositories for NiFi: content_repository, database_repository, flowfile_repository, provenance_repository. When I run the same command for MiNiFi, tail -f minifi-app.log, I see three repositories: content_repository, flowfile_repository, provenance_repository.
The installation was successful. But when I open http://ip:8080 I can't get the NiFi interface; I'm told that this page is not accessible. I do the same thing for MiNiFi and I don't get an interface either: this page is not accessible. I need help.
Labels:
- Apache MiNiFi
- Apache NiFi