Member since 09-29-2015

31 Posts
34 Kudos Received
18 Solutions

        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1768 | 08-07-2018 11:16 PM |
| | 4220 | 03-14-2018 02:56 PM |
| | 4309 | 06-15-2017 10:13 PM |
| | 11371 | 06-05-2017 01:40 PM |
| | 7961 | 05-17-2017 02:52 PM |

09-14-2020 12:49 AM

I have put a detailed solution description here. You should be able to start NiFi and its registry with this Docker Compose file. It persists several folders on the local disk, and the rest are persisted in the Docker volume called 'nifi_data', so the previous state of folders such as content_repository, database_repository, flowfile_repository, provenance_repository, state, and work is retained.
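The compose file itself is linked above rather than inlined; as a minimal sketch of the layout described (assuming the apache/nifi and apache/nifi-registry images, their default HTTP ports, and an illustrative ./nifi/logs bind-mount path that is not from the original post), it could look roughly like this:

```yaml
# Sketch only, not the exact file referenced above.
version: "3"

services:
  nifi:
    image: apache/nifi
    ports:
      - "8080:8080"          # default NiFi HTTP UI port
    volumes:
      # Named volume over the NiFi home, so content_repository, database_repository,
      # flowfile_repository, provenance_repository, state, and work survive restarts.
      - nifi_data:/opt/nifi/nifi-current
      # Example of a folder persisted on the local disk instead of in the volume.
      - ./nifi/logs:/opt/nifi/nifi-current/logs

  nifi-registry:
    image: apache/nifi-registry
    ports:
      - "18080:18080"        # default NiFi Registry HTTP port

volumes:
  nifi_data:
```

Bringing this up with `docker-compose up -d` and later tearing it down with `docker-compose down` (without `-v`) keeps the `nifi_data` volume, so those repositories come back in their previous state on the next start.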
						
					
06-01-2020 08:43 PM

Is this (now) considered a NiFi "anti-pattern"? Do you have any idea how to do this using the NiFi record serialization services? I'm under the impression that creating thousands of content files is no longer best practice, but I'm not sure how to use InvokeHTTP on a full set of records without splitting it into many FlowFiles. Any ideas?
						
					
03-14-2018 02:56 PM

Hi @Akananda Singhania,

I suspect the network configuration on your Docker Engine host is incorrect. Running the image you listed works as anticipated in a few of the environments available to me. Let's try to confirm this suspicion by running the following:

```
docker run busybox ping -c 1 files.grouplens.org
```

You should receive output similar to the following. If not, the configured DNS server is not routing to external sites appropriately.

```
PING files.grouplens.org (128.101.34.235): 56 data bytes
64 bytes from 128.101.34.235: seq=0 ttl=37 time=39.263 ms
--- files.grouplens.org ping statistics ---
1 packets transmitted, 1 packets received, 0% packet loss
round-trip min/avg/max = 39.263/39.263/39.263 ms
```

Could you provide more details about the environment in which you are running Docker? Of particular interest would be the output of:

```
cat /etc/resolv.conf
```

Another option is to explicitly specify a DNS server, such as the public ones Google provides, with a command like:

```
docker run --dns 8.8.8.8 -d -p 8080:8080 apache/nifi
```
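If the `--dns` flag fixes the lookup and you would rather apply it to every container, one daemon-wide alternative (a sketch, assuming a Linux host where the daemon configuration lives at /etc/docker/daemon.json) is to set the `dns` key there:

```json
{
  "dns": ["8.8.8.8", "8.8.4.4"]
}
```

The Docker daemon then needs a restart (for example `sudo systemctl restart docker`) for the setting to take effect.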
						
					
01-17-2019 02:57 PM

Hi, I have a proof-of-concept Kubernetes setup at https://github.com/whs-dot-hk/kubernetes-nifi-refined, which also works with Docker and Docker Compose.
						
					
03-30-2017 07:41 AM

It would be great if you could help with examples of running PowerShell scripts; that would be a great help to me.
						
					
01-17-2016 01:35 AM

@Aldrin Piri Appreciate you opening the Jira to handle this.
						
					
11-16-2017 09:18 AM

Hi @Aldrin Piri, @omer alvi, I am using GetHTTP to ingest data from a Facebook page and sending it to PutHDFS. The roadblock is that once the data from that page has been ingested and I stop the process, that particular URL is not hit again when I restart the process. What modifications can I make to my URL so that data ingestion becomes a continuous process? This is a sample of the URL I am currently using: https://graph.facebook.com/v2.11/542678889114683/?fields=name,likes,posts&access_token="my_access_token"&limit=100
						
					