Member since 05-31-2016

- Posts: 12
- Kudos Received: 4
- Solutions: 3

        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 4827 | 03-21-2018 04:56 PM |
|  | 4610 | 10-02-2017 07:37 PM |
|  | 7015 | 02-03-2017 02:40 PM |

03-21-2018 04:56 PM

All, I've resolved the issue. In the file "src/main/resources/META-INF/services/org.apache.nifi.controller.ControllerService" I previously had "org.apache.nifi.phoenix.service.PhoenixDBCPService"; after changing it to "org.apache.nifi.phoenix.service.PhoenixConnectionPool", NiFi was able to start up and I was able to use and access my custom processors.
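For future readers, the fix amounts to the ServiceLoader descriptor naming a concrete, instantiable class rather than the service interface. A sketch of what the corrected file might contain (standard `java.util.ServiceLoader` descriptor format; the class name is the one from this post):

```
# src/main/resources/META-INF/services/org.apache.nifi.controller.ControllerService
# One fully qualified class name per line; each must be a concrete class
# with a public no-arg constructor, or NiFi fails at startup.
org.apache.nifi.phoenix.service.PhoenixConnectionPool
```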

03-21-2018 04:05 PM

Hello, I am developing a custom controller service to interact with Apache Phoenix. After building the NAR files, I hit an InstantiationException when deploying the NARs and restarting NiFi.

Could anyone with experience developing custom processors, and especially custom controller services, take a look at the error output below and point me in the right direction? I have also attached the poms as well as the bundled dependencies. Thanks in advance!

Here is the error output:

ERROR [main] org.apache.nifi.NiFi Failure to launch NiFi due to java.util.ServiceConfigurationError: org.apache.nifi.controller.ControllerService: Provider org.apache.nifi.phoenix.service.PhoenixDBCPService could not be instantiated
java.util.ServiceConfigurationError: org.apache.nifi.controller.ControllerService: Provider org.apache.nifi.phoenix.service.PhoenixDBCPService could not be instantiated
    at java.util.ServiceLoader.fail(ServiceLoader.java:232)
    at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
    at org.apache.nifi.nar.ExtensionManager.loadExtensions(ExtensionManager.java:142)
    at org.apache.nifi.nar.ExtensionManager.discoverExtensions(ExtensionManager.java:117)
    at org.apache.nifi.web.server.JettyServer.start(JettyServer.java:771)
    at org.apache.nifi.NiFi.<init>(NiFi.java:160)
    at org.apache.nifi.NiFi.main(NiFi.java:268)
Caused by: java.lang.InstantiationException: org.apache.nifi.phoenix.service.PhoenixDBCPService
    at java.lang.Class.newInstance(Class.java:427)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
    ... 7 common frames omitted
Caused by: java.lang.NoSuchMethodException: org.apache.nifi.phoenix.service.PhoenixDBCPService.<init>()
    at java.lang.Class.getConstructor0(Class.java:3082)
    at java.lang.Class.newInstance(Class.java:412)
    ... 8 common frames omitted

bundled-dependencies-phoenix-service.txt
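The root-cause chain (InstantiationException caused by NoSuchMethodException for `<init>()`) means ServiceLoader found the named class but could not invoke a public no-arg constructor on it, which is what happens when the services file names an interface or a class without a default constructor. A minimal standalone reproduction (both provider classes are hypothetical, not the poster's code):

```java
// Reproduces the failure mode behind the stack trace: ServiceLoader
// instantiates providers reflectively through a public no-arg
// constructor; if none exists, NoSuchMethodException for <init>()
// surfaces as the root cause.
public class NoArgCtorDemo {
    // Stands in for a service named incorrectly in the descriptor:
    // only a parameterized constructor, so reflective lookup fails.
    static class BadProvider {
        public BadProvider(String config) { }
    }
    // Stands in for a working provider: public no-arg constructor.
    public static class GoodProvider {
        public GoodProvider() { }
    }

    public static void main(String[] args) throws Exception {
        try {
            BadProvider.class.getConstructor().newInstance();
        } catch (NoSuchMethodException e) {
            System.out.println("bad: NoSuchMethodException for <init>()");
        }
        GoodProvider.class.getConstructor().newInstance();
        System.out.println("good: public no-arg constructor found");
    }
}
```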

Labels: Apache NiFi


10-04-2017 07:37 PM

Just commenting on this for future visitors to this post: only files that end in ".jar" are picked up by the driver class loader. Here's the relevant source code from DBCPConnectionPool.java:

    protected ClassLoader getDriverClassLoader(String locationString, String drvName) throws InitializationException {
        if (locationString != null && locationString.length() > 0) {
            try {
                // Split and trim the entries
                final ClassLoader classLoader = ClassLoaderUtils.getCustomClassLoader(
                        locationString,
                        this.getClass().getClassLoader(),
                        (dir, name) -> name != null && name.endsWith(".jar")
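To make the filter's effect concrete, here is the same lambda evaluated standalone (file names are illustrative examples):

```java
import java.io.File;
import java.io.FilenameFilter;

// The ".jar"-only filter from DBCPConnectionPool.getDriverClassLoader,
// evaluated by itself: anything not ending in ".jar" (e.g. an XML
// config file) is never handed to the driver class loader.
public class JarFilterDemo {
    public static void main(String[] args) {
        FilenameFilter jarOnly = (dir, name) -> name != null && name.endsWith(".jar");
        File anyDir = new File(".");
        System.out.println(jarOnly.accept(anyDir, "phoenix-client.jar")); // accepted
        System.out.println(jarOnly.accept(anyDir, "hbase-site.xml"));     // rejected
    }
}
```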

10-03-2017 11:17 PM (1 Kudo)

Setting concurrent tasks to 10 gives the processor the ability to request up to 10 threads from the NiFi controller, so that 10 flowfiles can be processed concurrently. However, since you have NiFi running on a single node, an individual flowfile will not be processed in parallel, i.e., subsets of the flowfile's data are not processed independently in parallel. If you have NiFi running in a cluster with more than one node, the data can be divided among the nodes and processed in parallel.

10-02-2017 11:42 PM

For ExecuteStreamCommand, one task executes your shell script with the attribute value from one flowfile. If "Concurrent Tasks" were bumped up to 10, then 10 tasks would run concurrently, i.e., your shell script would be executed 10 times in parallel, each execution using the attribute from one of the 10 flowfiles. Certain processors will process multiple flowfiles at the same time without increasing "Concurrent Tasks"; one example is the PutSQL processor, which can batch multiple insert/update statements (flowfiles) into one database transaction. However, for the ExecuteStreamCommand processor, 1 flowfile = 1 task, and flowfiles are processed sequentially: if a flowfile is still being processed and "Concurrent Tasks" = 1, the other 9 flowfiles wait in the queue. Let me know if that clears things up!
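The "Concurrent Tasks" behavior can be pictured with a plain thread pool, where each submitted task stands in for one flowfile handed to ExecuteStreamCommand. This is only an analogy, not NiFi's actual scheduler:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

// Analogy for "Concurrent Tasks": the pool size caps how many tasks
// (stand-ins for flowfiles) run at once, just as Concurrent Tasks caps
// how many flowfiles the processor works on simultaneously.
public class ConcurrentTasksDemo {
    static int maxObservedConcurrency(int concurrentTasks, int flowfiles) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(concurrentTasks);
        AtomicInteger active = new AtomicInteger();
        AtomicInteger peak = new AtomicInteger();
        CountDownLatch done = new CountDownLatch(flowfiles);
        for (int i = 0; i < flowfiles; i++) {
            pool.submit(() -> {
                int now = active.incrementAndGet();       // track in-flight tasks
                peak.accumulateAndGet(now, Math::max);    // record the peak
                try { Thread.sleep(50); } catch (InterruptedException ignored) { }
                active.decrementAndGet();
                done.countDown();
            });
        }
        done.await();
        pool.shutdown();
        return peak.get();
    }

    public static void main(String[] args) throws Exception {
        // With 1 concurrent task, 10 "flowfiles" run strictly one at a time.
        System.out.println(maxObservedConcurrency(1, 10));
        // With 10, executions overlap (the exact peak can vary with timing).
        System.out.println(maxObservedConcurrency(10, 10) > 1);
    }
}
```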

10-02-2017 07:37 PM

Hi John, one idea for achieving your scenario would be the following flow:

SelectHiveQL -> ConvertAvroToJSON -> SplitText -> EvaluateJSONPath -> ExecuteStreamCommand

or

SelectHiveQL -> SplitAvro -> ConvertAvroToJSON -> EvaluateJSONPath -> ExecuteStreamCommand

- Your Hive query runs and returns the result set as a flowfile in Avro format.
- Convert the flowfile from Avro to JSON.
- Split the JSON flowfile into multiple flowfiles, with one flowfile per row in the result set. To do this in the SplitText processor, set the property "Line Split Count" to 1.
- Use the EvaluateJSONPath processor to extract the value from the JSON object and write it to the flowfile's attributes.
- In the ExecuteStreamCommand processor, pass the values extracted by the EvaluateJSONPath processor to the property "Command Arguments" using NiFi's Expression Language, for example: ${attribute_name}

To increase the number of times the ExecuteStreamCommand processor can execute concurrently, you can adjust the "Concurrent Tasks" setting on the processor's Scheduling tab. However, this is not a dynamic property, i.e., if "Concurrent Tasks" is set to 10 today and tomorrow 11 rows are returned from the Hive query, only 10 of the 11 flowfiles will be executed concurrently.
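The convert-then-split steps above can be sketched in plain Java: ConvertAvroToJSON emits one JSON object per line, and SplitText with "Line Split Count" = 1 turns each line into its own flowfile. The sample data here is hypothetical:

```java
import java.util.Arrays;
import java.util.List;

// Sketch of what SplitText ("Line Split Count" = 1) does to the
// line-per-row JSON produced by ConvertAvroToJSON: one flowfile per row.
public class SplitTextDemo {
    public static void main(String[] args) {
        String converted = "{\"id\":1,\"arg\":\"a\"}\n"
                         + "{\"id\":2,\"arg\":\"b\"}\n"
                         + "{\"id\":3,\"arg\":\"c\"}";
        List<String> flowfiles = Arrays.asList(converted.split("\n"));
        System.out.println(flowfiles.size()); // one flowfile per result row
        System.out.println(flowfiles.get(0)); // each holds a single JSON object
    }
}
```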

10-02-2017 06:59 PM (2 Kudos)

Hello, I am having issues creating a DBCP controller service that connects to Phoenix on a kerberized cluster. I have validated the JDBC connection string by connecting via sqlline. Here is the error message I see when testing the connection by executing an ExecuteSQL processor (select * from $tbl) with the configured Phoenix DBCP controller service:

org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
java.net.SocketTimeoutException: callTimeout=60000, callDuration=68422: row 'SYSTEM:CATALOG,,' on table 'hbase:meta' at region=hbase:meta

My understanding of the above error is that it often occurs with authentication issues, in particular when the hbase-site.xml is not available on the application's classpath. My question is: how do I make the configuration resources (hbase-site.xml, core-site.xml, hdfs-site.xml) available on the classpath for the DBCP controller service? For example, the HiveConnectionPool controller service has a property "Hadoop Configuration Resources" where the hive-site.xml, core-site.xml, and hdfs-site.xml locations can be specified.

I initially tried listing the configuration file locations in the property "Database Driver Location(s)", but that did not work; after looking at the source code for the DBCPConnectionPool service, it appears that the method "getDriverClassLoader" only attempts to load files that end in ".jar". Relevant source code (DBCPConnectionPool.java, line 233): (dir, name) -> name != null && name.endsWith(".jar")

My next idea is to add these configuration files to either of the following two locations:

- NiFi's conf directory (<nifi_root_path>/nifi/conf)
- DBCP's NAR work directory (<nifi_root_path>/nifi/work/nar/extensions/nifi-dbcp-service-nar-1.0.0.2.0.1.0-12.nar-unpacked/META-INF/bundled-dependencies)

However, I don't know if either option makes sense, or whether this would have to be repeated every time an upgrade occurs. Does anyone have suggestions for adding configuration resources to the DBCP controller service's classpath? Or any general suggestions on how to make JDBC connections to Phoenix on kerberized clusters via NiFi's DBCP controller service? Any help would be greatly appreciated!

Labels: Apache NiFi, Apache Phoenix


03-13-2017 02:55 AM (1 Kudo)

Hi, I'm looking for suggestions for optimizing the speed of a bulk insert into an Oracle table. In the flow, we query a Hive table, convert the results to JSON, then split that flowfile line by line using SplitText. The result is one flowfile for each row in the Hive result set (1 flowfile -> many flowfiles). Then each flowfile is turned into a SQL insert statement that inserts a single row into an Oracle table. What I'm trying to figure out is a more efficient way of inserting into the Oracle table. I've tried manipulating the "Batch Size" and "Concurrent Tasks" properties on the PutSQL processor, but this has had a negligible effect on performance. Can anyone suggest an alternative approach, or point out something I may have overlooked? Any help would be greatly appreciated! Thanks!
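One common direction for this kind of problem is JDBC batching: instead of one INSERT round trip per row, group N rows per executeBatch() call. The sketch below shows only the grouping step; the JDBC wiring (addBatch/executeBatch on a PreparedStatement) is indicated in comments, since it needs a live Oracle connection. The helper and batch size are illustrative, not NiFi's implementation:

```java
import java.util.ArrayList;
import java.util.List;

// Grouping rows into fixed-size batches, the core idea behind reducing
// per-row INSERT round trips to one database call per batch.
public class BatchInsertSketch {
    static <T> List<List<T>> partition(List<T> rows, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            batches.add(rows.subList(i, Math.min(i + batchSize, rows.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 250; i++) rows.add(i);
        List<List<Integer>> batches = partition(rows, 100);
        // For each batch: one stmt.addBatch() per row, then one executeBatch().
        System.out.println(batches.size());        // 250 rows / 100 per batch -> 3
        System.out.println(batches.get(2).size()); // last partial batch: 50
    }
}
```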

Labels: Apache NiFi


02-03-2017 02:40 PM

@jstorck Thanks for the response, but I was actually able to fix the problem a few days ago; I should have posted my solution. The user was configured to log in and access a specific filepath. In my processor configuration I had the "Remote File" property set to the full path as /path/to/file/$(unknown) when all I needed was $(unknown). But thanks for the response!

01-27-2017 04:57 PM

Hello, in my NiFi flow I am using a FetchSFTP processor to get files from a server. The first day I configured the processor, it worked fine: it connected to the server and retrieved the files. However, since then I have been getting the following error from the processor:

java.io.IOException: Failed to obtain connection to remote host due to com.jcraft.jsch.JSchException: session is down

No changes have been made to the processor's configuration, and no changes have been made to the SFTP server. I am also still able to connect to the SFTP server via WinSCP; I am only having an issue with the FetchSFTP processor. If anyone could provide some help towards debugging this issue, it would be greatly appreciated! Thanks!

Labels: Apache NiFi