Hi, I have multiple process groups for data extraction (one process group for Oracle, one for MySQL). Each extracts data, converts it to CSV, and feeds the files to a big data processing engine as source files. I want to implement a conditional trigger by reading a file (whose content tells me Oracle or MySQL); based on the content of this file, I need to trigger the corresponding data extraction process group.

Note: I don't want to take any manual action on the UI. This has to be an automated solution that works purely from the file's content, which says which process group should be triggered (Oracle or MySQL). Both should not run at the same time.

How can I implement this? Any help will be highly appreciated.
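In NiFi this is typically a GetFile/FetchFile followed by RouteOnContent (or ExtractText plus RouteOnAttribute), with each route feeding the input port of the corresponding process group. The routing decision itself is just string matching on the trigger file's content; a minimal sketch (the function and return names here are hypothetical, not NiFi APIs):

```python
def choose_process_group(trigger_text: str) -> str:
    """Decide which extraction process group to trigger,
    based on the content of the trigger file."""
    content = trigger_text.strip().lower()
    if "oracle" in content:
        return "oracle_extraction"
    if "mysql" in content:
        return "mysql_extraction"
    # Anything else is an error: neither group should run.
    raise ValueError(f"Unrecognized trigger content: {trigger_text!r}")
```

Because a single trigger flow file takes exactly one route, the mutual-exclusion requirement (Oracle and MySQL never running at the same time) falls out naturally from routing on one file at a time.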
I have a requirement to load data from MySQL to Salesforce (using the Salesforce Bulk API) with NiFi. I first call a Salesforce SOAP API to get a session_id (access token), then pass this session_id to a REST API to create a bulk load job on Salesforce; I extract the job_id and add both the session_id and job_id to the flow file's attributes.

Up to this point everything was fine, but now I need to extract the data from the MySQL table using the QueryDatabaseTable processor, which doesn't accept any upstream connection. I need to pass the session_id and job_id along with the MySQL data to the Salesforce Bulk API REST call using InvokeHTTP. So now I have two disconnected flows: one carries the job_id and session_id to pass to the Salesforce REST API, and the other carries the data from the MySQL DB, and I am unable to use both of them in InvokeHTTP. It's a very tricky situation.

Is there any way to store the session_id and job_id from the first flow's flow file attributes in some variable, and use them when invoking the REST API along with the data extracted from MySQL?

Thanks
AA24
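One standard NiFi pattern for bridging two disconnected flows is a DistributedMapCache: the first flow writes session_id and job_id with PutDistributedMapCache, and the second flow reads them back with FetchDistributedMapCache just before InvokeHTTP. A minimal sketch of that hand-off, using an in-memory dict as a stand-in for the cache service (the function names, cache keys, and the example URL are hypothetical illustrations, not NiFi or Salesforce APIs):

```python
# Stand-in for NiFi's DistributedMapCache: flow 1 stores the values,
# flow 2 fetches them before building the InvokeHTTP request.
cache: dict[str, str] = {}

def store_job_context(session_id: str, job_id: str) -> None:
    """Flow 1: persist the Salesforce session and job ids under known keys."""
    cache["sf_session_id"] = session_id
    cache["sf_job_id"] = job_id

def build_bulk_request(payload: str) -> tuple[str, dict[str, str], str]:
    """Flow 2: fetch the cached ids and assemble the Bulk API batch request."""
    session_id = cache["sf_session_id"]
    job_id = cache["sf_job_id"]
    headers = {
        "X-SFDC-Session": session_id,  # Bulk API session header
        "Content-Type": "text/csv",
    }
    # Example endpoint shape; the instance hostname is a placeholder.
    url = f"https://example.my.salesforce.com/services/async/52.0/job/{job_id}/batch"
    return url, headers, payload
```

An alternative worth considering: replace QueryDatabaseTable with ExecuteSQL, which does accept an incoming connection, so the flow file carrying session_id and job_id can trigger the extraction directly and those attributes survive through to InvokeHTTP.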