Member since: 01-15-2019
Posts: 55
Kudos Received: 36
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1967 | 07-20-2021 01:05 AM
 | 13886 | 11-28-2019 06:59 AM
11-15-2023
06:28 PM
What's your OS version? If you are using a CentOS Stream release, you will get this error.
05-09-2023
04:08 PM
1 Kudo
NiFi - Oracle Connect
Processor: ExecuteSQL
1) Prepare Oracle Data (Optional)
Create the Oracle table:
CREATE TABLE USER_MASTER
(
USER_ID VARCHAR2(8) NOT NULL,
DEPT_NO VARCHAR2(8),
USER_NAME VARCHAR2(32),
CREATED_ON DATE DEFAULT SYSDATE,
MODIFIED_ON DATE,
CONSTRAINT pk_USER_MASTER PRIMARY KEY(USER_ID)
);
Insert 100 rows of sample data into the Oracle table (a PL/SQL loop):
BEGIN
  FOR i IN 1 .. 100 LOOP
    INSERT INTO USER_MASTER (USER_NAME, DEPT_NO, USER_ID)
    VALUES ('Hayashi ' || i || ' Ro', '1001', TO_CHAR(i));
  END LOOP;
END;
/
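To confirm the load, a quick sanity check:
SELECT COUNT(*) FROM USER_MASTER;
-- expected result: 100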
2) Add Processor
Add an ExecuteSQL processor and set its SQL select query property to:
select * from user_master
Create the DBCPConnectionPool controller service with the following properties:
Database Connection URL: jdbc:oracle:thin:@zzeng-*******.com:1521:svcname
Database Driver Class Name: oracle.jdbc.driver.OracleDriver
Database Driver Location(s): /opt/nifi/jdbc/ojdbc8.jar
Be careful with the "svcname" above: you must enter the correct Oracle service name (or SID, depending on which URL form you use).
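For reference, the Oracle thin driver accepts two URL styles (host, port, and names below are placeholders):
Connect by SID: jdbc:oracle:thin:@host:1521:SID
Connect by service name: jdbc:oracle:thin:@//host:1521/service_name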
Download JDBC Driver:
To get ojdbc8.jar, go to https://www.oracle.com/database/technologies/appdev/jdbc-downloads.html
wget https://download.oracle.com/otn-pub/otn_software/jdbc/219/ojdbc8-full.tar.gz
tar xvf ojdbc8-full.tar.gz
sudo mkdir -p /opt/nifi/jdbc/
sudo chown nifi:nifi /opt/nifi/jdbc/
sudo chmod 755 /opt/nifi/jdbc/
cd /opt/nifi/jdbc/
sudo cp /home/centos/oracle-connect/ojdbc8.jar .
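After copying, it's worth confirming the driver is in place and readable by the nifi user:
ls -l /opt/nifi/jdbc/ojdbc8.jar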
3) Run
Run the flow once, then check Data Provenance to verify the output.
4) Write to HDFS
Use a PutHDFS processor with the following properties:
Hadoop Configuration Resources: /etc/hadoop/conf/hdfs-site.xml,/etc/hadoop/conf/core-site.xml
Directory: /tmp/nifi/
Check the result:
[centos@cdp conf]$ hdfs dfs -ls /tmp/nifi/
Found 2 items
-rw-r--r-- 1 nifi supergroup 5268 2023-05-10 04:52 /tmp/nifi/78338e0b-27b4-4b44-b406-c5b1cada12eb
-rw-r--r-- 1 nifi supergroup 5268 2023-05-10 05:16 /tmp/nifi/ca5756c4-0e9e-480c-abf5-050b731493fb
[centos@cdp conf]$
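ExecuteSQL writes its results in Avro format by default, so one way to inspect a result file is with avro-tools (the avro-tools jar name and version here are assumptions; adjust to your local copy):
hdfs dfs -get /tmp/nifi/78338e0b-27b4-4b44-b406-c5b1cada12eb user_master.avro
java -jar avro-tools-1.11.1.jar tojson user_master.avro | head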
5) Write to a Hive External Table
1) Get data from Oracle (Avro format output) with the ExecuteSQL processor.
2) Convert from Avro to Parquet with the ConvertAvroToParquet processor.
3) Write the Parquet data to the external Hive table with PutHDFS.
Prepare the Hive table:
CREATE EXTERNAL TABLE DEMO.USER_MASTER
(
USER_ID STRING,
DEPT_NO STRING,
USER_NAME STRING,
CREATED_ON STRING ,
MODIFIED_ON STRING
)
STORED AS PARQUET
LOCATION '/tmp/data/parquet';
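Once PutHDFS has written the Parquet files under the table location (this assumes the PutHDFS Directory for this flow is set to /tmp/data/parquet), a quick query confirms the data is visible:
SELECT COUNT(*) FROM DEMO.USER_MASTER;
SELECT * FROM DEMO.USER_MASTER LIMIT 5;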
01-12-2023
02:16 AM
2 Kudos
To install Apache NiFi on a MacBook, do the following:
Download the ZIP file from the Apache NiFi Downloads page. Make sure you download the latest binary ZIP; at the time of writing, that was Apache NiFi Binary 1.19.1 [OpenPGP] [SHA-256] [SHA-512].
Download it to the MacBook, and then unzip it:
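Assuming the 1.19.1 binary ZIP (adjust the filenames for your version):
unzip nifi-1.19.1-bin.zip
cd nifi-1.19.1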
chmod +x bin/nifi.sh
Run the following command to generate single-user credentials:
bin/nifi.sh set-single-user-credentials zzeng <my long password>
Run the following commands to start NiFi and check its status:
bin/nifi.sh start
bin/nifi.sh status
netstat -an | grep LIST
Check that port 8443 is in the LISTEN state:
tcp4 0 0 127.0.0.1.8443 *.* LISTEN
Log in at https://localhost:8443/nifi/
11-13-2022
08:07 AM
https://lists.apache.org/thread/907n11xlvdmckp1045bspzloclthfqsh As NiFi is a pure Java/JVM application, we use Jython rather than Python for ExecuteScript. This means that you can't import native (CPython, e.g.) modules into your Jython scripts in ExecuteScript. Consider using ExecuteStreamCommand with a real Python interpreter and script. I'm looking at Py4J to try and bridge the gap.
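A minimal ExecuteStreamCommand setup for that approach, assuming a CPython interpreter at /usr/bin/python3 and a hypothetical script /opt/scripts/transform.py that reads the flow file content from stdin and writes the result to stdout:
Command Path: /usr/bin/python3
Command Arguments: /opt/scripts/transform.py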
04-10-2022
10:47 PM
The latest documentation about Airflow in Cloudera Data Engineering (CDE) on CDP Public Cloud and in CDP Private Cloud Data Services is here:
CDP Public Cloud: https://docs.cloudera.com/data-engineering/cloud/orchestrate-workflows/topics/cde-airflow-dag-pipeline.html
CDP Private Cloud Data Services: https://docs.cloudera.com/data-engineering/1.3.3/orchestrate-workflows/topics/cde-airflow-dag-pipeline.html
07-20-2021
01:05 AM
Sorry for my late reply. > what AMI are you using and were there any pre-requisites for OS? I used CentOS 7.8, as the support matrix shows: https://supportmatrix.cloudera.com/#Cloudera To download CDP if you are not paying for it, you can get the trial from here: https://docs.cloudera.com/cdp-private-cloud/latest/release-guide/topics/cdpdc-trial-download-information.html The trial version has the same functions as normal CDP but can only be used for 60 days. Please have a try and let me know if it works.
07-09-2021
12:01 AM
Hi Sam, I also use CDP on AWS for learning. If you don't have enough AWS credit budget, you can get by with 3 nodes this way: use the 3 nodes as both master and worker nodes. That is, put the NameNode on one of them and a DataNode on all 3 nodes.
07-08-2021
11:57 PM
I met similar errors, but for a different reason. In my case, I had created a leaf queue under the default queue. After a Hive restart, Hive submits a query to check the health status; since the default queue was no longer a leaf queue, the query failed. My solution was to remove the child queue under the default queue, which resolved the error.
04-20-2021
05:06 AM
Here is a URL, written by David W. Streever: https://www.streever.com/post/2019/filter-hive-compactions/ Note that these are not official steps. Any 'extras' you create against the metastore DB can/may break with the next release; this isn't a supported method of accessing metadata.
01-26-2021
08:51 AM
https://nifi.apache.org/docs/nifi-docs/components/nifi-docs/components/org.apache.nifi/nifi-gcp-nar/1.9.0/org.apache.nifi.processors.gcp.bigquery.PutBigQueryBatch/index.html With this processor, you can batch-load flow file content into a Google BigQuery table.