Member since
10-31-2018
6
Posts
0
Kudos Received
0
Solutions
12-17-2018
08:02 PM
Hi @Pratik Ghatak @Rudolf Schimmel @Timothy Spann, I am getting a similar error while connecting to Hive using DBCPConnectionPool in an ExecuteSQL processor in NiFi 1.7.1.

Hive version: Hive 1.1.0-cdh5.12.0

My DBCP Connection Pool:
- DB Connection URL: jdbc:hive2://MyServer:10000/default;AuthMech=GSSAPI;KrbRealm=HADOOP.TEST.COMPANY.COM;KrbHostFQDN=_HOST;KrbServiceName=hive
- DB Driver Location: /opt/flow/nifi/lib/hive-jdbc-1.1.0-cdh5.12.0.jar, /opt/flow/nifi/lib/hive-service-1.1.0-cdh5.12.0.jar, /opt/flow/nifi/lib/libthrift-0.9.3.jar
- DB Driver Class: org.apache.hive.jdbc.HiveDriver
- Database User: No value set
- Password: No value set

Further, I have added the property below in hive-site.xml:

<property>
  <name>hive.server2.transport.mode</name>
  <value>binary</value>
</property>

If possible, could you please share your working DBCPConnectionPool settings? I am able to connect to Hive from Python using the pyhs2 and impala.dbapi modules:

pyhs2.connect(host='myserver', port=10000, authMechanism='KERBEROS')
conn = connect(host='myserver', port=10000, kerberos_service_name='hive', auth_mechanism='GSSAPI')

Kindly advise. Thanks so much.
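For reference, here is a minimal Python sketch of how I am assembling the Kerberized JDBC URL from its parts, so the individual session properties are easier to compare against a working setup (the host, realm, and service names are placeholders from my environment, not known-good values):

```python
# Build a Kerberized HiveServer2 JDBC URL from its parts.
# The host, realm, and service name below are placeholders.
def build_hive_jdbc_url(host, port=10000, database="default",
                        realm="HADOOP.TEST.COMPANY.COM",
                        service_name="hive"):
    session = ";".join([
        "AuthMech=GSSAPI",
        f"KrbRealm={realm}",
        "KrbHostFQDN=_HOST",
        f"KrbServiceName={service_name}",
    ])
    return f"jdbc:hive2://{host}:{port}/{database};{session}"

url = build_hive_jdbc_url("MyServer")
print(url)
```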
12-08-2018
09:18 PM
Thanks for your help, I got it working.
11-05-2018
05:01 PM
Thanks for the quick reply @Shu. I changed the above property value but am getting the error below:
11-03-2018
12:32 AM
Hi @Andrew Lim @Matt Burgess @Timothy Spann, I am trying to load a simple CSV file (without a header) containing a few rows into Teradata. I am also creating the table in Teradata by providing the CREATE query in the PutDatabaseRecord processor.

My workflow is: GetFile -> UpdateAttribute -> PutDatabaseRecord

This is how the data looks:

1,18
2,19
3,20
4,21
5,22

This is how the workflow looks:
This is how the UpdateAttribute processor looks:
This is how the PutDatabaseRecord processor looks:
This is how the CSVReader looks:
This is how the AvroSchemaRegistry looks:

But I am getting the error below:

org.apache.nifi.schema.access.SchemaNotFoundException: Unable to find schema with name 'BIGDATA_DL.STUDENT'

Please advise.
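In case it helps to clarify what I think is happening: my understanding (an assumption on my part, not taken from NiFi's source) is that the value of the flowfile's schema.name attribute is used as a key into the AvroSchemaRegistry's dynamic properties, so the two must match exactly, case included. A small sketch of that lookup behavior:

```python
# Sketch of my understanding of NiFi's schema-by-name lookup
# (an illustration, not NiFi's actual implementation): the
# schema.name attribute value is used as a key into the registry.
registry = {
    "student": ('{"type":"record","name":"Student","fields":['
                '{"name":"ID","type":"int"},{"name":"AGE","type":"int"}]}')
}

def lookup(schema_name):
    # Raise the same kind of "not found" condition NiFi reports
    # when no registry property matches the attribute value.
    if schema_name not in registry:
        raise KeyError(f"Unable to find schema with name '{schema_name}'")
    return registry[schema_name]

# 'BIGDATA_DL.STUDENT' is not a key in this registry, hence the error:
try:
    lookup("BIGDATA_DL.STUDENT")
except KeyError as e:
    print(e)
```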
Labels:
- Apache NiFi
11-01-2018
04:55 AM
Hi @Andrew Lim @Kunal Gaikwad, I have tried the workflow (csv-to-mysql.xml) that you've attached. My use case is to insert a few rows into Teradata instead of MySQL. I've kept everything the same as yours, except for the table name in Teradata (bigdata_dl.acct), the DBCP Connection Pool for Teradata, and the datatypes in Teradata. But I am getting the error below:

org.apache.nifi.schema.access.SchemaNotFoundException: Unable to find schema with name 'bigdata_dl.acct' (error.jpeg)

I've created the table in Teradata before running the workflow: 2.png
This is how the UpdateAttribute processor looks: 1.jpeg
This is how the CSVReader looks: 3.jpeg
This is how the AvroSchemaRegistry looks: 4.jpeg

If I provide the "test" field in the AvroSchemaRegistry as below (to match the datatype in Teradata, since it doesn't support long or string), the service shows its State as Invalid, stating this is not a valid Avro schema: Illegal character in: varchar(30)

{
  "type": "record",
  "name": "TestRecord",
  "fields": [
    {"name": "ACCT_ID", "type": "varchar(30)"},
    {"name": "ACCT_NAME", "type": "varchar(30)"}
  ]
}

This is how PutDatabaseRecord looks:

Please advise. Appreciate your help! Happy Halloween 🙂
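For anyone hitting the same "not a valid Avro Schema" message: Avro has no varchar type; its primitive types are null, boolean, int, long, float, double, bytes, and string. So the fields should be declared as string in the registry, and the record writer maps them onto the VARCHAR(30) columns in Teradata. A quick check of a corrected schema (the record and field names here are just the ones from my own example):

```python
import json

# Avro's primitive types; SQL types such as varchar(30) are not valid.
AVRO_PRIMITIVES = {"null", "boolean", "int", "long",
                   "float", "double", "bytes", "string"}

# Corrected schema: declare the columns as string and let the
# database writer map them onto Teradata's VARCHAR(30) columns.
schema = json.loads("""
{
  "type": "record",
  "name": "TestRecord",
  "fields": [
    {"name": "ACCT_ID",   "type": "string"},
    {"name": "ACCT_NAME", "type": "string"}
  ]
}
""")

assert all(f["type"] in AVRO_PRIMITIVES for f in schema["fields"])
print("schema uses only valid Avro primitive types")
```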
10-31-2018
10:42 PM
Hi @Matt Burgess, I am trying to insert a few sample records (.csv) into Teradata using NiFi. My current workflow is the same as you suggested, GetFile -> PutDatabaseRecord, but I am getting the error below. Please advise. Appreciate your help! 3.png

Please find the configurations for the 2 processors: 4.png 6.png 7.png

This is how the data looks:

ACCT_ID,ACCT_NAME
1,A
2,B

Table definition:

create table bigdata_dl.acct(
  ACCT_ID VARCHAR(30),
  ACCT_NAME VARCHAR(30)
);
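To make the intended outcome concrete, here is a sketch (my own illustration, not NiFi internals) of what PutDatabaseRecord should effectively do with this CSV: read each record and turn it into a parameterized INSERT against the target table:

```python
import csv
import io

# Sample data and target table from the post above.
data = "ACCT_ID,ACCT_NAME\n1,A\n2,B\n"
table = "bigdata_dl.acct"

# Read the CSV records, then build one parameterized INSERT
# statement plus a parameter tuple per row.
rows = list(csv.DictReader(io.StringIO(data)))
columns = list(rows[0].keys())
sql = (f"INSERT INTO {table} ({', '.join(columns)}) "
       f"VALUES ({', '.join('?' for _ in columns)})")
params = [tuple(r[c] for c in columns) for r in rows]

print(sql)     # INSERT INTO bigdata_dl.acct (ACCT_ID, ACCT_NAME) VALUES (?, ?)
print(params)  # [('1', 'A'), ('2', 'B')]
```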