Created on 02-22-2018 10:06 AM - edited 08-17-2019 07:03 PM
I have a simple CSV file and the content of the file is as follows:
1,QWER
2,TYUI
3,ASDF
4,GHJK
5,ZXCV
I want to move the content of this file into a MySQL table, so I have created the following flow (refer to 1.png).
I have configured the PutDatabaseRecord processor as follows (refer to 2.png).
Also, the CSVReader looks like this (refer to 3.png).
I am getting an error (refer to 4.png).
Can you help me configure my CSVReader? I suspect it is the reason I am not able to push my CSV records into the MySQL table.
Any help is appreciated.
Reference link:
https://community.hortonworks.com/questions/102559/please-suggest-me-stepshow-i-can-insert-csv-filei...
Created on 02-22-2018 04:50 PM - edited 08-17-2019 07:03 PM
I used PutDatabaseRecord with a CSVReader, but have a slightly different flow that uses an UpdateAttribute processor to set the schema. So the flow is GetFile-->UpdateAttribute-->PutDatabaseRecord. Here it is as an attachment:
UpdateAttribute adds the attribute "schema.name" with the value of "test".
CSVReader:
AvroSchemaRegistry:
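For reference, a minimal Avro schema registered under the name "test" in the AvroSchemaRegistry might look like the following. The field names and types here are assumptions based on the two-column CSV in the question, not the actual screenshot:

```json
{
  "type": "record",
  "name": "test",
  "fields": [
    {"name": "id", "type": "int"},
    {"name": "name", "type": "string"}
  ]
}
```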
PutDatabaseRecord:
Here are the results in my database:
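Conceptually, what PutDatabaseRecord does with a CSVReader can be sketched in plain Python: each record is read against the schema and bound to a parameterized INSERT. This is only an illustration (using sqlite3 as a stand-in for the MySQL connection pool, with assumed table and column names), not NiFi code:

```python
import csv
import io
import sqlite3  # stand-in for the DBCP Connection Pool / MySQL driver

# The flow file content, as in the original question
csv_data = "1,QWER\n2,TYUI\n3,ASDF\n4,GHJK\n5,ZXCV\n"
# The fields the registered schema would declare (assumed names)
schema_fields = ["id", "name"]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test (id INTEGER, name TEXT)")

# Build one parameterized INSERT from the schema, then bind each record
placeholders = ", ".join("?" for _ in schema_fields)
sql = f"INSERT INTO test ({', '.join(schema_fields)}) VALUES ({placeholders})"
for record in csv.reader(io.StringIO(csv_data)):
    conn.execute(sql, record)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM test").fetchone()[0])  # → 5
```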
Created 01-13-2020 08:31 PM
Hi,
I followed your steps, but I am getting the error "due to record does not have value for the required column CURRENT_CONNECTION".
In MySQL, the schema and the database are the same thing.
In your example, what is test_db and what is test?
I have been struggling with this for the last 3 days.
Please help me resolve this issue.
Thanks,
John
Created 08-30-2020 10:33 PM
Created 02-23-2018 05:43 AM
@AndrewLim
Worked like a charm. Looking at your response, I figured that I never defined schema.name and hence the errors! Thank you for the prompt response!
Created 02-23-2018 02:44 PM
Great! Glad you got your flow working.
Created 11-01-2018 04:55 AM
I have tried the workflow (csv-to-mysql.xml) that you attached. My use case is to insert a few rows into Teradata instead of MySQL.
I have kept everything the same as yours, except that the table name in Teradata (bigdata_dl.acct), the DBCP Connection Pool (pointing at Teradata), and the datatypes in Teradata are different.
But I am getting the following error:
org.apache.nifi.schema.access.SchemaNotFoundException: Unable to find schema with name 'bigdata_dl.acct'
I created the table in Teradata before running the workflow:
This is what the UpdateAttribute processor looks like:
This is what the CSVReader looks like:
This is what the AvroSchemaRegistry looks like:
If I provide the "test" schema in the AvroSchemaRegistry as below (to match the datatypes in Teradata, since it doesn't support long or string), the controller service shows its State as Invalid, stating: not a valid Avro Schema: Illegal character in: varchar(30)
{
  "type": "record",
  "name": "TestRecord",
  "fields": [
    {"name": "ACCT_ID", "type": "varchar(30)"},
    {"name": "ACCT_NAME", "type": "varchar(30)"}
  ]
}
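One likely cause of the "not a valid Avro Schema" state: varchar(30) is not an Avro type. The Avro specification only defines the primitive types null, boolean, int, long, float, double, bytes, and string. The schema just tells the reader how to parse the record; the column width is enforced by the Teradata DDL, not by the schema. So a sketch of a valid version (under that assumption) would use string for both fields:

```json
{
  "type": "record",
  "name": "TestRecord",
  "fields": [
    {"name": "ACCT_ID", "type": "string"},
    {"name": "ACCT_NAME", "type": "string"}
  ]
}
```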
This is what the PutDatabaseRecord processor looks like:
Please advise. Appreciate your help! Happy Halloween 🙂
Created 11-01-2018 02:40 PM
FYI, it is better to start a new question that references this one, rather than adding a new question as an answer here.
Created 07-29-2020 08:36 PM
Hey guys, a quick one: with the infer-schema approach now being available, do we still need the AvroSchemaRegistry?