Member since
03-16-2020
8
Posts
0
Kudos Received
0
Solutions
04-26-2022
08:02 PM
Hi, I am trying to take a simple count of the output of the ScrollElasticsearchHttp processor in NiFi, using a QueryRecord processor after it. I created one new property with the SQL below:

SELECT COUNT(1) FROM FLOWFILE

I expect the result to be 10000, which is my record count, but record.count always shows 1. Can someone suggest how I should take the count of this ScrollElasticsearchHttp flow? Thanks!
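A note on what QueryRecord does here: the SQL runs against a table that is always named FLOWFILE, and an aggregate query such as COUNT emits a single result record, so the outgoing FlowFile's record.count attribute is 1 by design; the total you want is the value inside that one record. A hedged sketch of the query (the alias record_count is an illustrative name, not required by NiFi):

```sql
-- QueryRecord dynamic property: the incoming FlowFile is exposed as FLOWFILE
SELECT COUNT(*) AS record_count
FROM FLOWFILE
```

The count can then be read from the single output record (e.g. with EvaluateJsonPath on a JSON writer's output) rather than from the record.count attribute.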
... View more
Labels:
- Apache NiFi
04-01-2022
03:46 AM
Thanks a lot @araujo, it worked! Is there any option to handle this for multiline input as well? The files we receive from the source are multiline.
... View more
03-31-2022
11:04 PM
I have one JSON file which I have validated; it is valid JSON. I create a Hive table on that file, but selecting data shows this error:

Error: java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException: Row is not a valid JSON Object - JSONException: A JSONObject text must end with '}' at 2 [character 3 line 1] (state=,code=0)

When I set the property below:

ALTER TABLE test_rahul SET SERDEPROPERTIES ("ignore.malformed.json" = "true");

it shows NULL values for all the fields. Can someone kindly help me?

JSON file:

{
  "buyer": {
    "legalBusinessName": "test1 Company",
    "organisationIdentifications": [
      { "type": "abcd", "identification": "test.bb@tesr" },
      { "type": "TXID", "identification": "12345678" }
    ]
  },
  "supplier": {
    "legalBusinessName": "test Company",
    "organisationIdentifications": [
      { "type": "abcd", "identification": "test28@test" }
    ]
  },
  "paymentRecommendationId": "1234-5678-9876-2212-123456",
  "excludedRemittanceInformation": [],
  "recommendedPaymentInstructions": [
    {
      "executionDate": "2022-06-12",
      "paymentMethod": "aaaa",
      "remittanceInformation": {
        "structured": [
          {
            "referredDocumentInformation": [
              {
                "type": "xxx",
                "number": "12341234",
                "relatedDate": "2022-06-12",
                "paymentDueDate": "2022-06-12",
                "referredDocumentAmount": {
                  "remittedAmount": 2600.5,
                  "duePayableAmount": 3000
                }
              }
            ]
          }
        ]
      }
    }
  ]
}

Jar added:

ADD JAR json-serde-1.3.7-SNAPSHOT-jar-with-dependencies.jar;

Create table:

CREATE EXTERNAL TABLE IF NOT EXISTS `test`.`testerde11` (
  `buyer` STRUCT<`legalBusinessName`:STRING, `organisationIdentifications`:STRUCT<`type`:STRING, `identification`:STRING>>,
  `supplier` STRUCT<`legalBusinessName`:STRING, `organisationIdentifications`:STRUCT<`type`:STRING, `identification`:STRING>>,
  `paymentRecommendationId` STRING,
  `recommendedPaymentInstructions` ARRAY<STRUCT<`executionDate`:STRING, `paymentMethod`:STRING, `remittanceInformation`:STRUCT<`structured`:STRUCT<`referredDocumentInformation`:STRUCT<`type`:STRING, `number`:STRING, `relatedDate`:STRING, `paymentDueDate`:STRING, `referredDocumentAmount`:STRUCT<`remittedAmount`:DOUBLE, `duePayableAmount`:INT>>>>>>
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
WITH SERDEPROPERTIES ("field.delim"=",", "mapping.ts"="number")
STORED AS TEXTFILE
LOCATION '/user/hdfs/Jsontest/';

Error message:

select * from test.testerde11;
Error: java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException: Row is not a valid JSON Object - JSONException: A JSONObject text must end with '}' at 2 [character 3 line 1] (state=,code=0)

I tried multiple options but none worked. Can someone suggest what the issue is here, or whether I need to change the delimiter or add new properties?
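One likely cause, given the "must end with '}' at 2 [character 3 line 1]" error: with STORED AS TEXTFILE, the JSON SerDe parses one JSON document per line, so a pretty-printed (multiline) file breaks on the second line. A minimal sketch of a pre-processing step that compacts each JSON file onto a single line before uploading it to HDFS (the function name is illustrative, not part of any Hive tooling):

```python
import json

def compact_json_file(src_path: str, dst_path: str) -> None:
    """Rewrite a (possibly pretty-printed) JSON file as a single line,
    which is what a text-file-backed Hive JSON SerDe expects."""
    with open(src_path) as src:
        obj = json.load(src)  # parses (and thereby validates) the whole document
    with open(dst_path, "w") as dst:
        # separators=(",", ":") drops all whitespace so the object fits on one line
        dst.write(json.dumps(obj, separators=(",", ":")))
        dst.write("\n")
```

After compacting, the file can be re-uploaded (e.g. with hdfs dfs -put) and each line parses as one row. Note the DDL would likely still need adjusting separately: in the sample JSON, organisationIdentifications, structured, and referredDocumentInformation are arrays, while the table declares them as plain STRUCTs.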
... View more
Labels:
- Apache Hive
09-22-2021
12:52 AM
Hi everyone, I have Hive partition folders at an HDFS location, but all the partition folders are in upper case, e.g. YEAR=2021/MONTH=07/DAY=31/HOUR=00. When I create the table in Hive, it takes the partition columns in lowercase, e.g. /year=2021/month=07/day=31/hour=00. Since HDFS is case sensitive and Hive is case insensitive, Hive expects the partition columns in lowercase at the HDFS location, and I am not able to see any partitions in my Hive table. Is there any way to handle this case: either make the Hive columns uppercase, or recursively change all HDFS partition columns to lowercase? I have 8000+ partitions for one year (24 hours × 30 days × 12 months = 8640), so I am not able to rename every folder manually. Can someone kindly suggest?
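Renaming 8,640 directories by hand is impractical, but the lowercasing can be scripted. A minimal sketch, assuming the hdfs CLI is on the PATH (lowercase_partition and rename_partitions are illustrative helper names): it lowercases only the KEY part of each KEY=value path segment, leaving the values untouched, and would drive hdfs dfs -mv for each directory that changes.

```python
import subprocess

def lowercase_partition(path: str) -> str:
    """Lowercase the KEY part of each KEY=value path segment,
    leaving values and non-partition segments as they are."""
    parts = []
    for seg in path.split("/"):
        if "=" in seg:
            key, _, value = seg.partition("=")
            parts.append(key.lower() + "=" + value)
        else:
            parts.append(seg)
    return "/".join(parts)

def rename_partitions(paths, dry_run=True):
    """Move each HDFS partition directory to its lowercased-key name.

    With dry_run=True the hdfs commands are only printed, so the plan
    can be reviewed before anything is actually moved.
    """
    for src in paths:
        dst = lowercase_partition(src)
        if dst == src:
            continue  # already lowercase, nothing to do
        cmd = ["hdfs", "dfs", "-mv", src, dst]
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.run(cmd, check=True)
```

In practice the renames would be done level by level, top-down (YEAR=... first, then MONTH=... under the renamed parent, and so on, listing each level with hdfs dfs -ls between passes), since renaming a parent changes its children's paths. Afterwards, MSCK REPAIR TABLE should let Hive discover the lowercased partitions.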
... View more
03-16-2020
05:22 AM
I'm also getting the same issue, but instead of taking my username it shows the user as Anonymous. Can anyone suggest how I should resolve this issue?
... View more