org.apache.hadoop.hive.serde2.SerDeException: Row is not a valid JSON Object
Labels: Apache Hive
Created 03-31-2022 11:04 PM
I have a JSON file and I have validated it; it is valid JSON.
I created a Hive table on top of that file, but selecting from it fails with:
Error: java.io.IOException:
org.apache.hadoop.hive.serde2.SerDeException: Row is not a valid JSON Object
- JSONException: A JSONObject text must end with '}' at 2 [character 3 line 1]
(state=,code=0)
When I set the property below:
ALTER TABLE test_rahul SET SERDEPROPERTIES ("ignore.malformed.json" = "true");
it returns NULL values for all the fields instead. Can someone kindly help me?
JSON file:
{
  "buyer": {
    "legalBusinessName": "test1 Company",
    "organisationIdentifications": [
      {
        "type": "abcd",
        "identification": "test.bb@tesr"
      },
      {
        "type": "TXID",
        "identification": "12345678"
      }
    ]
  },
  "supplier": {
    "legalBusinessName": "test Company",
    "organisationIdentifications": [
      {
        "type": "abcd",
        "identification": "test28@test"
      }
    ]
  },
  "paymentRecommendationId": "1234-5678-9876-2212-123456",
  "excludedRemittanceInformation": [],
  "recommendedPaymentInstructions": [
    {
      "executionDate": "2022-06-12",
      "paymentMethod": "aaaa",
      "remittanceInformation": {
        "structured": [
          {
            "referredDocumentInformation": [
              {
                "type": "xxx",
                "number": "12341234",
                "relatedDate": "2022-06-12",
                "paymentDueDate": "2022-06-12",
                "referredDocumentAmount": {
                  "remittedAmount": 2600.5,
                  "duePayableAmount": 3000
                }
              }
            ]
          }
        ]
      }
    }
  ]
}
JAR added:
ADD JAR json-serde-1.3.7-SNAPSHOT-jar-with-dependencies.jar;
Create Table:
CREATE EXTERNAL TABLE IF NOT EXISTS `test`.`testerde11` (
  `buyer` STRUCT<`legalBusinessName`:STRING, `organisationIdentifications`:STRUCT<`type`:STRING, `identification`:STRING>>,
  `supplier` STRUCT<`legalBusinessName`:STRING, `organisationIdentifications`:STRUCT<`type`:STRING, `identification`:STRING>>,
  `paymentRecommendationId` STRING,
  `recommendedPaymentInstructions` ARRAY<STRUCT<
    `executionDate`:STRING,
    `paymentMethod`:STRING,
    `remittanceInformation`:STRUCT<`structured`:STRUCT<`referredDocumentInformation`:STRUCT<
      `type`:STRING, `number`:STRING, `relatedDate`:STRING, `paymentDueDate`:STRING,
      `referredDocumentAmount`:STRUCT<`remittedAmount`:DOUBLE, `duePayableAmount`:INT>>>>>>)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
WITH SERDEPROPERTIES ("field.delim" = ",", "mapping.ts" = "number")
STORED AS TEXTFILE
LOCATION '/user/hdfs/Jsontest/';
Error message:
select * from test.testerde11;
Error: java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException:
Row is not a valid JSON Object - JSONException: A JSONObject text must end with '}' at 2 [character 3 line 1] (state=,code=0)
I tried multiple options but none of them worked. Can someone suggest what the issue is here, or whether I need to change the delimiter or add new properties?
Created 04-01-2022 03:13 AM
@Rohan44,
Try creating your data file with each JSON record on a single line, like so:
{"buyer":{"legalBusinessName":"test1 Company","organisationIdentifications":[{"type":"abcd","identification":"test.bb@tesr"},{"type":"TXID","identification":"12345678"}]},"supplier":{"legalBusinessName":"test Company","organisationIdentifications":[{"type":"abcd","identification":"test28@test"}]},"paymentRecommendationId":"1234-5678-9876-2212-123456","excludedRemittanceInformation":[],"recommendedPaymentInstructions":[{"executionDate":"2022-06-12","paymentMethod":"aaaa","remittanceInformation":{"structured":[{"referredDocumentInformation":[{"type":"xxx","number":"12341234","relatedDate":"2022-06-12","paymentDueDate":"2022-06-12","referredDocumentAmount":{"remittedAmount":2600.5,"duePayableAmount":3000}}]}]}}]}
Cheers,
André
Was your question answered? Please take some time to click on "Accept as Solution" below this post.
If you find a reply useful, say thanks by clicking on the thumbs up button.
Created 04-01-2022 03:46 AM
Thanks a lot, @araujo, it worked!
Is there any option to handle this with multi-line JSON as well? We receive these files from the source in multi-line format.
Created 04-01-2022 07:28 PM
@Rohan44,
Unfortunately, I don't think there's a way to do that.
What are you using to write the files to HDFS?
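If an ingestion script writes them, you could compact each file as part of that step. A sketch only, assuming the files are staged in a local directory before upload (the staging directory, output file name, and HDFS path are placeholders):

import json
import pathlib

# Compact every multi-line JSON file in a local staging directory
# into one record per line, the layout the JSON SerDe can read.
staging = pathlib.Path("staging")
with open("compacted.json", "w") as dst:
    for path in sorted(staging.glob("*.json")):
        with open(path) as src:
            record = json.load(src)  # parse the whole document
        dst.write(json.dumps(record, separators=(",", ":")) + "\n")

The compacted file could then be pushed to the table location with, for example, hdfs dfs -put compacted.json /user/hdfs/Jsontest/.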
Cheers,
André
Was your question answered? Please take some time to click on "Accept as Solution" below this post.
If you find a reply useful, say thanks by clicking on the thumbs up button.
Created 04-26-2022 08:05 PM
Yes, HDFS.
