Member since
03-16-2020
11
Posts
0
Kudos Received
1
Solution
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| 3713 | 09-07-2023 07:39 PM |
09-07-2023
07:39 PM
@RangaReddy Thanks, it was a server-level issue. I tried with a different edge node and it worked.
08-05-2023
05:36 AM
I'm having an issue with a Spark-Hive application running on a Kerberized cluster. I receive a `javax.security.sasl.SaslException: GSS initiate failed` error, which appears to be caused by not finding any Kerberos TGT. Here's the error log:

```
23/08/04 22:56:55 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
23/08/04 22:56:55 INFO HiveClientImpl: Attempting to login to Kerberos using principal: hdfs01@HDP.COM and keytab: hdfs01.keytab-2ca1f730-bef7-4166-90ce-67317c75c793
23/08/04 22:56:55 INFO UserGroupInformation: Login successful for user hdfs01@HDP.COM using keytab file hdfs01.keytab-2ca1f730-bef7-4166-90ce-67317c75c793
23/08/04 22:56:55 INFO metastore: Trying to connect to metastore with URI thrift://master3.abc.xyz.com:9083
23/08/04 22:56:55 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
	at org.apac...
```
I am submitting my Spark job as follows:

```shell
spark-submit \
  --name TestKerberous \
  --num-executors 2 \
  --driver-java-options "-Djava.security.auth.login.config=./key_fin.conf" \
  --driver-java-options "-Dsun.security.krb5.debug=true" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./key_fin.conf" \
  --files=/etc/spark/conf/hive-site.xml,/etc/hadoop/conf/yarn-site.xml,/etc/hadoop/conf/hdfs-site.xml,/etc/hadoop/conf/core-site.xml \
  --conf "spark.hadoop.hive.metastore.kerberos.principal=HTTP/_HOST@HDP.COM" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./key.conf" \
  --conf -Djavax.security.auth.useSubjectCredsOnly=false \
  --conf spark.executorEnv.KRB5_CONFIG=/etc/krb5.conf \
  --conf spark.driverEnv.KRB5_CONFIG=/etc/krb5.conf \
  --conf "spark.hadoop.hive.metastore.sasl.enabled=true" \
  --conf "spark.hadoop.hive.security.authorization.enabled=true" \
  --conf "spark.hadoop.hive.metastore.execute.setugi=true" \
  --conf spark.sql.hive.convertMetastoreParquet=false \
  --conf spark.home=/usr/hdp/current/spark2-client \
  --conf spark.sql.warehouse.dir=/apps/hive/warehouse \
  --conf spark.sql.catalogImplementation=hive \
  --conf spark.yarn.keytab=/etc/security/keytabs/hdfs01.keytab \
  --conf spark.yarn.principal=hdfs01@HDP.COM \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --master yarn --deploy-mode cluster \
  --driver-cores 2 --driver-memory 2G --executor-cores 2 --executor-memory 2G --supervise \
  --class <CLASS_NAME> \
  <JAR_FILE> \
  "<Hive Jdbc Url>" "thrift://master3.abc.xyz.com:9083" "/apps/hive/warehouse"
```

I would really appreciate it if anyone could help me diagnose what might be going wrong and how to resolve this issue. Thank you in advance for any insights you can provide.
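For comparison, a minimal sketch of a yarn-cluster submission that relies on Spark's built-in `--principal`/`--keytab` handling (which obtains the TGT and the Hive metastore delegation token itself) rather than custom JAAS options. This is general Spark-on-YARN behavior, not a confirmed fix for this cluster; the class and jar names are placeholders:

```shell
# Sketch only: Spark logs in from the keytab on the driver and ships
# delegation tokens to executors, so per-JVM JAAS configs are usually
# unnecessary in cluster mode. Class/jar names are hypothetical.
spark-submit \
  --master yarn --deploy-mode cluster \
  --principal hdfs01@HDP.COM \
  --keytab /etc/security/keytabs/hdfs01.keytab \
  --files /etc/spark/conf/hive-site.xml \
  --conf spark.sql.catalogImplementation=hive \
  --class com.example.TestKerberos \
  app.jar
```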
Labels:
- Apache Hive
- Apache Spark
04-26-2022
08:02 PM
Hi, I am trying to take a simple count of the "ScrollElasticsearchHttp" processor output in NiFi, using a QueryRecord processor after it. I created one new property with the SQL "select count(1) from FLOFILE". I am expecting a result.count value of 10000, which is my record count, but it always shows a record.count value of 1. Can someone suggest how I should take the count of this ScrollElasticsearchHttp flow? Thanks!!
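A hedged note on QueryRecord semantics (general NiFi behavior, not something confirmed in this thread): QueryRecord exposes the incoming flowfile as a table named FLOWFILE, and a COUNT query emits a single output record whose field holds the total. The output flowfile's record.count attribute therefore reports 1 (one record), while the actual total sits inside that record. A sketch of such a query, with the result column named for readability:

```sql
-- The incoming flowfile is queried as the table FLOWFILE.
-- This produces ONE record like {"total_records": 10000}; the output
-- flowfile's record.count attribute counts records, hence 1.
SELECT COUNT(1) AS total_records FROM FLOWFILE
```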
Labels:
- Apache NiFi
04-01-2022
03:46 AM
Thanks a lot @araujo, it worked. Is there any option to handle this with multiline as well? We are getting these files from the source in multiline format.
03-31-2022
11:04 PM
I have a JSON file and I have validated it; it is valid JSON. I am creating a Hive table on that file, but when I select data it shows:

```
Error: java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException: Row is not a valid JSON Object - JSONException: A JSONObject text must end with '}' at 2 [character 3 line 1] (state=,code=0)
```

When I tried to set the property below:

```sql
ALTER TABLE test_rahul SET SERDEPROPERTIES ("ignore.malformed.json" = "true");
```

it then shows NULL values for all the fields. Can someone kindly help me?

JSON file:

```json
{
  "buyer": {
    "legalBusinessName": "test1 Company",
    "organisationIdentifications": [
      { "type": "abcd", "identification": "test.bb@tesr" },
      { "type": "TXID", "identification": "12345678" }
    ]
  },
  "supplier": {
    "legalBusinessName": "test Company",
    "organisationIdentifications": [
      { "type": "abcd", "identification": "test28@test" }
    ]
  },
  "paymentRecommendationId": "1234-5678-9876-2212-123456",
  "excludedRemittanceInformation": [],
  "recommendedPaymentInstructions": [{
    "executionDate": "2022-06-12",
    "paymentMethod": "aaaa",
    "remittanceInformation": {
      "structured": [{
        "referredDocumentInformation": [{
          "type": "xxx",
          "number": "12341234",
          "relatedDate": "2022-06-12",
          "paymentDueDate": "2022-06-12",
          "referredDocumentAmount": {
            "remittedAmount": 2600.5,
            "duePayableAmount": 3000
          }
        }]
      }]
    }
  }]
}
```

Jar added:

```sql
ADD JAR json-serde-1.3.7-SNAPSHOT-jar-with-dependencies.jar;
```

Create table:

```sql
CREATE EXTERNAL TABLE IF NOT EXISTS `test`.`testerde11` (
  `buyer` STRUCT<`legalBusinessName`:STRING, `organisationIdentifications`:STRUCT<`type`:STRING, `identification`:STRING>>,
  `supplier` STRUCT<`legalBusinessName`:STRING, `organisationIdentifications`:STRUCT<`type`:STRING, `identification`:STRING>>,
  `paymentRecommendationId` STRING,
  `recommendedPaymentInstructions` ARRAY<STRUCT<
    `executionDate`:STRING,
    `paymentMethod`:STRING,
    `remittanceInformation`:STRUCT<
      `structured`:STRUCT<
        `referredDocumentInformation`:STRUCT<
          `type`:STRING,
          `number`:STRING,
          `relatedDate`:STRING,
          `paymentDueDate`:STRING,
          `referredDocumentAmount`:STRUCT<`remittedAmount`:DOUBLE, `duePayableAmount`:INT>>>>>>)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
WITH SERDEPROPERTIES ("field.delim"=",", "mapping.ts"="number")
STORED AS TEXTFILE
LOCATION '/user/hdfs/Jsontest/';
```

Error message:

```sql
select * from test.testerde11;
```

```
Error: java.io.IOException: org.apache.hadoop.hive.serde2.SerDeException: Row is not a valid JSON Object - JSONException: A JSONObject text must end with '}' at 2 [character 3 line 1] (state=,code=0)
```

I tried multiple options but none worked. Can someone suggest what the issue is here, or whether I need to change the delimiter or add new properties?
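A hedged observation, since this SerDe family reads one JSON object per line of the text file: the "must end with '}'" error typically appears when the file is pretty-printed across multiple lines. As a sketch (assuming the file holds a single JSON document; the sample content below is abridged from the post), the document could be compacted to one line before loading into HDFS:

```python
import json

def compact_json(text: str) -> str:
    """Parse a (possibly pretty-printed) JSON document and re-serialize
    it as a single line, as line-oriented SerDes expect."""
    return json.dumps(json.loads(text), separators=(",", ":"))

# Abridged sample from the post, pretty-printed across lines.
pretty = """{
  "paymentRecommendationId": "1234-5678-9876-2212-123456",
  "excludedRemittanceInformation": []
}"""

one_line = compact_json(pretty)
print(one_line)
# {"paymentRecommendationId":"1234-5678-9876-2212-123456","excludedRemittanceInformation":[]}
```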
Labels:
- Apache Hive
09-22-2021
12:52 AM
Hi everyone, I have Hive partition folders at an HDFS location, but all the partition folder names are in uppercase, i.e. YEAR=2021/MONTH=07/DAY=31/HOUR=00. When I create the table in Hive, it takes the partition columns in lowercase, like /year=2021/month=07/day=31/hour=00. Since HDFS is case sensitive and Hive is case insensitive, Hive expects the partition columns in lowercase at the HDFS location, and I am not able to see any partitions in my Hive table. Is there any way to handle this case: either keep the Hive columns in uppercase, or recursively rename all HDFS partition folders to lowercase? I have 8,000+ partitions (24 hours × 30 days × 12 months = 8,640 for one year), so I am not able to rename every folder manually. Could someone kindly suggest?
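One scriptable approach (a sketch, not a tested production tool) is to compute the lowercase-key target path for every partition directory and issue a rename for each. The helper below lowercases only the key part of key=value segments, leaving values and non-partition segments untouched; the actual move via `hdfs dfs -mv` is an assumption about the environment and is shown only as a comment:

```python
def lowercase_partition_keys(path: str) -> str:
    """Lowercase only the key names in key=value path segments,
    e.g. /data/YEAR=2021/MONTH=07 -> /data/year=2021/month=07."""
    segments = []
    for seg in path.split("/"):
        if "=" in seg:
            key, value = seg.split("=", 1)
            segments.append(key.lower() + "=" + value)
        else:
            segments.append(seg)
    return "/".join(segments)

src = "/user/hdfs/table/YEAR=2021/MONTH=07/DAY=31/HOUR=00"
dst = lowercase_partition_keys(src)
print(dst)  # /user/hdfs/table/year=2021/month=07/day=31/hour=00
# Each rename could then be issued as, e.g.:
#   hdfs dfs -mv "$src" "$dst"
```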
03-16-2020
05:22 AM
I'm also getting the same issue, but for me, instead of taking my username, it shows the user as Anonymous. Can anyone suggest how I should resolve this issue?