Member since: 06-07-2016
Posts: 923
Kudos Received: 322
Solutions: 115
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3286 | 10-18-2017 10:19 PM
 | 3645 | 10-18-2017 09:51 PM
 | 13324 | 09-21-2017 01:35 PM
 | 1350 | 08-04-2017 02:00 PM
 | 1734 | 07-31-2017 03:02 PM
11-15-2023
09:55 AM
Hello everyone, I updated the krb5-* packages to their latest version and the problem was resolved. Regards.
10-11-2023
09:46 AM
@Srinivascnu As this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post. Thanks.
08-10-2023
09:48 PM
Try this option:

[serviceaccount@edgenode ~]$ hdfs getconf -confKey dfs.nameservices
hadoopcdhnn
[serviceaccount@edgenode ~]$ hdfs getconf -confKey dfs.ha.namenodes.hadoopcdhnn
namenode5605,namenode5456
[serviceaccount@edgenode ~]$ hdfs haadmin -getServiceState namenode5605
active
[serviceaccount@edgenode ~]$ hdfs haadmin -getServiceState namenode5456
standby
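In short, hdfs getconf returns the configured nameservice and its NameNode IDs, and hdfs haadmin -getServiceState <namenode-id> then reports whether each one is active or standby.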
07-27-2023
07:42 AM
Is this solution a better fit than PutHive3QL for streaming about 10 GB per day?
11-20-2022
09:33 PM
@Ishi as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post.
06-14-2022
12:59 AM
Hi @Tremo, as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post.
02-27-2022
11:00 PM
1 Kudo
This solution works for me on HDP 3.1.4 / Ambari 2.7. Thanks for sharing.
06-09-2021
09:25 AM
I had a similar issue. You may want to check Ranger > Audit > Plugin Status to see if the policy is being applied. If it's not, it may be that you have a jersey-client classpath conflict. More details here: https://jonmorisissqlblog.blogspot.com/2021/06/ranger-hive-policy-activation-time.html
03-03-2021
09:51 PM
I have a tab-separated file like this:

Copyright details 2021
ID \t NUMBER \t ADDRESS \t ZIPCODE
10 \t 9877777777 \t India \t 400322
13 \t 2983359852 \t AUS \t 84534
26 \t 7832724424
34 \t 8238444294 \t RSA \t 74363

The first row is a comment, and the row with ID 26 is missing its trailing column values; it does not even end with a \t. So I need to read the file skipping the first line and handle the missing delimiters at the end of short rows. I tried this:

import org.apache.spark.sql.DataFrame
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._
val data = sc.textFile("sample_badtsv.txt")
val comments = data.first()
val fdata = data.filter(x => x != comments)
val header = fdata.filter(x => x.split("\t")(1) == "NUMBER").collect().mkString
val df = fdata.filter(x => x.split("\t")(1) != "NUMBER")
  .map(x => x.split("\t"))
  .map(x => (x(0), x(1), x(2), x(3)))
  .toDF(header.split("\t"): _*)

Because the \t characters are missing at the end of short lines, I get an ArrayIndexOutOfBoundsException: when converting the RDD to a DataFrame, some rows have fewer fields than the header. Please suggest a better solution that skips the first line and reads the file correctly, treating the missing trailing columns as NULL, like this:

ID | NUMBER | ADDRESS | ZIPCODE
---|---|---|---
10 | 9877777777 | India | 400322
13 | 2983359852 | AUS | 84534
26 | 7832724424 | NULL | NULL
34 | 8238444294 | RSA | 74363
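One way to handle this (a minimal sketch, not from the original thread) is to pad each split row to the expected number of columns, so short rows become NULLs instead of throwing ArrayIndexOutOfBoundsException. It assumes the same sample_badtsv.txt layout and the Spark 1.x-style SQLContext used above:

import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

val data = sc.textFile("sample_badtsv.txt")
val comments = data.first()                              // first line is the copyright comment
val fdata = data.filter(_ != comments)
val header = fdata.first().split("\t")                   // ID, NUMBER, ADDRESS, ZIPCODE

val df = fdata
  .filter(_.split("\t")(1) != "NUMBER")                  // drop the header row itself
  .map { line =>
    val cols = line.split("\t").padTo(4, null: String)   // pad short rows with nulls
    (cols(0), cols(1), cols(2), cols(3))
  }
  .toDF(header: _*)

df.show()

Padding right after the split keeps the fix local to the parsing step; the rest of the original approach stays the same.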
11-11-2020
01:20 AM
You can try this: ${message:unescapeXml()}. This function unescapes a string containing XML entity escapes into a string containing the actual Unicode characters corresponding to those escapes.
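For example, with a hypothetical attribute value: if message is &lt;id&gt;42&lt;/id&gt;, then ${message:unescapeXml()} evaluates to <id>42</id>.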