Member since
05-20-2019
17
Posts
0
Kudos Received
0
Solutions
11-10-2021
07:47 AM
Hi all, how can I find the number of messages published to a Kafka topic per hour? When I go to the Grafana dashboard from Ambari, I can see a "messages in/s" metric. Does this metric give the number of messages in the Kafka topic for that time frame?
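One way to get an exact hourly count, rather than integrating the "messages in/s" rate: sample the topic's end offsets once per hour (for example with `kafka-run-class kafka.tools.GetOffsetShell --time -1`) and take the difference between consecutive samples, since end offsets only grow per partition. A minimal sketch of that arithmetic; the function name and the offset numbers below are illustrative, not from a real cluster:

```python
# Sketch: derive messages published per hour from hourly snapshots of a
# topic's end offsets. Assumes offsets only grow (no topic re-creation).

def messages_per_hour(snapshots):
    """snapshots: list of (hour_label, {partition: end_offset}) taken hourly.
    Returns a list of (hour_label, message_count) for each interval."""
    rates = []
    for (_, prev), (label, cur) in zip(snapshots, snapshots[1:]):
        # Sum end offsets across partitions; the delta between two hourly
        # snapshots is the number of messages produced in that hour.
        delta = sum(cur.values()) - sum(prev.values())
        rates.append((label, delta))
    return rates

# Example with made-up offsets for a 2-partition topic:
snaps = [
    ("09:00", {0: 1000, 1: 1200}),
    ("10:00", {0: 1600, 1: 1900}),
    ("11:00", {0: 2100, 1: 2500}),
]
print(messages_per_hour(snaps))  # [('10:00', 1300), ('11:00', 1100)]
```

The Grafana metric you see is a rate meter (messages in per second), so it shows throughput at each sample point; averaging it over an hour only approximates the count, while the offset delta above is exact.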
08-20-2020
03:45 AM
My Spark job is failing with the error below; I see the following errors in the YARN application log:
20/08/19 20:38:10 ERROR Executor: Exception in task 1481.0 in stage 60.0 (TID 43008) java.io.IOException: Filesystem closed
ERROR CoarseGrainedExecutorBackend: RECEIVED SIGNAL TERM
Can anyone help me understand the cause of the above errors in the YARN application log?
Labels:
- Apache Spark
- Apache YARN
07-07-2020
04:53 AM
Hi all, can someone share the documentation, steps, or formula to calculate the DataNode heap size when configuring a fresh cluster?
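I don't have the official formula at hand, but a commonly cited rule of thumb in HDP/Cloudera tuning guidance sizes the DataNode heap from the number of block replicas the node can hold: roughly 1 GB of heap per million replicas, with a practical floor of about 4 GB. The sketch below encodes that rule; the constants are assumptions for illustration, so verify them against your distribution's sizing guide before using them for a real cluster:

```python
# Rough DataNode heap sizing sketch. The 1-GB-per-million-replicas rule of
# thumb and the 4 GB floor are assumptions taken from common HDP/Cloudera
# tuning advice -- verify against your distribution's sizing guide.

def datanode_heap_gb(node_capacity_tb, avg_block_size_mb=128,
                     gb_per_million_replicas=1.0, min_heap_gb=4.0):
    """Estimate DataNode heap (GB) from the number of block replicas the
    node could hold if its disks filled up."""
    capacity_mb = node_capacity_tb * 1024 * 1024
    max_replicas = capacity_mb / avg_block_size_mb
    heap = (max_replicas / 1_000_000) * gb_per_million_replicas
    return max(round(heap, 1), min_heap_gb)

# A 48 TB node with healthy 128 MB blocks holds ~393k replicas, so the
# 4 GB floor applies:
print(datanode_heap_gb(48))  # 4.0
# Lots of small files (avg 10 MB per block) push the same node past 5M
# replicas, so the heap estimate grows:
print(datanode_heap_gb(48, avg_block_size_mb=10))  # 5.0
```

The takeaway of the rule of thumb is that DataNode heap scales with replica count, not raw disk size, which is why clusters with many small files need more heap than the defaults.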
Labels:
- HDFS
05-25-2020
09:53 PM
Hi @Madhur, thanks for responding! 1) Is the cluster on which you are planning to change the password kerberized? Yes, the cluster is kerberized. 2) Do you use CM or Ambari for managing your cluster? I use Ambari for managing the cluster.
05-20-2020
02:17 AM
If I change the password for the hdfs service account, will it affect any of the services such as HDFS, YARN, HBase, or Hive?
Labels:
- HDFS
04-13-2020
09:50 PM
Hi @jsensharma, thanks for your comments! I believe I am using a lower version of Hadoop, which is why I am facing the issue:

Hadoop 2.7.3.2.6.5.0-292
Subversion git@github.com:hortonworks/hadoop.git -r 3091053c59a62c82d82c9f778c48bde5ef0a89a1
Compiled by jenkins on 2018-05-11T07:53Z
Compiled with protoc 2.5.0
From source with checksum abed71da5bc89062f6f6711179f2058
This command was run using /usr/hdp/2.6.5.0-292/hadoop/hadoop-common-2.7.3.2.6.5.0-292.jar

# javap -cp /usr/hdp/2.6.5.0-292/hadoop-hdfs/hadoop-hdfs-2.7.3.2.6.5.0-292.jar org.apache.hadoop.hdfs.web.resources.PutOpParam.Op
Compiled from "PutOpParam.java"
public final class org.apache.hadoop.hdfs.web.resources.PutOpParam$Op extends java.lang.Enum<org.apache.hadoop.hdfs.web.resources.PutOpParam$Op> implements org.apache.hadoop.hdfs.web.resources.HttpOpParam$Op {
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CREATE;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op MKDIRS;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CREATESYMLINK;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op RENAME;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETREPLICATION;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETOWNER;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETPERMISSION;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETTIMES;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op RENEWDELEGATIONTOKEN;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CANCELDELEGATIONTOKEN;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op MODIFYACLENTRIES;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEACLENTRIES;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEDEFAULTACL;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEACL;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETACL;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETXATTR;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEXATTR;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CREATESNAPSHOT;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op RENAMESNAPSHOT;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op NULL;
  final boolean doOutputAndRedirect;
  final int expectedHttpResponseCode;
  final boolean requireAuth;
  public static org.apache.hadoop.hdfs.web.resources.PutOpParam$Op[] values();
  public static org.apache.hadoop.hdfs.web.resources.PutOpParam$Op valueOf(java.lang.String);
  public org.apache.hadoop.hdfs.web.resources.HttpOpParam$Type getType();
  public boolean getRequireAuth();
  public boolean getDoOutput();
  public boolean getRedirect();
  public int getExpectedHttpResponseCode();
  public java.lang.String toQueryString();
  static {};
}

Note that ALLOWSNAPSHOT and DISALLOWSNAPSHOT do not appear anywhere in the enum above.
04-13-2020
08:36 PM
Hi all, I am trying to mark a directory as a snapshottable directory using the WebHDFS REST API. When I run the curl command for ALLOWSNAPSHOT, it throws the error below. Any help is appreciated, thanks in advance!

curl -i -X PUT "http://internal:50070/webhdfs/v1/dummy/?op=DISALLOWSNAPSHOT"

HTTP/1.1 400 Bad Request
Cache-Control: no-cache
Expires: Tue, 14 Apr 2020 03:26:30 GMT
Date: Tue, 14 Apr 2020 03:26:30 GMT
Pragma: no-cache
Expires: Tue, 14 Apr 2020 03:26:30 GMT
Date: Tue, 14 Apr 2020 03:26:30 GMT
Pragma: no-cache
Content-Type: application/json
X-FRAME-OPTIONS: SAMEORIGIN
Transfer-Encoding: chunked
Server: Jetty(6.1.26.hwx)

{"RemoteException":{"exception":"IllegalArgumentException","javaClassName":"java.lang.IllegalArgumentException","message":"Invalid value for webhdfs parameter \"op\": No enum constant org.apache.hadoop.hdfs.web.resources.PutOpParam.Op.DISALLOWSNAPSHOT"}}
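For context on the 400: the NameNode parses the `op` query parameter straight into its PutOpParam.Op enum, so any op that the server's Hadoop version does not know is rejected before anything runs. A small client-side sketch of that validation, using the op list from the Hadoop 2.7.3 javap dump as the known set (this is an illustrative check, not the actual server code):

```python
# Client-side sanity check mirroring what the NameNode does with the "op"
# query parameter: an unknown op fails before any HTTP request is made.
# The op set below is taken from the javap dump of Hadoop 2.7.3's
# PutOpParam.Op; newer releases add ALLOWSNAPSHOT/DISALLOWSNAPSHOT.

HADOOP_2_7_PUT_OPS = {
    "CREATE", "MKDIRS", "CREATESYMLINK", "RENAME", "SETREPLICATION",
    "SETOWNER", "SETPERMISSION", "SETTIMES", "RENEWDELEGATIONTOKEN",
    "CANCELDELEGATIONTOKEN", "MODIFYACLENTRIES", "REMOVEACLENTRIES",
    "REMOVEDEFAULTACL", "REMOVEACL", "SETACL", "SETXATTR", "REMOVEXATTR",
    "CREATESNAPSHOT", "RENAMESNAPSHOT",
}

def webhdfs_put_url(host, path, op, port=50070):
    """Build a WebHDFS PUT URL, refusing ops this server version lacks."""
    if op.upper() not in HADOOP_2_7_PUT_OPS:
        raise ValueError(f"op {op!r} not supported by this Hadoop version")
    return f"http://{host}:{port}/webhdfs/v1{path}?op={op.upper()}"

print(webhdfs_put_url("internal", "/dummy", "CREATESNAPSHOT"))
# webhdfs_put_url("internal", "/dummy", "DISALLOWSNAPSHOT") raises
# ValueError here, matching the server's RemoteException.
```

On 2.7.x the usual workaround is to run `hdfs dfsadmin -allowSnapshot /dummy` from the command line instead, since the dfsadmin RPC path does not go through the WebHDFS op enum.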
Labels:
- HDFS