Member since: 05-20-2019
Posts: 17
Kudos Received: 0
Solutions: 0
11-10-2021
07:47 AM
Hi all, how can I find the number of messages published to a Kafka topic per hour? When I go to the Grafana dashboard from Ambari, I can see "messages in/s". Does this metric give the number of messages in the Kafka topic for that time frame?
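The "messages in/s" panel shows a rate, not an hourly total. A hedged sketch of one way to get an hourly count, assuming a broker reachable on the default HDP port 6667 and a hypothetical topic name my-topic, is to snapshot the latest offsets with GetOffsetShell and diff the sums taken one hour apart:
# Sum of latest offsets across partitions; run again one hour later and subtract the two sums
/usr/hdp/current/kafka-broker/bin/kafka-run-class.sh kafka.tools.GetOffsetShell \
  --broker-list broker1:6667 --topic my-topic --time -1 \
  | awk -F':' '{sum += $3} END {print sum}'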
08-20-2020
03:45 AM
My Spark job is failing with the error below; I see these errors in the YARN application log:
20/08/19 20:38:10 ERROR Executor: Exception in task 1481.0 in stage 60.0 (TID 43008) java.io.IOException: Filesystem closed
ERROR CoarseGrainedExecutorBackend: RECEIVED SIGNAL TERM
Can anyone help me understand what could be the cause of the above errors in the YARN application log?
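For context, "Filesystem closed" in an executor usually means the JVM-wide cached HDFS FileSystem instance was closed while tasks were still using it, and it often appears together with executor shutdown (the RECEIVED SIGNAL TERM line). A hedged workaround sometimes suggested, sketched below with placeholder arguments, is to disable the FileSystem cache so each caller gets its own instance; verify it against your workload before relying on it:
spark-submit \
  --conf spark.hadoop.fs.hdfs.impl.disable.cache=true \
  <your existing spark-submit arguments>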
Labels:
- Apache Spark
- Apache YARN
07-07-2020
04:53 AM
Hi all, can someone help with a document, steps, or formula to calculate the DataNode heap size when configuring a fresh cluster?
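Not an official formula, but a hedged rule of thumb is to start around 4 GB of DataNode heap and add roughly 1 GB per additional million block replicas the node stores. The sketch below just encodes that assumption; the replica count is a placeholder, so check fsck or the NameNode UI for the real number:
# Hedged sizing sketch: ~4 GB base + ~1 GB per million replicas above 4 million (assumption, not a vendor formula)
REPLICAS_PER_DN=6000000            # placeholder: block replicas stored on one DataNode
EXTRA_M=$(( (REPLICAS_PER_DN - 4000000) / 1000000 ))
[ "$EXTRA_M" -lt 0 ] && EXTRA_M=0
echo "Suggested DataNode heap: $((4 + EXTRA_M)) GB"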
Tags:
- HDFS
Labels:
- HDFS
06-29-2020
08:25 AM
Hi @Govins, thanks for your reply! Is there any particular property that needs to be checked for the configuration related to data storage?
06-29-2020
07:48 AM
Hi team, in our cluster, on all the DataNodes one particular disk, grid0, shows higher utilization than the other disks (grid1, grid2, grid3, ...). When I run df -h on the DataNodes I see:
datanode1: grid0 consumes 80%, grid1 consumes 60%, grid2 consumes 63%, ... (same pattern up to grid11)
datanode2: grid0 consumes 80%, grid1 consumes 65%, ... (same pattern up to grid11)
Why is grid0 alone showing higher utilization? Will running the balancer fix this issue? Any help is appreciated. Thanks
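For what it is worth, the standard HDFS Balancer evens data out across DataNodes, not across the disks inside one DataNode, so it is unlikely to fix per-disk skew by itself. Skew on a single volume often comes from non-HDFS data (logs, YARN local/usercache dirs, OS files) living on that mount, or from how volumes were added over time. A hedged check, assuming grid0 is mounted at /grid0 (paths are placeholders):
# See what besides HDFS block data is filling grid0 (adjust paths to your mounts)
du -sh /grid0/* 2>/dev/null | sort -h | tail
# Hadoop 3.x has an intra-DataNode disk balancer (hdfs diskbalancer -plan / -execute);
# it is not available on Hadoop 2.7.x, so on HDP 2.x the usual options are cleaning up non-HDFS data or careful manual moves.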
Tags:
- HDFS
Labels:
- HDFS
06-10-2020
06:36 AM
Is there a way I can make an HDFS block corrupted?
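A hedged sketch of one way this is commonly simulated on a test cluster (destructive; the file path, block id, and data directories below are placeholders): locate a block with fsck, then damage the block file on the DataNode's local disk so its checksum no longer matches:
# 1) Find the block id and which DataNode holds it (path is a placeholder)
hdfs fsck /tmp/testfile -files -blocks -locations
# 2) On that DataNode, find the block file under dfs.datanode.data.dir and append garbage to it
find /grid*/hadoop/hdfs/data -name 'blk_1073741825' 2>/dev/null      # block id is a placeholder
# echo garbage >> <path-to-block-file>                               # only on a test cluster!
# 3) Reading the file or waiting for the block scanner should now flag it
hdfs fsck / -list-corruptfileblocks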
Labels:
- HDFS
05-25-2020
09:53 PM
Hi @Madhur, thanks for responding! 1) Is the cluster on which you are planning to change the password Kerberized? Yes, the cluster is Kerberized. 2) Do you use CM or Ambari for managing your cluster? I use Ambari for managing the cluster.
05-20-2020
02:17 AM
04-20-2020
06:15 AM
How can I compare the cluster configurations of two HDP clusters using the curl API? Any help is appreciated. Thanks in advance!
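A hedged sketch using the Ambari REST API (hostnames, cluster names, credentials, and the chosen config type are placeholders): find the current tag for a config type on each cluster, pull the properties for that tag, and diff the JSON:
# 1) Current config tags for each cluster
curl -s -u admin:admin -H 'X-Requested-By: ambari' \
  "http://ambari1:8080/api/v1/clusters/cluster1?fields=Clusters/desired_configs"
# 2) Pull hdfs-site (for example) at its current tag from both clusters and diff
curl -s -u admin:admin -H 'X-Requested-By: ambari' \
  "http://ambari1:8080/api/v1/clusters/cluster1/configurations?type=hdfs-site&tag=<tag1>" > c1.json
curl -s -u admin:admin -H 'X-Requested-By: ambari' \
  "http://ambari2:8080/api/v1/clusters/cluster2/configurations?type=hdfs-site&tag=<tag2>" > c2.json
diff c1.json c2.json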
04-17-2020
04:57 AM
How can I compare two Cloudera CDH clusters? Is there any script available to compare two clusters? Thanks in advance.
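A hedged sketch using the Cloudera Manager REST API (CM hosts, API version, credentials, cluster and service names are placeholders): export each service's full configuration from both CMs and diff the output:
curl -s -u admin:admin "http://cm1:7180/api/v19/clusters/cluster1/services/hdfs/config?view=full" > cdh1.json
curl -s -u admin:admin "http://cm2:7180/api/v19/clusters/cluster2/services/hdfs/config?view=full" > cdh2.json
diff cdh1.json cdh2.json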
Labels:
- Cloudera Manager
04-16-2020
06:07 AM
@stevenmatison Thanks for the update. The command you mention will change the owner and group only for the path we give; it will not change the owner and group for all the files under the directory. I am looking for a curl approach to change the owner and group recursively.
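Since WebHDFS SETOWNER has no recursive flag, one hedged workaround (a sketch only; NameNode host/port, owner, group, and path are placeholders, and as written it descends just one level) is to list the directory with LISTSTATUS and issue SETOWNER per entry:
NN=http://namenode:50070; DIR=/data/dir; OWNER=newowner; GROUP=newgroup   # placeholders
# the directory itself (append &user.name=<user> on simple auth, or use curl --negotiate -u : when Kerberized)
curl -s -X PUT "$NN/webhdfs/v1$DIR?op=SETOWNER&owner=$OWNER&group=$GROUP"
# its immediate children; wrap in a function and recurse for deeper trees
for f in $(curl -s "$NN/webhdfs/v1$DIR?op=LISTSTATUS" | grep -o '"pathSuffix":"[^"]*"' | cut -d'"' -f4); do
  curl -s -X PUT "$NN/webhdfs/v1$DIR/$f?op=SETOWNER&owner=$OWNER&group=$GROUP"
done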
04-15-2020
09:33 PM
What would be the equivalent curl command for: hdfs dfs -chown -R <owner> <path>
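For the non-recursive case, the WebHDFS equivalent is the SETOWNER operation, sketched here with placeholder host, path, owner, and group:
curl -i -X PUT "http://namenode:50070/webhdfs/v1/path/to/dir?op=SETOWNER&owner=newowner&group=newgroup"
# add &user.name=<user> on a simple-auth cluster, or use curl --negotiate -u : when Kerberized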
Tags:
- Curl command
- HDFS
Labels:
- HDFS
04-13-2020
09:50 PM
Hi @jsensharma, thanks for your comments! I believe I am using a lower version of Hadoop, which is why I am facing the issue.

Hadoop 2.7.3.2.6.5.0-292
Subversion git@github.com:hortonworks/hadoop.git -r 3091053c59a62c82d82c9f778c48bde5ef0a89a1
Compiled by jenkins on 2018-05-11T07:53Z
Compiled with protoc 2.5.0
From source with checksum abed71da5bc89062f6f6711179f2058
This command was run using /usr/hdp/2.6.5.0-292/hadoop/hadoop-common-2.7.3.2.6.5.0-292.jar

# javap -cp /usr/hdp/2.6.5.0-292/hadoop-hdfs//hadoop-hdfs-2.7.3.2.6.5.0-292.jar org.apache.hadoop.hdfs.web.resources.PutOpParam.Op
Compiled from "PutOpParam.java"
public final class org.apache.hadoop.hdfs.web.resources.PutOpParam$Op extends java.lang.Enum<org.apache.hadoop.hdfs.web.resources.PutOpParam$Op> implements org.apache.hadoop.hdfs.web.resources.HttpOpParam$Op {
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CREATE;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op MKDIRS;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CREATESYMLINK;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op RENAME;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETREPLICATION;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETOWNER;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETPERMISSION;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETTIMES;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op RENEWDELEGATIONTOKEN;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CANCELDELEGATIONTOKEN;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op MODIFYACLENTRIES;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEACLENTRIES;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEDEFAULTACL;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEACL;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETACL;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETXATTR;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEXATTR;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CREATESNAPSHOT;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op RENAMESNAPSHOT;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op NULL;
  final boolean doOutputAndRedirect;
  final int expectedHttpResponseCode;
  final boolean requireAuth;
  public static org.apache.hadoop.hdfs.web.resources.PutOpParam$Op[] values();
  public static org.apache.hadoop.hdfs.web.resources.PutOpParam$Op valueOf(java.lang.String);
  public org.apache.hadoop.hdfs.web.resources.HttpOpParam$Type getType();
  public boolean getRequireAuth();
  public boolean getDoOutput();
  public boolean getRedirect();
  public int getExpectedHttpResponseCode();
  public java.lang.String toQueryString();
  static {};
}
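That matches the javap output: this PutOpParam.Op enum has no ALLOWSNAPSHOT/DISALLOWSNAPSHOT constants, so WebHDFS on this Hadoop 2.7.x build does not support those operations (they were added to WebHDFS only in later Hadoop releases). A hedged fallback, using /dummy as a placeholder path, is to do it on the cluster with dfsadmin:
# run as the HDFS superuser
hdfs dfsadmin -allowSnapshot /dummy
hdfs dfsadmin -disallowSnapshot /dummy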
04-13-2020
08:36 PM
Hi all, I am trying to make a directory a snapshottable directory using the curl API. When I run the curl for ALLOWSNAPSHOT it throws the error below. Any help is appreciated. Thanks in advance!

curl -i -X PUT "http://internal:50070/webhdfs/v1/dummy/?op=DISALLOWSNAPSHOT"
HTTP/1.1 400 Bad Request
Cache-Control: no-cache
Expires: Tue, 14 Apr 2020 03:26:30 GMT
Date: Tue, 14 Apr 2020 03:26:30 GMT
Pragma: no-cache
Expires: Tue, 14 Apr 2020 03:26:30 GMT
Date: Tue, 14 Apr 2020 03:26:30 GMT
Pragma: no-cache
Content-Type: application/json
X-FRAME-OPTIONS: SAMEORIGIN
Transfer-Encoding: chunked
Server: Jetty(6.1.26.hwx)

{"RemoteException":{"exception":"IllegalArgumentException","javaClassName":"java.lang.IllegalArgumentException","message":"Invalid value for webhdfs parameter \"op\": No enum constant org.apache.hadoop.hdfs.web.resources.PutOpParam.Op.DISALLOWSNAPSHOT"}}
Labels:
- HDFS