Curl throws an error when running ALLOWSNAPSHOT

Explorer

Hi all,

I am trying to make a directory snapshottable using the WebHDFS curl API. When I run the curl call for ALLOWSNAPSHOT, it throws the error below.

Any help is appreciated.

Thanks in advance!

 

 

curl -i -X PUT "http://internal:50070/webhdfs/v1/dummy/?op=DISALLOWSNAPSHOT"
HTTP/1.1 400 Bad Request
Cache-Control: no-cache
Expires: Tue, 14 Apr 2020 03:26:30 GMT
Date: Tue, 14 Apr 2020 03:26:30 GMT
Pragma: no-cache
Expires: Tue, 14 Apr 2020 03:26:30 GMT
Date: Tue, 14 Apr 2020 03:26:30 GMT
Pragma: no-cache
Content-Type: application/json
X-FRAME-OPTIONS: SAMEORIGIN
Transfer-Encoding: chunked
Server: Jetty(6.1.26.hwx)

{"RemoteException":{"exception":"IllegalArgumentException","javaClassName":"java.lang.IllegalArgumentException","message":"Invalid value for webhdfs parameter \"op\": No enum constant org.apache.hadoop.hdfs.web.resources.PutOpParam.Op.DISALLOWSNAPSHOT"}}[

1 ACCEPTED SOLUTION

Master Mentor

@sarm 
What is your HDFS version? Is it Hadoop 2.8.0, 3.0.0-alpha1 or higher?

 

 

# hadoop version

 

 

Here is a quick check of what the JAR contains:

# javap -cp /usr/hdp/3.1.0.0-78/hadoop/client/hadoop-hdfs-client.jar org.apache.hadoop.hdfs.web.resources.PutOpParam.Op | grep -i ALLOW

  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op ALLOWSNAPSHOT;
  public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op DISALLOWSNAPSHOT;

 

For example, I am able to use the same WebHDFS API call without any issue, as follows:

 

# curl -i -X PUT "http://kerlatest1.example.com:50070/webhdfs/v1/tmp/aaaa_bbbb?op=DISALLOWSNAPSHOT&user.name=hdfs"

HTTP/1.1 200 OK
Date: Tue, 14 Apr 2020 03:45:24 GMT
Cache-Control: no-cache
Expires: Tue, 14 Apr 2020 03:45:24 GMT
Date: Tue, 14 Apr 2020 03:45:24 GMT
Pragma: no-cache
X-FRAME-OPTIONS: SAMEORIGIN
Set-Cookie: hadoop.auth="u=hdfs&p=hdfs&t=simple&e=1586871924286&s=xxxxxxxx/yyyyyyyyy="; Path=/; HttpOnly
Content-Type: application/octet-stream
Content-Length: 0
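
For completeness, the related calls follow the same pattern. A minimal sketch, reusing the example host and path from above (the snapshot name "snap1" is only a placeholder): ALLOWSNAPSHOT marks the directory as snapshottable, and CREATESNAPSHOT then takes a snapshot of it.

# curl -i -X PUT "http://kerlatest1.example.com:50070/webhdfs/v1/tmp/aaaa_bbbb?op=ALLOWSNAPSHOT&user.name=hdfs"
# curl -i -X PUT "http://kerlatest1.example.com:50070/webhdfs/v1/tmp/aaaa_bbbb?op=CREATESNAPSHOT&snapshotname=snap1&user.name=hdfs"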

 

 


Please refer to the following JIRA to verify that you are using a version of HDFS (2.8.0, 3.0.0-alpha1, or higher) in which this option is available.

Reference:
https://issues.apache.org/jira/browse/HDFS-9057
https://cwiki.apache.org/confluence/display/HADOOP/Hadoop+2.8.0+Release (look for HDFS-9057)


2 REPLIES


Explorer

Hi @jsensharma,

Thanks for your comments!

I believe I am using a lower version of Hadoop, which is why I am facing this issue.

 

Hadoop 2.7.3.2.6.5.0-292
Subversion git@github.com:hortonworks/hadoop.git -r 3091053c59a62c82d82c9f778c48bde5ef0a89a1
Compiled by jenkins on 2018-05-11T07:53Z
Compiled with protoc 2.5.0
From source with checksum abed71da5bc89062f6f6711179f2058
This command was run using /usr/hdp/2.6.5.0-292/hadoop/hadoop-common-2.7.3.2.6.5.0-292.jar

 

# javap -cp /usr/hdp/2.6.5.0-292/hadoop-hdfs/hadoop-hdfs-2.7.3.2.6.5.0-292.jar org.apache.hadoop.hdfs.web.resources.PutOpParam.Op
Compiled from "PutOpParam.java"
public final class org.apache.hadoop.hdfs.web.resources.PutOpParam$Op extends java.lang.Enum<org.apache.hadoop.hdfs.web.resources.PutOpParam$Op> implements org.apache.hadoop.hdfs.web.resources.HttpOpParam$Op {
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CREATE;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op MKDIRS;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CREATESYMLINK;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op RENAME;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETREPLICATION;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETOWNER;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETPERMISSION;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETTIMES;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op RENEWDELEGATIONTOKEN;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CANCELDELEGATIONTOKEN;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op MODIFYACLENTRIES;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEACLENTRIES;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEDEFAULTACL;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEACL;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETACL;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op SETXATTR;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op REMOVEXATTR;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op CREATESNAPSHOT;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op RENAMESNAPSHOT;
public static final org.apache.hadoop.hdfs.web.resources.PutOpParam$Op NULL;
final boolean doOutputAndRedirect;
final int expectedHttpResponseCode;
final boolean requireAuth;
public static org.apache.hadoop.hdfs.web.resources.PutOpParam$Op[] values();
public static org.apache.hadoop.hdfs.web.resources.PutOpParam$Op valueOf(java.lang.String);
public org.apache.hadoop.hdfs.web.resources.HttpOpParam$Type getType();
public boolean getRequireAuth();
public boolean getDoOutput();
public boolean getRedirect();
public int getExpectedHttpResponseCode();
public java.lang.String toQueryString();
static {};
}
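
For anyone hitting the same thing on Hadoop 2.7.x (HDP 2.6.x): since HDFS-9057 is not included in 2.7.3, the ALLOWSNAPSHOT/DISALLOWSNAPSHOT WebHDFS ops are simply missing on that build. If upgrading is not an option, the same effect should be achievable with the HDFS admin CLI instead. A sketch, assuming HDFS superuser access and using the /dummy path from the original question:

# sudo -u hdfs hdfs dfsadmin -allowSnapshot /dummy
# sudo -u hdfs hdfs dfsadmin -disallowSnapshot /dummy
# sudo -u hdfs hdfs lsSnapshottableDir

The dfsadmin commands toggle whether snapshots may be taken under the directory, and lsSnapshottableDir lists the directories that are currently snapshottable.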