
Problem with Exporting a Hbase Snapshot to Another Cluster



Hi guys,
I have a small Cloudera cluster running CDH 4.5 on CentOS 6, and a second, smaller test cluster running the latest up-to-date CDH 5.x.x on Ubuntu 14.04. I have enabled HBase snapshots on both and am trying to export a snapshot from the CDH 4.5 cluster to the CDH 5 cluster, but I get the following error:

-bash-4.1$ hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot -snapshot snapshotOFanprs -copy-to hdfs://CDH5:8020/hbase
Exception in thread "main" java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "CDH4"; destination host is: "CDH5":8020;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
at org.apache.hadoop.ipc.Client.call(Client.java:1242)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at $Proxy10.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at $Proxy10.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:629)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1545)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:820)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1378)
at org.apache.hadoop.hbase.snapshot.ExportSnapshot.run(ExportSnapshot.java:620)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.hbase.snapshot.ExportSnapshot.innerMain(ExportSnapshot.java:707)
at org.apache.hadoop.hbase.snapshot.ExportSnapshot.main(ExportSnapshot.java:711)
Caused by: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status
at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:949)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:847)

I also tried changing the port number to 8022 (hdfs://CDH5:8022/hbase), but I get the same error.
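
For reference, the exact variant I ran for that attempt (same snapshot name, only the NameNode port changed) was:

-bash-4.1$ hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot -snapshot snapshotOFanprs -copy-to hdfs://CDH5:8022/hbase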

 

I have searched for a solution on the internet but can't find one. If you have any idea, please help.
Thank you in any case.
