Member since
02-12-2020
40
Posts
0
Kudos Received
1
Solution
My Accepted Solutions
Title | Views | Posted
--- | --- | ---
| 4339 | 03-09-2020 05:24 AM
03-16-2020
03:24 AM
Hi Team,
We are unable to start HBase.
We are using Hadoop 3.0.1 on Isilon OneFS.
Error message:
Creating: Resource [source=null, target=/apps/hbase/data, type=directory, action=create, owner=hbase-xxxxx, group=null, mode=null, recursiveChown=false, recursiveChmod=false, changePermissionforParents=false, manageIfExists=true] in default filesystem
Exception occurred, Reason: Unexpected error: status: STATUS_MEDIA_WRITE_PROTECTED = 0xC00000A2 with path="/apps/hbase/data", username=hbase-xxxxx, groupname=
org.apache.hadoop.ipc.RemoteException(java.io.IOException): Unexpected error: status: STATUS_MEDIA_WRITE_PROTECTED = 0xC00000A2 with path="/apps/hbase/data", username=hbase-xxxxx, groupname=
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1497)
Labels:
- Apache Hadoop
- Apache HBase
03-16-2020
03:18 AM
Hi @stevenmatison, we have another cluster with a similar user where we are not facing this issue. Also, we have not set up any Ranger policies yet. Could you please advise?
03-12-2020
03:47 AM
Thanks for sharing the documents, @StevenOD.
03-11-2020
07:07 PM
Thanks @steve. Could you please suggest native tools in this scenario? Thanks & Regards
03-09-2020
08:15 AM
Hi @venkatsambath, thanks for the response. We will check the permissions for the Hive metastore. Log excerpt (lines truncated as captured):
2020-03-06T09:14:36,393 INFO [hiveServer2.async.multi_dest.batch_hiveServer2.async.multi_des HDFS audit. Event Size:1
2020-03-06T09:14:36,393 ERROR [hiveServer2.async.multi_dest.batch_hiveServer2.async.multi_deso consumer. provider=hiveServer2.async.multi_dest.batch, consumer=hiveServer2.async.multi_des
2020-03-06T09:14:36,393 INFO [hiveServer2.async.multi_dest.batch_hiveServer2.async.multi_des sleeping for 30000 milli seconds. indexQueue=0, queueName=hiveServer2.async.multi_dest.batch
2020-03-06T09:15:36,394 INFO [hiveServer2.async.multi_dest.batch_hiveServer2.async.multi_desg: name=hiveServer2.async.multi_dest.batch.hdfs, interval=01:00.015 minutes, events=1, deferr
2020-03-06T09:15:36,394 INFO [hiveServer2.async.multi_dest.batch_hiveServer2.async.multi_desg HDFS Filesystem Config: Configuration: core-default.xml, core-site.xml, mapred-default.xml,ite.xml
2020-03-06T09:15:36,412 INFO [hiveServer2.async.multi_dest.batch_hiveServer2.async.multi_des whether log file exists. hdfPath=hdfs://localhost:xxxx/hiveServer2/xxxxx/hiveServer2_rang
2020-03-06T09:15:36,413 ERROR [hiveServer2.async.multi_dest.batch_hiveServer2.async.multi_deso log file. java.net.ConnectException: Call From XXXXXX.XXXXX.XXX/10.21.16.60 to localhost:8020 ed; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.GeneratedConstructorAccessor61.newInstance(Unknown Source) ~[?:?]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAcc
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_181]
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831) ~[hadoop-common-
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:755) ~[hadoop-common-3.
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1501) ~[hadoop-common-3.1.
at org.apache.hadoop.ipc.Client.call(Client.java:1443) ~[hadoop-common-3.1.1.3.0.1.0-
at org.apache.hadoop.ipc.Client.call(Client.java:1353) ~[hadoop-common-3.1.1.3.0.1.0-
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
at com.sun.proxy.$Proxy32.getFileInfo(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(C1.0-187.jar:?]
at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHand
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocatio
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandl
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationH
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.ja
at com.sun.proxy.$Proxy33.getFileInfo(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1654) ~[hadoop-hdfs-cl
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:
at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.j
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1734) ~[hadoop-common-3.1.1
at org.apache.ranger.audit.destination.HDFSAuditDestination.getLogFileStream(HDFSAudi
at org.apache.ranger.audit.destination.HDFSAuditDestination.access$000(HDFSAuditDesti
at org.apache.ranger.audit.destination.HDFSAuditDestination$1.run(HDFSAuditDestinatio
at org.apache.ranger.audit.destination.HDFSAuditDestination$1.run(HDFSAuditDestinatio
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:173
at org.apache.ranger.audit.provider.MiscUtil.executePrivilegedAction(MiscUtil.java:52
at org.apache.ranger.audit.destination.HDFSAuditDestination.logJSON(HDFSAuditDestinat
at org.apache.ranger.audit.queue.AuditFileSpool.sendEvent(AuditFileSpool.java:879) ~[
at org.apache.ranger.audit.queue.AuditFileSpool.runLogAudit(AuditFileSpool.java:827)
at org.apache.ranger.audit.queue.AuditFileSpool.run(AuditFileSpool.java:757) ~[?:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:1.8.0_181]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[?:1.8.0_1
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531) ~[hadoop-common-3.1.1.3.
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:687) ~[hadoop-
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:790) ~[hadoop-c
at org.apache.hadoop.ipc.Client$Connection.access$3600(Client.java:410) ~[hadoop-comm
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1558) ~[hadoop-common-3.1.1
at org.apache.hadoop.ipc.Client.call(Client.java:1389) ~[hadoop-common-3.1.1.3.0.1.0-
03-09-2020
05:28 AM
Hi @StevenOD, thanks for the details. Just one query: we are building Hadoop on top of Isilon; will the following still hold true in that case? Thanks & Regards, Arvind.
03-09-2020
05:24 AM
@Gomathinayagam Thanks for your prompt response and clarification!
03-09-2020
01:37 AM
Hi,
Need help with the query below.
We have a scenario where, for any column defined with the string datatype in Hadoop, a NULL value is loaded as blank. However, for columns of any other datatype (INT, DOUBLE, etc.), NULL values are loaded as NULL.
Column name | Data Type
--- | ---
service1end | string
service1start | string
service2end | string
service2start | string
firstlinemaintcost | double
Is this the default behavior of Hadoop/Hive?
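For a plain text table using Hive's default LazySimpleSerDe, this is indeed the expected behavior: NULL is written to the file as the sentinel `\N`, an empty field is a perfectly valid empty string for a string column, while an empty field in a numeric column cannot be parsed and so comes back as NULL. A rough Python sketch of that logic (an illustration under those assumptions, not Hive source code):

```python
# Sketch of how Hive's default text SerDe interprets one delimited field,
# assuming default settings (serialization.null.format = '\N').
NULL_SENTINEL = r"\N"  # Hive's default null marker in text files

def read_field(raw, col_type):
    """Interpret one text field roughly the way Hive's default SerDe would."""
    if raw == NULL_SENTINEL:
        return None                # explicit NULL marker -> NULL for any type
    if col_type == "string":
        return raw                 # empty string stays '', it is NOT NULL
    try:
        return float(raw) if col_type == "double" else int(raw)
    except ValueError:
        return None                # unparseable (e.g. blank) numeric -> NULL

# A row where both fields were left empty by the upstream load,
# mirroring a string column and a double column from the table above:
row, types = ["", ""], ["string", "double"]
print([read_field(v, t) for v, t in zip(row, types)])  # -> ['', None]
```

If you want empty strings in string columns to read back as NULL instead, one common workaround is to set the table's null format to the empty string, e.g. `ALTER TABLE t SET TBLPROPERTIES ('serialization.null.format'='')`, or to emit `\N` from the upstream load.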
Labels:
- Apache Hive
03-05-2020
12:53 PM
Hi,
I'm trying to execute a Hive SELECT query on an external table through Beeline and am getting the error below:
Error while compiling statement: FAILED: NullPointerException null (state=42000,code=40000).
I'd appreciate any clue on this.
Thanks!
Labels:
- Apache Hive