Member since: 05-30-2018
Posts: 1322
Kudos Received: 715
Solutions: 148

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4067 | 08-20-2018 08:26 PM |
| | 1962 | 08-15-2018 01:59 PM |
| | 2390 | 08-13-2018 02:20 PM |
| | 4138 | 07-23-2018 04:37 PM |
| | 5045 | 07-19-2018 12:52 PM |
11-29-2016
05:16 AM
@satya s I am not sure I completely follow your question, but here is a try. I think you are trying to load an ORC table with a non-ORC file. You should either convert the text file to ORC, or create an external Hive table using the format of the text file and then load it into the ORC table with an INSERT ... SELECT statement.
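A minimal HiveQL sketch of that second option — all table names, columns, the delimiter, and the HDFS path below are hypothetical placeholders, not from the thread:

```sql
-- 1) External table over the existing text files (match your file's delimiter).
CREATE EXTERNAL TABLE staging_txt (
  id   INT,
  name STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/user/hive/staging/mydata';

-- 2) Target table stored as ORC.
CREATE TABLE mydata_orc (
  id   INT,
  name STRING
)
STORED AS ORC;

-- 3) Hive rewrites the rows into ORC files during the insert.
INSERT INTO TABLE mydata_orc SELECT id, name FROM staging_txt;
```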
11-28-2016
04:22 PM
2 Kudos
You can remove the log files manually, but I would recommend a much easier way to have this automated. Most services in Hadoop use log4j. Simply enable a RollingFileAppender and set MaxBackupIndex to the maximum number of log files you want to retain for that service. https://community.hortonworks.com/articles/48937/how-do-i-control-log-file-retention-for-common-hdp.html
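As a concrete illustration, a RollingFileAppender in a service's log4j.properties capped at 10 backups of 256 MB each could look like this — the appender name `RFA`, the log path, and the sizes are example values to adapt, not settings from the linked article:

```properties
log4j.appender.RFA=org.apache.log4j.RollingFileAppender
log4j.appender.RFA.File=/var/log/myservice/myservice.log
log4j.appender.RFA.MaxFileSize=256MB
log4j.appender.RFA.MaxBackupIndex=10
log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
```

With MaxBackupIndex=10, log4j keeps the active log plus at most 10 rolled files and deletes the oldest beyond that.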
11-23-2016
04:31 PM
You need to do a kinit first, and then, for example with WebHDFS: curl --negotiate -u : http://xxxx.xxxx.xxxx.com:50070/webhdfs/v1/user?op=LISTSTATUS
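Spelled out as a script — the NameNode host is the redacted one from this post, and the Kerberos principal is a placeholder; the actual kinit/curl calls are shown commented because they need a live, Kerberized cluster:

```shell
# NN_HOST is the (redacted) NameNode host from the thread; substitute your own.
NN_HOST="xxxx.xxxx.xxxx.com"
URL="http://${NN_HOST}:50070/webhdfs/v1/user?op=LISTSTATUS"
echo "WebHDFS request: ${URL}"

# On a live cluster, obtain a Kerberos ticket first, then call WebHDFS:
#   kinit your_user@EXAMPLE.COM
#   curl --negotiate -u : "$URL"
# '--negotiate' enables SPNEGO; '-u :' sends empty credentials so the
# Kerberos token from your ticket cache is used instead of a password.
```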
11-22-2016
03:58 AM
Go to the Ranger UI. The link is found by going to Ambari, then to Ranger, and clicking on Quick Links; that will give you the Ranger UI. Go to Hive and check the user policies to determine whether any policy gives you access to UDFs. Another quick way to check whether you have Ranger enabled is to go to the Ranger config via Ambari and see whether the Hive plugin is enabled.
11-22-2016
03:36 AM
@jeannie tan do you have Ranger enabled? If so, verify that your user ID has access to create and execute UDFs.
11-21-2016
08:57 PM
What is a good recommendation on storage size for Ranger audit on Oracle? I want to store all HDFS, HBase, Hive, and Kafka audit events created by Ranger. I don't see anything on recommended DB storage size. We are planning on using Solr in the near future, but right now we need DB sizing info. Any insights?
Labels:
- Apache Ranger
11-21-2016
04:50 AM
One item I noticed in your comments is your /etc/hosts file. Your /etc/hosts file should have entries for all nodes in the cluster.
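For example — the IP addresses and hostnames below are made up, not from the thread — every node's /etc/hosts would carry one line per cluster node:

```
# /etc/hosts on every node (example values)
192.168.1.10   master1.example.com   master1
192.168.1.11   worker1.example.com   worker1
192.168.1.12   worker2.example.com   worker2
```

This way each node can resolve every other node's fully qualified hostname even without DNS.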
11-18-2016
04:07 PM
awesome @Binu Mathew
11-17-2016
05:46 PM
I am trying to enable the HBase client service controller on NiFi 1.0. The HBase cluster is non-Kerberized. I have provided the full paths to core-site.xml and hbase-site.xml.
...9c-4fe4-1b84-ffff-ffff852f4d04] Simple Authentication
2016-11-17 17:15:36,522 ERROR [StandardProcessScheduler Thread-7] o.a.n.c.s.StandardControllerServiceNode HBase_1_1_2_ClientService[id=9f99699c-4fe4-1b84-ffff-ffff852f4d04] Failed to invoke @OnEnabled method due to java.io.IOException: java.lang.reflect.InvocationTargetException
2016-11-17 17:15:36,527 ERROR [StandardProcessScheduler Thread-7] o.a.n.c.s.StandardControllerServiceNode
java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240) ~[hbase-client-1.1.2.jar:1.1.2]
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218) ~[hbase-client-1.1.2.jar:1.1.2]
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119) ~[hbase-client-1.1.2.jar:1.1.2]
at org.apache.nifi.hbase.HBase_1_1_2_ClientService.createConnection(HBase_1_1_2_ClientService.java:238) ~[nifi-hbase_1_1_2-client-service-1.0.0.2.0.0.0-579.jar:1.0.0.2.0.0.0-579]
at org.apache.nifi.hbase.HBase_1_1_2_ClientService.onEnabled(HBase_1_1_2_ClientService.java:178) ~[nifi-hbase_1_1_2-client-service-1.0.0.2.0.0.0-579.jar:1.0.0.2.0.0.0-579]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_77]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_77]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_77]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_77]
at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:137) ~[na:na]
at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:125) ~[na:na]
at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:70) ~[na:na]
at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:47) ~[na:na]
at org.apache.nifi.controller.service.StandardControllerServiceNode$2.run(StandardControllerServiceNode.java:348) ~[na:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_77]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_77]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_77]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [na:1.8.0_77]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_77]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_77]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_77]
Caused by: java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_77]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_77]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_77]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_77]
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238) ~[hbase-client-1.1.2.jar:1.1.2]
... 20 common frames omitted
Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36) ~[hbase-common-1.1.2.jar:1.1.2]
at org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:58) ~[hbase-client-1.1.2.jar:1.1.2]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2242) ~[hbase-client-1.1.2.jar:1.1.2]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:690) ~[hbase-client-1.1.2.jar:1.1.2]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630) ~[hbase-client-1.1.2.jar:1.1.2]
... 25 common frames omitted
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[na:1.8.0_77]
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[na:1.8.0_77]
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[na:1.8.0_77]
at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[na:1.8.0_77]
at java.lang.Class.forName0(Native Method) ~[na:1.8.0_77]
at java.lang.Class.forName(Class.java:264) ~[na:1.8.0_77]
at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32) ~[hbase-common-1.1.2.jar:1.1.2]
... 29 common frames omitted
I have Phoenix and Phoenix Query Server enabled on the cluster. Any insights or feedback appreciated.
Labels:
- Apache HBase
- Apache NiFi
11-17-2016
06:24 AM
Yes, on my GitHub repo: https://github.com/sunileman/Atlas-API-Examples And the Atlas technical guide has examples of the API and Kafka. I recommend using these integration patterns instead of going against HBase directly. Using Kafka, you can consume metadata changes as they occur. The REST API is good for consuming metadata as well, but obviously it does not act as a messaging service.
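A minimal sketch of the REST side — the host, port, and admin/admin credentials are placeholders, and the endpoint shown assumes the Atlas v2 REST API is available; it needs a live Atlas server to run:

```shell
# Basic search for hive_table entities via the Atlas v2 REST API.
# 'atlas-host' and the admin/admin credentials are placeholders.
curl -u admin:admin \
  "http://atlas-host:21000/api/atlas/v2/search/basic?typeName=hive_table"
```

For change notifications as they happen, the Kafka route (consuming Atlas's notification topic) is the pattern to use, as described in the post above.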