Member since: 09-24-2015
Posts: 816
Kudos Received: 488
Solutions: 189

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2659 | 12-25-2018 10:42 PM |
| | 12166 | 10-09-2018 03:52 AM |
| | 4194 | 02-23-2018 11:46 PM |
| | 1883 | 09-02-2017 01:49 AM |
| | 2200 | 06-21-2017 12:06 AM |
03-03-2017
11:18 AM
1 Kudo
A client is a set of binaries and libraries used to run commands and develop software for a particular Hadoop service. So, if you install the Hive client you can run beeline, if you install the HBase client you can run the HBase shell, and so on. Typically you install all clients on so-called Edge (or Gateway) nodes, which end users use to access cluster services. Some clients are also installed in the background (by Ambari) on master nodes of related services, for example HDFS clients on NameNodes. Clients are passive components: unlike DataNodes, NodeManagers, etc., they don't run unless someone starts the related binaries.
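For example, once the Hive, HBase, and HDFS clients are installed on an edge node, an end user can reach the services with the usual command-line tools. The host name and user below are placeholders, just to sketch the idea:
# Hive client: open a beeline session against HiveServer2
beeline -u "jdbc:hive2://hs2-host.example.com:10000/default" -n myuser
# HBase client: start the interactive shell (it reads the hbase-site.xml shipped with the client)
hbase shell
# HDFS client: ordinary filesystem commands work from the same node
hdfs dfs -ls /user/myuser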
03-03-2017
11:05 AM
1 Kudo
Yes, if you have a blueprint file and a cluster creation template applicable to your cluster, then you only need a few REST API calls and the install will be silent. Check the details here.
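As a rough sketch, the two calls look like this; the host name, credentials, and the blueprint/cluster names are placeholders, and the JSON files are the ones you already have:
# Register the blueprint under a name of your choice
curl -u admin:admin -H "X-Requested-By: ambari" -X POST -d @blueprint.json http://ambari-host:8080/api/v1/blueprints/my_blueprint
# Create the cluster from the cluster creation template; Ambari then runs the whole install with no further input
curl -u admin:admin -H "X-Requested-By: ambari" -X POST -d @cluster_template.json http://ambari-host:8080/api/v1/clusters/my_cluster
The second call returns a request resource you can poll (or just watch in the Ambari UI) to follow the installation progress.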
03-03-2017
01:50 AM
1 Kudo
I've just tried the scenario you described on HDP-2.5.3 (Ranger 0.6) and it works: my user1 has only the "Select" privilege but could grant all privileges to user2, and he can even grant "all" to himself. I think the idea behind "Delegate Admin" is that you can make a certain user a Ranger admin on the given resources, though I've never seen this well documented. The User Guide for Ranger 0.5 actually says that "The delegated admin can update, delete the policies. It can also create child policies based on the original policy (base policy)." So, if you want to avoid unexpected surprises, you can disable "Delegate Admin" in all policies and control everything through the central admin.
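If you want to audit which policies currently have the flag enabled, Ranger's public REST API exposes it as the delegateAdmin field of each policy item. I haven't re-verified the exact path against Ranger 0.6, so treat the endpoint, host, and credentials below as assumptions:
# List policies and count how many policy items have Delegate Admin switched on
curl -u admin:admin "http://ranger-host:6080/service/public/v2/api/policy" | grep -o '"delegateAdmin":true' | wc -l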
03-01-2017
09:57 PM
Great, but next time please include all relevant details in your question; we cannot guess that such a basic mistake is the cause.
03-01-2017
09:43 PM
Try installing the Kerberos client for Windows; it will show you your initial ticket principal and which principals Windows is trying to negotiate with. You might have to provide a custom krb5.ini file, especially if your cluster realm differs from your default AD realm. If nothing else works, you can install Knox and connect to HS2 through Knox; HS2 has to run in http transport mode for that.
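Once MIT Kerberos for Windows is installed, a quick check from a command prompt looks roughly like this; the realm, host names, and the HS2 service principal are placeholders, so adjust them to your AD and cluster setup:
# Show the current ticket cache and the client principal being used
klist
# Get a ticket in the cluster realm explicitly if it differs from your default AD realm
kinit myuser@CLUSTER.EXAMPLE.COM
# From a machine with the Hive client (or with the same URL in your JDBC tool), name the HiveServer2 service principal in the JDBC URL
beeline -u "jdbc:hive2://hs2-host.example.com:10000/default;principal=hive/hs2-host.example.com@CLUSTER.EXAMPLE.COM"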
03-01-2017
08:53 AM
1 Kudo
Check whether you have given that user the UDF permission on all databases, either directly or through his group. I've just discovered that in HDP-2.5.3, if I give the UDF permission to u1 on all databases via his group, then u1 can list all databases and can even do "use db1" even though he has no table permission on db1, but "show tables" returns an empty list. When I remove his group from the UDF policy, it works as expected.
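A quick way to reproduce what I saw, assuming a test user u1 and a database db1 (both just examples), is to run these through beeline as u1:
# Lists all databases even though u1 has no table permission on them
beeline -u "jdbc:hive2://hs2-host:10000/default" -n u1 -e "show databases;"
# "use db1" succeeds, but the table list comes back empty
beeline -u "jdbc:hive2://hs2-host:10000/default" -n u1 -e "use db1; show tables;"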
02-28-2017
11:09 PM
You typically use the Oozie shell action as a connector or data provider between two other Oozie actions in your workflow. You can include a "capture-output" element in your shell action, enabling the next action to read the output of the shell command/script. If all you want to run in Oozie is your Python script, as seems to be the case here, then it's better to use cron and schedule your script to run on a particular node in the cluster (see the sketch below). Alternatively, you can "port" your Python script to Oozie by creating a Sqoop action followed by FS actions to run the HDFS commands. Oozie offers many actions you can choose from to develop your apps on Hadoop. See the details here, in particular the "Workflow functional specs" and the action extensions.
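If cron is enough for your case, a crontab entry on the chosen node is all it takes; the schedule, interpreter, and paths below are just an example:
# Run the script every day at 02:00 and append its output to a log file
0 2 * * * /usr/bin/python /home/myuser/my_script.py >> /var/log/my_script.log 2>&1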
02-26-2017
12:33 AM
1 Kudo
Here is the link to a short tutorial with Waterline using an HDP Sandbox, created jointly by Hortonworks and Waterline: Manage your Data Lake more efficiently with Waterline Data Inventory and HDP
02-25-2017
11:54 PM
2 Kudos
I saw this happening on a relatively idle cluster. You can create a checkpoint manually; I think the instructions are given in the dialog showing the warning, but here they are. Log in to the active NameNode and run:
su - hdfs
hdfs dfsadmin -safemode enter
hdfs dfsadmin -saveNamespace
hdfs dfsadmin -safemode leave
02-24-2017
05:05 AM
+1 for a nice article! I had to add "library(ggplot2)" in steps 4 and 6, which provides the ggplot function.