Member since: 01-19-2017 · Posts: 3676 · Kudos Received: 632 · Solutions: 372
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 473 | 06-04-2025 11:36 PM |
| | 1002 | 03-23-2025 05:23 AM |
| | 531 | 03-17-2025 10:18 AM |
| | 1876 | 03-05-2025 01:34 PM |
| | 1240 | 03-03-2025 01:09 PM |
04-02-2016
07:15 AM
@Brian Weissler Can you try adding the 'ranger-hbase-plugin-enabled' property under:
"HBASE" -> "Advanced" -> "Custom ranger-hbase-plugin-properties"
and set the value to yes or no, depending on whether the HBase plugin
is enabled in Ambari.
Please restart the HBase service after making the change.
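A minimal sketch of that custom property, assuming it ends up in the ranger-hbase-plugin-properties config group:

```properties
# Custom ranger-hbase-plugin-properties (sketch; use "no" if the
# Ranger HBase plugin is not enabled in Ambari)
ranger-hbase-plugin-enabled=yes
```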
03-30-2016
07:50 PM
6 Kudos
@Randy Gelhausen Apache NiFi is a tool for building dataflow pipelines, which makes it just the right tool for Internet of Things (IoT), Internet of Everything (IoE), or any data-in-motion use case.
Using its built-in connectors (known as processors in the NiFi world), it can get/put data from/to HDFS, Hive, RDBMS, Kafka, etc. out of the box. It also has a really cool, user-friendly interface that lets you build a dataflow in minutes by dragging and dropping processors.
Sqoop, on the other hand, is designed for efficiently transferring bulk data between Apache Hadoop and structured datastores such as relational databases, so depending on the application you are building, you have a choice of tools at hand.
03-24-2016
12:30 PM
@darkz yu Apart from the problem you are encountering, could you do the following: 1. Complete all steps in Prepare the Environment. 2. Follow the attached Java setup and retry: install-the-oracle-jdk-1.pdf
03-21-2016
04:31 PM
1 Kudo
@David Tam
Here is the Hortonworks documentation I read; go to page 19: "Each account must have a user ID that is greater than or equal to 1000. In the /etc/hadoop/conf/taskcontroller.cfg file, the default setting for the banned.users property is mapred, hdfs, and bin, to prevent jobs from being submitted via those user accounts. The default setting for the min.user.id property is 1000, to prevent jobs from being submitted with a user ID less than 1000, which are conventionally Unix super users." (Hortonworks) Hope that explains it.
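The settings quoted above can be sketched as a taskcontroller.cfg fragment (the values shown are the documented defaults):

```properties
# /etc/hadoop/conf/taskcontroller.cfg
# accounts that may never submit jobs
banned.users=mapred,hdfs,bin
# reject submissions from user IDs below 1000 (conventionally system accounts)
min.user.id=1000
```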
03-21-2016
09:49 AM
@rpatil Just a reminder: you will surely get more useful help here for HDP, not CDH. If a specific detail is not answered, try a Cloudera forum; they may be more specific.
03-15-2016
09:11 AM
@Divya Gehlot This could save you handler
03-11-2016
08:21 AM
@Harshal Joshi For the Host-config-is-in-invalid-state error, please have a look at this post; it has great APIs for changing the state of a service component: Link
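As a sketch of what such a state-change call looks like, the snippet below only builds the PUT request Ambari's REST API expects for a host component (no network call is made; `component_state_request` and the host names are hypothetical, and Ambari requires the X-Requested-By header):

```python
import json

AMBARI_BASE = "http://ambari-host:8080/api/v1"  # placeholder Ambari server


def component_state_request(cluster, host, component, state):
    """Build the PUT request Ambari expects to change a host component's
    state (e.g. INSTALLED to stop it, STARTED to start it)."""
    url = "{}/clusters/{}/hosts/{}/host_components/{}".format(
        AMBARI_BASE, cluster, host, component)
    headers = {"X-Requested-By": "ambari"}  # required by Ambari's CSRF check
    body = json.dumps({"HostRoles": {"state": state}})
    return url, headers, body


url, headers, body = component_state_request(
    "mycluster", "node1.example.com", "DATANODE", "INSTALLED")
```

Sending this with an authenticated PUT (e.g. via curl or requests) is what moves the component out of the invalid state.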
03-11-2016
05:32 AM
1 Kudo
@Prakash Punj Definitely, all the steps I attached MUST be implemented, but if the Ambari server had already locked that port for use, then subsequent attempts wouldn't work until the process using the Ambari port is killed. Happy it worked out for you.
03-10-2016
06:49 PM
@Ram D To kill an application you can also use the application state API, by issuing a PUT operation that sets the application state to KILLED. For example: curl -v -X PUT -H 'Content-Type: application/json' -d '{"state": "KILLED"}' 'http://localhost:8088/ws/v1/cluster/apps/application_1409421698529_0012/state' Hope that helps
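The same call can be sketched in Python; the snippet below only constructs the request the ResourceManager's Cluster Application State endpoint expects (no network call is made; `kill_app_request` is a hypothetical helper):

```python
import json

RM_BASE = "http://localhost:8088/ws/v1/cluster"  # ResourceManager REST root


def kill_app_request(app_id):
    """Build the PUT request for YARN's Cluster Application State API,
    which moves the given application into the KILLED state."""
    url = "{}/apps/{}/state".format(RM_BASE, app_id)
    body = json.dumps({"state": "KILLED"})
    return url, body


url, body = kill_app_request("application_1409421698529_0012")
```

Sending `body` as JSON in a PUT to `url` (with Content-Type: application/json) is equivalent to the curl command above.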