Member since: 01-19-2017
Posts: 3679
Kudos Received: 632
Solutions: 372
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 785 | 06-04-2025 11:36 PM |
| | 1364 | 03-23-2025 05:23 AM |
| | 675 | 03-17-2025 10:18 AM |
| | 2456 | 03-05-2025 01:34 PM |
| | 1598 | 03-03-2025 01:09 PM |
10-31-2016
05:26 PM
@Gary Cameron Any updates on this Ranger problem?
10-31-2016
05:26 PM
@Gary Cameron You can restart your Ranger setup with the attached customized document. Make sure you are root when creating the ranger user (or whatever user you prefer) with a corresponding password of your choice.
Then log on as ranger, create the Ranger database, exit, and continue through the Ambari UI; you should be fine. Happy Hadooping!
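The user-and-database step above can be sketched as follows; the password and database name here are placeholders for illustration, not values from the original attachment:

```shell
# Run as root. 'R4ngerPass1' and 'rangerdb' are placeholder values -- substitute your own.
useradd ranger                      # OS user for the Ranger service
mysql -u root -p <<'SQL'
CREATE USER 'ranger'@'%' IDENTIFIED BY 'R4ngerPass1';
CREATE DATABASE rangerdb;
GRANT ALL PRIVILEGES ON rangerdb.* TO 'ranger'@'%';
FLUSH PRIVILEGES;
SQL
```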
10-31-2016
05:26 PM
The statement below is the correct one; notice the .* after rangerdb, which grants the privileges at the database level:
mysql> GRANT ALL PRIVILEGES ON rangerdb.* TO 'ranger'@'nn.example.com' IDENTIFIED BY 'rangerdba';
Without the .*, MySQL treats rangerdb as a table name, so GRANT ALL PRIVILEGES ON rangerdb TO ... does not give Ranger the database-wide privileges it needs.
Then retry.
10-31-2016
05:26 PM
@Gary Cameron What's the output of the below command?
# hostname -f
From the MySQL prompt:
mysql> SHOW GRANTS;
Then run these grant statements after connecting as root:
mysql> use rangerdb;
mysql> GRANT ALL PRIVILEGES ON *.* TO 'ranger'@'%';
mysql> FLUSH PRIVILEGES;
mysql> quit;
And retry; it looks like a privilege issue.
10-27-2016
07:23 PM
@Roger Young That now looks a bit tricky with Raspberry Pis in the picture.
1. Your sandbox should be configured to access the public repos, unless you have already downloaded Ambari 2.x, HDP 2.x, HDP-UTILS 1.x, etc.
2. Run the same OS version on the Raspberry Pi as on the sandbox.
3. Do the basic preparatory configuration for the HDP installation.
4. Most important is the network setup between the participating nodes; otherwise you won't succeed in your installation.
This may be the first time someone sets up HDP on a Raspberry Pi. Reference
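As a rough sketch of the preparatory configuration in step 3, each node usually needs something like the following; the host names and IP addresses here are made up for illustration:

```shell
# Hypothetical host names/IPs -- replace with your sandbox and Raspberry Pi addresses.
# Every node needs identical /etc/hosts entries so Ambari can resolve all peers.
cat >> /etc/hosts <<'EOF'
192.168.1.10  sandbox.example.com  sandbox
192.168.1.11  pi1.example.com      pi1
EOF
systemctl disable --now firewalld     # keep inter-node ports open during install
setenforce 0                          # permissive SELinux for the HDP setup
ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa   # passwordless SSH for Ambari agents
ssh-copy-id root@pi1.example.com
```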
10-27-2016
06:39 PM
@Roger Young The tutorial will definitely work, but you first need to successfully install your standalone node or HDF 2.0 cluster; you don't need to add nodes to your cluster if it's a single node. Question: are you deploying a single node or a cluster? Remember, the HDF 2.0 installation follows the same preparatory steps as HDP 2.x up to the Ambari database install, and after that you run the management pack process, which ensures that when you start Ambari you ONLY have the HDF 2.0 repository available!
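The management pack step mentioned above is run on the Ambari server host before starting Ambari; the tarball path and version below are assumptions, so point --mpack at the HDF management pack for your release:

```shell
# Path and version are placeholders -- download the HDF mpack matching your release first.
ambari-server install-mpack \
  --mpack=/tmp/hdf-ambari-mpack-2.0.0.0.tar.gz \
  --purge --verbose      # --purge removes the HDP stack definitions so only HDF shows up
ambari-server restart
```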
10-27-2016
05:50 PM
1 Kudo
@Roger Young HDF 2.0 cannot be installed on an existing HDP cluster! It is not supported on an Ambari instance that already manages a deployed HDP cluster. HDF 2.0 has its own Ambari and can be used to create a dedicated HDF cluster. See the link below.
10-26-2016
07:49 PM
@ANSARI FAHEEM AHMED
# su - hdfs
$ hdfs dfsadmin -report -dead
10-26-2016
07:34 PM
@ANSARI FAHEEM AHMED I think the commands below should sort you out if you are not in a Kerberized environment:
# su - hdfs
$ hdfs dfsadmin -report -dead
$ hdfs dfsadmin -report -live
The dead and live nodes will be listed.
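If you only want the node addresses out of that report, something like this works on the per-node stanzas that hdfs dfsadmin -report prints; the sample report text below is fabricated for illustration:

```shell
# Fabricated sample of the report's per-datanode stanzas.
report='Name: 10.0.0.1:50010 (dn1.example.com)
Decommission Status : Normal
Name: 10.0.0.2:50010 (dn2.example.com)
Decommission Status : Normal'

# Each datanode stanza starts with "Name:"; print the address field.
echo "$report" | awk '/^Name:/ {print $2}'
```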
10-25-2016
07:50 PM
@Sami Ahmad
Look at my code below. I did exactly what you wanted to do and it worked; just copy it and substitute the values to match your environment. This is how you launch it (note the agent name must match the TwitterAgent prefix in the config):
/usr/bin/flume-ng agent -c /etc/flume-ng/conf -f /etc/flume-ng/conf/flume.conf -n TwitterAgent
#######################################################
# This is a test configuration created on 31/07/2016
# by Geoffrey Shelton Okot
#######################################################
# Twitter Agent
########################################################
# Twitter agent for collecting Twitter data to HDFS.
########################################################
TwitterAgent.sources = Twitter
TwitterAgent.channels = MemChannel
TwitterAgent.sinks = HDFS
########################################################
# Describing and configuring the sources
########################################################
TwitterAgent.sources.Twitter.type = org.apache.flume.source.twitter.TwitterSource
TwitterAgent.sources.Twitter.channels = MemChannel
TwitterAgent.sources.Twitter.consumerKey = xxxxxxxx
TwitterAgent.sources.Twitter.consumerSecret = xxxxxxxx
TwitterAgent.sources.Twitter.accessToken = xxxxxxxx
TwitterAgent.sources.Twitter.accessTokenSecret = xxxxxxxx
TwitterAgent.sources.Twitter.keywords = hadoop,Data Scientist,BigData,Trump,computing,flume,Nifi
#######################################################
# Twitter configuring HDFS sink
########################################################
TwitterAgent.sinks.HDFS.hdfs.useLocalTimeStamp = true
TwitterAgent.sinks.HDFS.channel = MemChannel
TwitterAgent.sinks.HDFS.type = hdfs
TwitterAgent.sinks.HDFS.hdfs.path = hdfs://namenode.com:8020/user/flume
TwitterAgent.sinks.HDFS.hdfs.fileType = DataStream
TwitterAgent.sinks.HDFS.hdfs.writeFormat = Text
TwitterAgent.sinks.HDFS.hdfs.batchSize = 1000
TwitterAgent.sinks.HDFS.hdfs.rollSize = 0
TwitterAgent.sinks.HDFS.hdfs.rollCount = 10000
#######################################################
# Twitter Channel
########################################################
TwitterAgent.channels.MemChannel.type = memory
TwitterAgent.channels.MemChannel.capacity = 20000
#TwitterAgent.channels.MemChannel.DataDirs =
TwitterAgent.channels.MemChannel.transactionCapacity = 1000
#######################################################
# Binding the Source and the Sink to the Channel
########################################################
TwitterAgent.sources.Twitter.channels = MemChannel
TwitterAgent.sinks.HDFS.channel = MemChannel
########################################################