Member since: 09-18-2015
Posts: 3274
Kudos Received: 1159
Solutions: 426
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2613 | 11-01-2016 05:43 PM |
| | 8737 | 11-01-2016 05:36 PM |
| | 4924 | 07-01-2016 03:20 PM |
| | 8258 | 05-25-2016 11:36 AM |
| | 4428 | 05-24-2016 05:27 PM |
12-25-2015
12:24 PM
1 Kudo
@Gokul Devaraj You can create the hdfs user in Ambari, and then you can create directories on /
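A minimal command-line sketch of the same idea, assuming the hdfs superuser account is available; the directory and user names below are illustrative, not from the post:

```shell
# Sketch only: /data and user "gokul" are example names.
sudo -u hdfs hdfs dfs -mkdir /data                 # create a directory on /
sudo -u hdfs hdfs dfs -chown gokul:hdfs /data      # hand it to a regular user
sudo -u hdfs hdfs dfs -ls /                        # confirm it exists
```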
12-24-2015
05:11 PM
@Suresh Raju Please paste a screenshot or the relevant entries from /var/log/ambari-server/ambari-server.log
12-24-2015
03:41 PM
4 Kudos
Part 2

Linkedin Post

Extending Blog 2 to look for Star Wars tweets, searching for the keywords Yoda, Love & Hate.

Let's see the tweets/data containing the word YODA among the Star Wars tweets.

Keyword: LOVE in STARWARS.

Keyword: HATE.

Happy Hadooping!!!
12-24-2015
03:34 PM
7 Kudos
Part 1 Linkedin Post

Part 1 - In case you missed it, see the Introduction to Apache NiFi.

Assumption - HDP and NiFi installation is in place. You can use the HDP Sandbox if you don't have a cluster. For NiFi installation, you can follow Blog 1.

The end goal of this tutorial is to display tweets related to particular search terms. For example: my Twitter id is allaboutbdata, and the following screenshot shows a tweet sent on Twitter and the same tweet in HDFS/Hive and Solr. The whole setup was done using NiFi.

Demo:

Install HDP Search:

```
yum install -y lucidworks-hdpsearch
```

Create the user directory in HDFS and change permissions:

```
sudo -u hdfs hadoop fs -mkdir /user/solr
sudo -u hdfs hadoop fs -chown solr /user/solr
chown -R solr:solr /opt/lucidworks-hdpsearch/solr
```

Set up Solr:

```
su solr
cd /opt/lucidworks-hdpsearch/solr/server/solr-webapp/webapp/banana/app/dashboards/
mv default.json default.json.orig
wget https://raw.githubusercontent.com/abajwa-hw/ambari-nifi-service/master/demofiles/default.json
```

Important: you must change the hostname at line number 740 if you are not using the HDP Sandbox.

Add `<str>EEE MMM d HH:mm:ss Z yyyy</str>` for the tweets timestamp:

```
vi /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs/conf/solrconfig.xml
```

Start Solr in cloud mode and create a collection called tweets:

```
export JAVA_HOME=/usr/jdk64/jdk1.8.0_60/jre/
/opt/lucidworks-hdpsearch/solr/bin/solr start -c -z localhost:2181
/opt/lucidworks-hdpsearch/solr/bin/solr create -c tweets -d data_driven_schema_configs -s 1 -rf 1
```

Download the Twitter NiFi template from here. Import the template by clicking the 3rd icon from the left as shown below. Browse and import the xml file that you downloaded. Click X on the extreme right-hand side at the top to close the popup.

Now, let's load the template. Click the 7th icon from the left side and drag it onto the canvas.

Now, let's configure the Twitter template. Set up a Twitter developer account to create an app. Once done, you need the following information for the GetTwitter processor. Start the flow.

Happy Hadooping!!
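If you want to confirm the collection was actually created before wiring up NiFi, the Solr Collections API can list what exists. A sketch, assuming Solr is running on the default port 8983 on localhost:

```shell
# "tweets" should appear in the returned collection list.
curl "http://localhost:8983/solr/admin/collections?action=LIST&wt=json"
```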
12-24-2015
03:31 PM
7 Kudos
Linkedin Post

Apache NiFi is an easy to use, powerful, and reliable system to process and distribute data. Apache NiFi is based on technology previously called "Niagara Files" that was in development and used at scale within the NSA for the last 8 years and was made available to the Apache Software Foundation through the NSA Technology Transfer Program. Some of the use cases include, but are not limited to:

- Big Data Ingest – Offers a simple, reliable and secure way to collect data streams.
- IoAT Optimization – Allows organizations to overcome real world constraints such as limited or expensive bandwidth while ensuring data quality and reliability.
- Compliance – Enables organizations to understand everything that happens to data in motion from its creation to its final resting place, which is particularly important for regulated industries that must retain and report on chain of custody.
- Digital Security – Helps organizations collect large volumes of data from many sources and prioritize which data is brought back for analysis first, a critical capability given the time sensitivity of identifying security breaches. (Source)

Demo

Installation: Download, untar or unzip the package and modify conf/nifi.properties. I added the nifi host and changed the port from 8080 to 9080. Alternatively, deploy the NiFi Ambari service by using this. NiFi UI: http://nifihost:9080/nifi/

We are going to work on 3 use cases. Part 1 focuses on a very basic use case.

1) Copy files from the local filesystem into HDFS.

Processor - Remember this word, because we will be playing with tons of processors while working on the use cases. You will drag a Processor onto the canvas, filter by "getfile" and click Add, and then search "hdfs" for the put. Now we have GetFile and PutHDFS on the canvas. Right click on a processor to see all the options.

In this case, I am copying data from /landing into HDFS /sourcedata. Right click on the GetFile processor and it will give you the configuration option. Set the input directory to /landing; in my case, I am keeping Keep Source File false.

Now, let's configure PutHDFS. Add the complete locations of core-site.xml and hdfs-site.xml as shown below. You can label the processor as you like by clicking Settings, and also enable failure and success.

Now, let's set up the relationship between Get and Put. Drag the arrow with the + sign to PutHDFS. The following screenshot is from my demo environment.

Happy Hadooping!!!
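A quick way to check this flow end to end from a terminal, using the /landing and /sourcedata paths from the post (the sleep duration is an arbitrary guess at how long GetFile takes to poll, not a documented interval):

```shell
echo "hello nifi" > /landing/test.txt   # drop a file where GetFile is watching
sleep 10                                # give NiFi time to pick it up
hdfs dfs -ls /sourcedata                # the file should now appear in HDFS
hdfs dfs -cat /sourcedata/test.txt
```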
12-24-2015
01:11 PM
@Suresh Raju I was able to reproduce this in my sandbox. You can click Dashboard, refresh, or load the Hive view again to clear it.
12-24-2015
12:57 PM
1 Kudo
@Akshay Shingote

```
Caused by: org.apache.falcon.FalconException: E0501 : E0501: Could not perform authorization operation, Unauthorized connection for super-user: oozie
```

Check the oozie proxyuser settings. If more lax security is preferred, the wildcard value * may be used to allow impersonation from any host or of any user. For example, by specifying as below in core-site.xml, the user named oozie accessing from any host can impersonate any user belonging to any group.

```
<property>
  <name>hadoop.proxyuser.oozie.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.oozie.groups</name>
  <value>*</value>
</property>
```
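After updating core-site.xml and restarting HDFS, impersonation can be sanity-checked through WebHDFS with the doas parameter. A sketch; the namenode host and the impersonated user below are placeholders, not from the thread:

```shell
# Should return a LISTSTATUS response rather than an authorization error
# if the oozie proxyuser settings took effect.
curl "http://namenode-host:50070/webhdfs/v1/user?op=LISTSTATUS&user.name=oozie&doas=ambari-qa"
```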
12-24-2015
12:50 PM
@Sergey Orlov

```
FAILED: HiveException java.security.AccessControlException: Permission denied: user=ms, access=WRITE, inode="":ms:ms:drwxr-xr-x
```

User ms does not have permission to write. Check for the home directory:

```
hdfs dfs -ls /user/
```

If it's not there:

```
hdfs dfs -mkdir -p /user/ms
hdfs dfs -chown -R ms:hdfs /user/ms
```
12-24-2015
12:47 PM
3 Kudos
@Suresh Raju On the Ambari Dashboard, click YARN --> Quick Links --> RM UI. Note the running jobs and find the application id related to your job. Then log in to one of the nodes as the hive or hdfs user and run:

```
yarn application -kill <application id>
```
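The application id can also be pulled out of `yarn application -list` output with awk instead of reading the RM UI. The sample line below is illustrative; on a real cluster you would pipe the actual command output instead:

```shell
# Sample line in the shape `yarn application -list` prints (illustrative).
sample='application_1450000000000_0001  select count(*)  MAPREDUCE  hive  default  RUNNING'
app_id=$(echo "$sample" | awk '{print $1}')   # first column is the application id
echo "$app_id"
# On a real cluster: yarn application -kill "$app_id"
```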
12-24-2015
12:45 PM
OpenStack cloud: http://sequenceiq.com/cloudbreak-docs/release-1.1.0/openstack-image/