Member since 02-29-2016

108 Posts
213 Kudos Received
14 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2664 | 08-18-2017 02:09 PM |
| | 4318 | 06-16-2017 08:04 PM |
| | 4493 | 01-20-2017 03:36 AM |
| | 11346 | 01-04-2017 03:06 AM |
| | 5956 | 12-09-2016 08:27 PM |
			
    
	
		
		
07-22-2016 08:54 PM (6 Kudos)

The best way to save a DataFrame to a CSV file is to use the spark-csv library provided by Databricks. It supports almost every feature you will encounter when working with CSV files.

spark-shell --packages com.databricks:spark-csv_2.10:1.4.0

Then use the library API to save to CSV files:

df.write.format("com.databricks.spark.csv").option("header", "true").save("file.csv")

It also supports reading from CSV files with a similar API:

val df = sqlContext.read.format("com.databricks.spark.csv").option("header", "true").option("inferSchema", "true").load("file.csv")

You could also write some custom code to build the output string with mkString, but that is not safe if the data contains special characters, and it will not handle quoting, etc.:

df.map(x => x.mkString("|")).saveAsTextFile("file.csv")
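The same --packages flag also works outside the shell; for example, if you later package this logic into a job, you could pass the library to spark-submit the same way (the script name below is only a placeholder, not something from the original post):

# submit a job with the spark-csv package on the classpath (export_to_csv.py is hypothetical)
spark-submit --packages com.databricks:spark-csv_2.10:1.4.0 export_to_csv.py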
						
					
07-12-2016 09:25 PM

Under /etc/hive/conf/conf.server, the following Ranger files are there, but not ranger-audit.xml:

-rwxr--r-- 1 hive hadoop 1208 2016-06-25 16:25 /etc/hive/conf/conf.server/ranger-hive-audit.xml
-rwxr--r-- 1 hive hadoop 1072 2016-06-25 16:25 /etc/hive/conf/conf.server/ranger-hive-security.xml
-rwxr--r-- 1 hive hadoop 1030 2016-06-25 16:25 /etc/hive/conf/conf.server/ranger-policymgr-ssl.xml
-rw-r--r-- 1 hive hadoop   64 2016-06-26 21:39 /etc/hive/conf/conf.server/ranger-security.xml
						
					
07-07-2016 02:12 AM

I tried the suggested fix from the old thread and it is not working; I still get the same error with sqoop. I am moving this to the Ranger section and hoping a Ranger guru will jump in. Thanks for your help.
						
					
07-06-2016 10:01 PM

I tried running the sqoop command as different users, including root, ambari-qa, sqoop, and hive, and none of them worked. The file /etc/hive/2.5.0.0-817/0/xasecure-audit.xml does not exist, and a search of all folders shows it does not exist anywhere on the sandbox VM:

cd /
find . -name xasecure-audit.xml
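Since the FileNotFoundException implies that some configuration still references the missing file, a possible follow-up check (not something from the original post, just a hypothetical next step) would be to search the Hive configuration tree for anything that still mentions it:

# list any config file under /etc/hive that still references xasecure-audit.xml
grep -rl "xasecure-audit" /etc/hive 2>/dev/null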
						
					
07-06-2016 09:11 PM

Ravi, I want to take a look at the Atlas/Ranger integration in HDP 2.5, so disabling Ranger is not really an option.
						
					
07-06-2016 04:05 PM (2 Kudos)

While trying out the Sandbox 2.5 TP for some new features in Atlas, I encountered an error when running a sqoop command to move data from MySQL to Hive (this should trigger a lineage being generated inside Atlas, which is my goal). I created a simple table inside MySQL and then sqooped the data under the ambari-qa credential:

mysql --host=sandbox.hortonworks.com --user=root --database=hive
use test;
create table source_fact (id int, name varchar(100), description varchar(100));
exit;
sqoop import --connect jdbc:mysql://sandbox.hortonworks.com:3306/test --table source_fact --hive-import --hive-overwrite --hive-table atlasdemo.source_fact --username root -m 1

Then I got the following error:

FAILED: RuntimeException java.io.FileNotFoundException: /etc/hive/2.5.0.0-817/0/xasecure-audit.xml (No such file or directory)

This only happens when running sqoop; Hive, Atlas, and Ranger all look fine and function as expected.

Thanks,

Note: I moved this to Security/Ranger since it looks like it is related to Ranger configuration, based on some old threads. According to those threads, xasecure-audit.xml should not be part of the new version anyway. Looking for a Ranger guru to provide some feedback.
						
					
05-18-2016 09:29 PM (4 Kudos)

When using beeline, you need to put the connection string in double quotes; otherwise the shell treats the semicolons inside the JDBC URL as command separators. The following command should work for you:

beeline -u "jdbc:hive2://zk01:2181,zk02:2181,zk03:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2" -n hive -p hadoop

Please let me know if that solves your problem.

-Qi
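If you end up scripting this, a small sketch (using the same hosts and credentials as above) is to keep the URL in a single-quoted shell variable so the semicolons are never exposed to the shell, and expand it inside double quotes:

# store the JDBC URL once; single quotes keep the semicolons literal
HS2_URL='jdbc:hive2://zk01:2181,zk02:2181,zk03:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2'
beeline -u "$HS2_URL" -n hive -p hadoop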
						
					
05-05-2016 05:12 PM (2 Kudos)

@Michel Brown  When installing a test cluster, I normally set up a MySQL instance to use for Ambari, the Hive Metastore, Oozie, and Ranger. Below are the scripts I use for the process; most of them are copied from the documentation, where you can find more details. All scripts were tested on CentOS 6.5.

Install MySQL before the Ambari installation:

wget http://repo.mysql.com/mysql-community-release-el7-5.noarch.rpm
rpm -ivh mysql-community-release-el7-5.noarch.rpm
yum -y install mysql mysql-server mysql-libs mysql-connector-java*

After the Ambari installation, set up the users in MySQL for Ambari, Hive, Oozie, and Ranger:

mysql -u root -p
CREATE USER 'ambari'@'localhost' IDENTIFIED BY 'password';
GRANT ALL PRIVILEGES ON *.* TO 'ambari'@'localhost';
CREATE USER 'ambari'@'%' IDENTIFIED BY 'password';
GRANT ALL PRIVILEGES ON *.* TO 'ambari'@'%';
FLUSH PRIVILEGES;
CREATE DATABASE ambari;
CREATE USER 'hive'@'localhost' IDENTIFIED BY 'password';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'localhost';
CREATE USER 'hive'@'%' IDENTIFIED BY 'password';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%';
FLUSH PRIVILEGES;
CREATE DATABASE hive;
CREATE USER 'oozie'@'%' IDENTIFIED BY 'password';
GRANT ALL PRIVILEGES ON *.* TO 'oozie'@'%';
FLUSH PRIVILEGES;
CREATE DATABASE oozie;
CREATE USER 'rangerdba'@'localhost' IDENTIFIED BY 'rangerdba';
GRANT ALL PRIVILEGES ON *.* TO 'rangerdba'@'localhost';
CREATE USER 'rangerdba'@'%' IDENTIFIED BY 'rangerdba';
GRANT ALL PRIVILEGES ON *.* TO 'rangerdba'@'%';
GRANT ALL PRIVILEGES ON *.* TO 'rangerdba'@'localhost' WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON *.* TO 'rangerdba'@'%' WITH GRANT OPTION;
FLUSH PRIVILEGES;
exit;
mysql -u root -p ambari < /var/lib/ambari-server/resources/Ambari-DDL-MySQL-CREATE.sql

During Ambari setup (after installing the Ambari server), choose not to use the default database, point to the MySQL instance, and set up the Java JDBC connector (a quick check of the connector path is sketched after this post):

ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar

During the Ambari install wizard for HDP, choose the existing MySQL option for Hive and Oozie and point it to the MySQL instance you set up.
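As a small sanity check for the JDBC connector step above (my own habit, not part of the original post), you can confirm that the connector jar installed by the yum step is actually at the path passed to --jdbc-driver before running ambari-server setup:

# verify the MySQL JDBC driver is where Ambari expects it
ls -l /usr/share/java/mysql-connector-java.jar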
						
					
05-05-2016 03:35 PM (2 Kudos)

The following documentation link should be your reference: http://docs.hortonworks.com/HDPDocuments/Ambari-2.2.1.1/bk_Ambari_Users_Guide/content/_how_to_delete_a_component.html
						
					
05-05-2016 02:04 PM (2 Kudos)

@Raghu Ramamoorthi  I did an installation on CentOS 7 very recently with Ambari 2.2.1 and it finished without a problem. Did you install using the Ambari wizard? Did the problem happen during the last step of the wizard?
						
					