Member since 10-01-2015

Posts: 3933
Kudos Received: 1150
Solutions: 374

My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 3475 | 05-03-2017 05:13 PM |
|  | 2857 | 05-02-2017 08:38 AM |
|  | 3123 | 05-02-2017 08:13 AM |
|  | 3083 | 04-10-2017 10:51 PM |
|  | 1572 | 03-28-2017 02:27 AM |

03-08-2017 06:57 PM

@Saikiran Parepally your error is:

Traceback (most recent call last):
  File "/usr/sbin/ambari-server.py", line 33, in <module>
    from ambari_server.dbConfiguration import DATABASE_NAMES, LINUX_DBMS_KEYS_LIST
  File "/usr/lib/python2.6/site-packages/ambari_server/dbConfiguration.py", line 28, in <module>
    from ambari_server.serverConfiguration import decrypt_password_for_alias, get_ambari_properties, get_is_secure, \
  File "/usr/lib/python2.6/site-packages/ambari_server/serverConfiguration.py", line 36, in <module>
    from ambari_commons.os_utils import run_os_command, search_file, set_file_permissions, parse_log4j_file
ImportError: cannot import name parse_log4j_file

Can you check whether the /usr/lib/python2.6/site-packages/ambari_commons directory contains a file called os_utils.py, and whether that file defines the function

def parse_log4j_file(filename):

If it doesn't, you should

yum remove ambari-server ambari-agent
rm -rf /usr/lib/python2.6/site-packages/ambari_commons
yum install ambari-server ambari-agent

and then check that directory again to confirm os_utils.py exists. I also urge you to install Ambari 2.4.2 instead of 2.4.1, as 2.4.1 was found to contain critical bugs. Also, can you tell us which OS you're running? Depending on the OS, the version of Python you're running matters. http://docs.hortonworks.com/HDPDocuments/Ambari-2.4.2.0/bk_ambari-installation/content/software_requirements.html
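
As a quick check before reinstalling, something like the following should tell you whether the function is there (a sketch that assumes the same /usr/lib/python2.6 site-packages path shown in your traceback):

# Confirm os_utils.py exists and actually defines parse_log4j_file
ls -l /usr/lib/python2.6/site-packages/ambari_commons/os_utils.py
grep -n "def parse_log4j_file" /usr/lib/python2.6/site-packages/ambari_commons/os_utils.py \
  || echo "parse_log4j_file not found -- ambari_commons is stale, reinstall as above"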
						
					

03-08-2017 04:33 PM

Addendum: the order of the install and upgrade steps for ambari-server matters, and it is the same for Ambari 2.4.2 as well. Moral of the story: read the documentation fully. http://docs.hortonworks.com/HDPDocuments/Ambari-2.4.2.0/bk_ambari-upgrade/content/upgrade_ambari.html
						
					

03-08-2017 04:20 PM
2 Kudos

I was able to upgrade ambari-server. It turns out ambari_commons/logging_utils.py changed the print_info_msg() signature from one argument to two, and I still had ambari-agent on the old version. The fix is to upgrade ambari-server and ambari-agent at the same time if both coexist on a server; that way ambari_commons gets the updated logging_utils module. In my case I actually had to reinstall Ambari and restore from backup, because I broke the whole thing trying out different scenarios.

This is the new function signature, from https://github.com/apache/ambari/blob/trunk/ambari-common/src/main/python/ambari_commons/logging_utils.py

#
# Prints an "info" messsage.
#
def print_info_msg(msg, forced=False):
  if forced:
    print("INFO: " + msg)
    return
  if _VERBOSE:
    print("INFO: " + msg)

and this is the old one, from the 2.4 branch https://github.com/apache/ambari/blob/branch-2.4.0/ambari-common/src/main/python/ambari_commons/logging_utils.py

#
# Prints an "info" messsage.
#
def print_info_msg(msg):
  if _VERBOSE:
    print("INFO: " + msg)
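
If it helps, here's a quick way to see which print_info_msg signature a given host actually has installed (a sketch that assumes ambari_commons is importable by the system Python):

# Shows the argument list of the installed print_info_msg (one arg = old, two args = new)
python -c "import inspect, ambari_commons.logging_utils as lu; print(inspect.getargspec(lu.print_info_msg))"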
 
						
					

03-08-2017 03:26 PM

@Raj B you can create a view on the RDBMS side that returns the diff, and then run Sqoop against that view.
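
A minimal sketch of what that could look like, assuming a hypothetical MySQL database with a view named changed_rows_v that exposes only the changed records (the JDBC URL, credentials, and target directory below are placeholders):

# Import the view's rows just like a table; -m 1 avoids needing a split column
sqoop import \
  --connect jdbc:mysql://dbhost:3306/salesdb \
  --username sqoop_user -P \
  --table changed_rows_v \
  --target-dir /user/hive/incoming/changed_rows \
  -m 1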
						
					

03-08-2017 02:33 PM
1 Kudo

You can run Sqoop with a where condition to specify your own logic for which rows to import; look at free-form queries, but note they are limited to simple queries only. https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_free_form_query_imports
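
For reference, a free-form query import along those lines might look like the sketch below (the connection details, table, and columns are placeholders). Sqoop requires the literal $CONDITIONS token in the WHERE clause and a --split-by column when --query is used with more than one mapper:

# Only rows matching the WHERE logic are imported; Sqoop substitutes its split
# predicates into the mandatory $CONDITIONS placeholder.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/salesdb \
  --username sqoop_user -P \
  --query 'SELECT id, amount, updated_at FROM orders WHERE updated_at > "2017-03-01" AND $CONDITIONS' \
  --split-by id \
  --target-dir /user/hive/incoming/orders_delta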
						
					

03-08-2017 10:39 AM
1 Kudo

You can pass an ESCAPED BY clause: enable escaping for the delimiter characters by using the 'ESCAPED BY' clause (such as ESCAPED BY '\'). Escaping is needed if you want to work with data that can contain these delimiter characters. https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL

For example, with the OpenCSVSerde you can set the escape character explicitly:

CREATE TABLE my_table(a string, b string, ...)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES (
   "separatorChar" = "\t",
   "quoteChar"     = "'",
   "escapeChar"    = "\\"
)
STORED AS TEXTFILE;

The default properties for the SerDe assume a comma-separated (CSV) file; the default escape character is:

DEFAULT_ESCAPE_CHARACTER \
						
					

03-08-2017 10:29 AM
1 Kudo

Please use the latest HDP 2.5.3 and Ambari 2.4.2 for new installs; HDP 2.3 is going to be deprecated soon. The 2.3.0 release repos were not stable and were perhaps removed in favor of 2.3.6. Do yourself a favor and use the latest stable release.
						
					

03-08-2017 08:59 AM

Your original question did not ask about dependency management; it was open-ended. We can help you better if you ask the right questions.
						
					

03-08-2017 08:48 AM

Add two new nodes; then, in Ambari's HDFS section, there is an option to move the NameNode.
						
					

03-08-2017 08:35 AM

From https://github.com/apache/spark/blob/master/bin/pyspark

# In Spark 2.0, IPYTHON and IPYTHON_OPTS are removed and pyspark fails to launch if either option
# is set in the user's environment. Instead, users should set PYSPARK_DRIVER_PYTHON=ipython
# to use IPython and set PYSPARK_DRIVER_PYTHON_OPTS to pass options when starting the Python driver
# (e.g. PYSPARK_DRIVER_PYTHON_OPTS='notebook'). This supports full customization of the IPython
# and executor Python executables.
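
For example, to launch pyspark with IPython as the driver front end (a sketch; it assumes ipython is installed on the driver host and Spark's bin directory is on the PATH):

# Plain IPython shell as the pyspark driver
export PYSPARK_DRIVER_PYTHON=ipython
pyspark

# Or pass driver options, e.g. to start the notebook interface instead
export PYSPARK_DRIVER_PYTHON=ipython
export PYSPARK_DRIVER_PYTHON_OPTS='notebook'
pyspark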
						
					