Member since: 01-23-2017

114 Posts | 19 Kudos Received | 4 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2810 | 03-26-2018 04:53 AM |
| | 31309 | 12-01-2017 07:15 AM |
| | 1259 | 11-28-2016 11:30 AM |
| | 2187 | 10-25-2016 11:26 AM |
06-30-2021 02:13 AM

Hi @ayayay2333, as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post.
05-14-2020 03:06 PM (1 Kudo)
@ansharma1 You can run the following query in the Ambari DB:

SELECT view_instance_id, resource_id, view_name, cluster_handle, cluster_type FROM viewinstance;

The query will likely show that the view causing the problem is not associated with any cluster_handle (cluster_handle is basically the cluster_id, which you can see in the clusters table). If the cluster_handle for a view is not set correctly, you may see a message like:

org.apache.ambari.server.view.IllegalClusterException: Failed to get cluster information associated with this view instance

If you want the same old view to work (instead of creating a new instance of that view), make sure the cluster_handle for that view instance is set correctly:

1. Take an Ambari DB dump (a fresh backup), since we are going to change the DB manually.
2. Stop ambari-server.
3. Run the following query in the Ambari DB. NOTE: this is just a dummy query; the values for 'cluster_handle' and 'view_instance_id' may differ in your environment.

UPDATE viewinstance SET cluster_handle = 4 WHERE view_instance_id = 3;
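A minimal sketch of how those steps could look on a host using Ambari's embedded PostgreSQL database (the "ambari" database/user name and the IDs below are assumptions and placeholders; adjust for MySQL or an external database):

# Back up the Ambari DB before changing it manually (assumes embedded PostgreSQL)
pg_dump -U ambari ambari > ambari_backup_$(date +%F).sql

# Stop the Ambari server
ambari-server stop

# Look up the cluster_id to use as cluster_handle, then fix the view instance
psql -U ambari -d ambari -c "SELECT cluster_id, cluster_name FROM clusters;"
psql -U ambari -d ambari -c "UPDATE viewinstance SET cluster_handle = 4 WHERE view_instance_id = 3;"

# Start Ambari again
ambari-server start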
						
					
09-30-2019 11:22 AM

Hi, did you get an answer to your question? We are also facing the same issue.
05-23-2018 02:29 PM (2 Kudos)
This article discusses the manual Oozie sharelib update process and the prerequisites for the Spark Oozie sharelib.

Copy the existing sharelib from HDFS to a local directory:

# mkdir oozie_share_lib
# hadoop fs -copyToLocal <current-share-lib-directory> oozie_share_lib/lib

Once the existing sharelib has been copied from HDFS to local as above, update it with:

/usr/hdp/current/oozie-client/bin/oozie-setup.sh sharelib create -fs /user/oozie/share/lib/ -locallib oozie_share_lib/

This creates a new sharelib, including the Spark Oozie sharelib:

the destination path for sharelib is: /user/oozie/share/lib/lib_20180502070613
Fixing oozie spark sharelib
Spark is locally installed at /usr/hdp/2.6.3.0-235/oozie/../spark
Renaming spark to spark_orig in /user/oozie/share/lib/lib_20180502070613
Creating new  spark directory in /user/oozie/share/lib/lib_20180502070613
Copying Oozie spark sharelib jar to /user/oozie/share/lib/lib_20180502070613/spark
Copying oozie_share_lib/lib/spark/oozie-sharelib-spark-4.2.0.2.6.3.0-235.jar to /user/oozie/share/lib/lib_20180502070613/spark
Copying local spark libraries to /user/oozie/share/lib/lib_20180502070613/spark
Copying local spark python libraries to /user/oozie/share/lib/lib_20180502070613/spark
Copying local spark hive site to /user/oozie/share/lib/lib_20180502070613/spark

However, the corresponding HDFS folder shows that the Spark libraries were not added to the Spark Oozie sharelib:

$ hadoop fs -ls /user/oozie/share/lib/lib_20180502070613/spark
Found 1 items
-rwxrwxrwx   3 oozie hadoop  191121639 2018-05-02 07:18 /user/oozie/share/lib/lib_20180502070613/spark/spark-assembly-1.6.3.2.6.3.0-235-hadoop2.7.3.2.6.3.0-235.jar

This means the Oozie sharelib update did not work as expected for Spark, even though it reports "Spark is locally installed at /usr/hdp/2.6.3.0-235/oozie/../spark". The reason is that the Spark client was not installed on the node from which the sharelib update command was run (no-spark-client-installed.png).

From a node where the Spark client is installed, the same Oozie sharelib update does properly populate the Spark Oozie sharelib:

the destination path for sharelib is: /user/oozie/share/lib/lib_20180502064112
Fixing oozie spark sharelib
Spark is locally installed at /usr/hdp/2.6.3.0-235/oozie/../spark
Renaming spark to spark_orig in /user/oozie/share/lib/lib_20180502064112
Creating new  spark directory in /user/oozie/share/lib/lib_20180502064112
Copying Oozie spark sharelib jar to /user/oozie/share/lib/lib_20180502064112/spark
Copying oozie-new-sharelib/lib/spark/oozie-sharelib-spark-4.2.0.2.6.3.0-235.jar to /user/oozie/share/lib/lib_20180502064112/spark
Copying local spark libraries to /user/oozie/share/lib/lib_20180502064112/spark
Ignoring file /usr/hdp/2.6.3.0-235/oozie/../spark/lib/spark-examples-1.6.3.2.6.3.0-235-hadoop2.7.3.2.6.3.0-235.jar
Copying /usr/hdp/2.6.3.0-235/oozie/../spark/lib/datanucleus-core-3.2.10.jar to /user/oozie/share/lib/lib_20180502064112/spark
Copying /usr/hdp/2.6.3.0-235/oozie/../spark/lib/spark-assembly-1.6.3.2.6.3.0-235-hadoop2.7.3.2.6.3.0-235.jar to /user/oozie/share/lib/lib_20180502064112/spark
Ignoring file /usr/hdp/2.6.3.0-235/oozie/../spark/lib/spark-hdp-assembly.jar
Copying /usr/hdp/2.6.3.0-235/oozie/../spark/lib/datanucleus-rdbms-3.2.9.jar to /user/oozie/share/lib/lib_20180502064112/spark
Copying /usr/hdp/2.6.3.0-235/oozie/../spark/lib/datanucleus-api-jdo-3.2.6.jar to /user/oozie/share/lib/lib_20180502064112/spark
Copying local spark python libraries to /user/oozie/share/lib/lib_20180502064112/spark
Copying /usr/hdp/2.6.3.0-235/oozie/../spark/python/lib/pyspark.zip to /user/oozie/share/lib/lib_20180502064112/spark
Copying /usr/hdp/2.6.3.0-235/oozie/../spark/python/lib/py4j-0.9-src.zip to /user/oozie/share/lib/lib_20180502064112/spark
Ignoring file /usr/hdp/2.6.3.0-235/oozie/../spark/python/lib/PY4J_LICENSE.txt
Copying local spark hive site to /user/oozie/share/lib/lib_20180502064112/spark
Copying /etc/spark/conf/hive-site.xml to /user/oozie/share/lib/lib_20180502064112/spark

Here we can see that Oozie picks up the files from /usr/hdp/2.6.3.0-235/spark/conf/ and copies them to HDFS under /user/oozie/share/lib/lib_20180502064112/spark, because the Spark client is installed on this node (spark-client-installed.png):

$ hadoop fs -ls /user/oozie/share/lib/lib_20180502064112/spark
Found 8 items
-rw-r--r--   3 oozie hdfs     339666 2018-05-02 06:41 /user/oozie/share/lib/lib_20180502064112/spark/datanucleus-api-jdo-3.2.6.jar
-rw-r--r--   3 oozie hdfs    1890075 2018-05-02 06:41 /user/oozie/share/lib/lib_20180502064112/spark/datanucleus-core-3.2.10.jar
-rw-r--r--   3 oozie hdfs    1809447 2018-05-02 06:41 /user/oozie/share/lib/lib_20180502064112/spark/datanucleus-rdbms-3.2.9.jar
-rw-r--r--   3 oozie hdfs       1918 2018-05-02 06:41 /user/oozie/share/lib/lib_20180502064112/spark/hive-site.xml
-rw-r--r--   3 oozie hdfs      23278 2018-05-02 06:41 /user/oozie/share/lib/lib_20180502064112/spark/oozie-sharelib-spark-4.2.0.2.6.3.0-235.jar
-rw-r--r--   3 oozie hdfs      44846 2018-05-02 06:41 /user/oozie/share/lib/lib_20180502064112/spark/py4j-0.9-src.zip
-rw-r--r--   3 oozie hdfs     358253 2018-05-02 06:41 /user/oozie/share/lib/lib_20180502064112/spark/pyspark.zip
-rw-r--r--   3 oozie hdfs  191121639 2018-05-02 06:41 /user/oozie/share/lib/lib_20180502064112/spark/spark-assembly-1.6.3.2.6.3.0-235-hadoop2.7.3.2.6.3.0-235.jar

In short, to get a properly updated Spark Oozie sharelib, the Spark client must be installed on the node from which the manual Oozie sharelib update is run.
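As a quick follow-up check (a suggested sketch, not part of the procedure above; the Oozie URL is a placeholder and the path assumes a standard HDP layout), you can confirm the Spark client is present on the node before running the update, and have Oozie reload and list the new sharelib afterwards:

# Confirm the Spark client is installed on this node before the update
ls /usr/hdp/current/spark-client/lib/

# After the new sharelib directory is created, make Oozie pick it up
# without a restart, then verify the spark sharelib contents
oozie admin -oozie http://<oozie-host>:11000/oozie -sharelibupdate
oozie admin -oozie http://<oozie-host>:11000/oozie -shareliblist spark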
						
					
04-18-2018 01:20 PM

@heta desai You can use the parameters based on your environment, and here are the details about the LDAP error codes.

Thanks,
Venkat
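For illustration only (an assumption on my part, not something from the original reply), a quick way to validate LDAP connection parameters before putting them into the service configuration is an ldapsearch against the server; the host, bind DN, password, and search base below are placeholders:

# Test the LDAP URL, bind credentials, and search base (placeholder values)
ldapsearch -H ldap://ldap.example.com:389 \
  -D "cn=admin,dc=example,dc=com" -w 'bind-password' \
  -b "ou=users,dc=example,dc=com" "(objectClass=person)" cn mail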
  
						
					
04-20-2018 11:28 AM (1 Kudo)

This has been identified as a bug in Spark 2.2, which is fixed in Spark 2.3.
						
					
04-10-2018 07:09 AM

@ssathish Isn't that only for currently running jobs? Are we able to see the containers and details for completed jobs as well?

Here is a running job that shows Total Allocated Containers: running-containers.png
Here is a completed job that shows Total Allocated Containers: finished-job.png

But none of these Total Allocated Containers are exposed through the ResourceManager REST API; the XMLs below show only the allocated containers.

Running job XML: running.xml
Finished job XML: finished-job.xml

And the NodeManager REST API:

curl http://<Nodemanager address>:<port>/ws/v1/node/containers/<containerID>

gives container details only for running containers, not for completed ones.

Is there a way to get what we see on the YARN application UI at https://manag003:8090/cluster/appattempt/appattempt_1522212350151_40488_000001 for Total Allocated Containers through the REST API?

Thanks,
Venkat
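One avenue that might help here (an assumption, not something confirmed in this thread): the ResourceManager exposes an app-attempts endpoint, and if the Application Timeline/History Server is running, its REST API can list an attempt's containers, including completed ones. Hosts, ports, and IDs below are placeholders:

# App attempts for an application via the ResourceManager REST API
curl http://<rm-host>:8088/ws/v1/cluster/apps/<application_id>/appattempts

# Containers of an attempt (including finished ones) via the
# Application History / Timeline Server REST API, if it is enabled
curl http://<timeline-host>:8188/ws/v1/applicationhistory/apps/<application_id>/appattempts/<appattempt_id>/containers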
						
					
04-04-2018 08:27 AM

@Saumil Mayani Thanks a lot for the details. That makes it more clear.
						
					
12-11-2017 04:58 AM

@Abhijit Nayak As noted by @Jay Kumar SenSharma, the JIRA (https://issues.apache.org/jira/browse/AMBARI-19666) was a bug in Ambari 2.4.0, but your Ambari version is 2.5.0.3, where it is already fixed according to that JIRA, so please check the points given by @Jay Kumar SenSharma. It might also be a browser setting interrupting the file download partway through, so please try a different browser to see if the behaviour persists.
						
					
12-11-2017 02:30 PM

@Manfred PAUL In my last reply I actually meant to include a link to this article: https://community.hortonworks.com/articles/110104/fixing-hiveserver-interactive-llap-failures-with-e.html
						
					