Member since 05-02-2017
			
      
88 Posts · 173 Kudos Received · 15 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 7404 | 09-27-2017 04:21 PM |
| | 3389 | 08-17-2017 06:20 PM |
| | 3073 | 08-17-2017 05:18 PM |
| | 3649 | 08-11-2017 04:12 PM |
| | 5124 | 08-08-2017 12:43 AM |
			
    
	
		
		
08-08-2017 12:43 AM · 1 Kudo
I have solved this issue by adding the hbase-site.xml and core-site.xml files to the Phoenix jar. SQuirreL doesn't take hbase-site.xml and core-site.xml directly onto the classpath; it tries to unzip them like normal jar files. So I extracted the Phoenix jar, added hbase-site.xml and core-site.xml to it, and created a new jar with the same name. I added it to the SQuirreL SQL lib directory and restarted SQuirreL SQL. After this I am able to connect to Phoenix using SQuirreL SQL. Thank you very much for your help, @Sergey Soldatov and @Josh Elser.
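A sketch of the repack step, assuming an HDP-style layout (the paths below are examples; adjust them to your installation). `jar uf` updates the archive in place, which avoids the full extract/re-create cycle while producing the same result: the site files end up at the jar root, so they land on SQuirreL's classpath.

```shell
cd /tmp
cp /usr/hdp/current/phoenix-client/phoenix-client.jar .
cp /etc/hbase/conf/hbase-site.xml /etc/hadoop/conf/core-site.xml .
# "jar uf" (update) adds the files to the existing archive in place.
jar uf phoenix-client.jar hbase-site.xml core-site.xml
# Verify both files are now inside the jar:
jar tf phoenix-client.jar | grep -E 'hbase-site|core-site'
# Then copy the jar into SQuirreL's lib/ directory and restart SQuirreL.
```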
						
					
			
    
	
		
		
08-04-2017 06:39 PM
@Enis Thanks for the quick response. After setting the TTL on the table, we have to run a major compaction to delete the older-than-TTL data, right? How do I do this?
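For reference, a sketch of both steps from the HBase shell, with hypothetical table and column-family names (`mytable`, `cf`). The TTL is set in seconds on the column family; a major compaction then physically removes the expired cells:

```shell
hbase shell <<'EOF'
alter 'mytable', {NAME => 'cf', TTL => 86400}
major_compact 'mytable'
EOF
```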
						
					
			
    
	
		
		
08-04-2017 05:56 PM · 3 Kudos
Labels: Apache Hadoop
    
	
		
		
08-04-2017 05:39 PM · 3 Kudos
Labels: Apache HBase
			
    
	
		
		
08-03-2017 11:35 PM · 3 Kudos
					
I am getting the below error while doing the initial LDAP sync for Ambari:

```
ambari-server sync-ldap --users /home/centos/users.txt
Using python /usr/bin/python
Syncing with LDAP...
Enter Ambari Admin login: admin
Enter Ambari Admin password:
Syncing specified users and groups.
ERROR: Exiting with exit code 1.
REASON: Sync event creation failed.
Error details: HTTP Error 502: Bad Gateway
```

I am using an internal proxy server, so I set up some configuration in ambari-env.sh for this:

```
export AMBARI_JVM_ARGS=$AMBARI_JVM_ARGS' -Xms512m -Xmx2048m -XX:MaxPermSize=128m -Djava.security.auth.login.config=$ROOT/etc/ambari-server/conf/krb5JAASLogin.conf -Djava.security.krb5.conf=/etc/krb5.conf -Djavax.security.auth.useSubjectCredsOnly=false -Dhttp.proxyHost=FQDN -Dhttp.proxyPort=8080 -Dhttp.nonProxyHosts="FQDN|localhost|127.0.0.1"'
```

An ldapsearch command works fine, and I have added the same configs in ambari.properties. After setting this, I am still getting the 502 Bad Gateway error.
						
					
Labels: Apache Ambari
			
    
	
		
		
08-03-2017 11:27 PM · 3 Kudos
		
					
I have added the below block in the Knox topology:

```
<service>
    <role>HIVE2</role>
    <url>http://FQDN_LLAP_SERVER:10501/cliservice</url>
</service>
```

I also created the directory "$KNOX_HOME/data/services/hive2" with service.xml and rewrite.xml files, and enabled the below properties in hiveserver2-interactive-site.xml:

```
hive.server2.thrift.http.path=cliservice
hive.server2.transport.mode=http
```

service.xml:

```
<service role="HIVE2" name="hive2" version="0.13.0">
    <routes>
        <route path="/hive2"/>
    </routes>
    <dispatch classname="org.apache.hadoop.gateway.hive.HiveDispatch" ha-classname="org.apache.hadoop.gateway.hive.HiveHaDispatch"/>
</service>
```

rewrite.xml:

```
<rules>
    <rule dir="IN" name="HIVE2/hive2/inbound" pattern="*://*:*/**/hive2">
        <rewrite template="{$serviceUrl[HIVE2]}"/>
    </rule>
</rules>
```

I am getting the below error in Knox while accessing this path from the ODBC driver:

```
hadoop.gateway Failed to match path /hive2
```
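One way to narrow this down is to hit the gateway directly, independent of the ODBC driver. The hostname, topology name (`default`), and credentials below are hypothetical placeholders; the point is only to see whether Knox itself can match the `/hive2` path:

```shell
curl -ik -u user1:password \
  'https://knox-host:8443/gateway/default/hive2'
```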
						
					
Labels: Apache Hive, Apache Knox
			
    
	
		
		
07-28-2017 07:25 PM
@Sergey Soldatov Thanks for the response. Which directories do you want me to add to the SQuirreL SQL lib directory? I guess it should be /usr/hdp/current/hbase-master/lib/hbase*.jar and /usr/hdp/current/hadoop-client/hadoop*.jar. Please confirm.
						
					
    
	
		
		
07-28-2017 04:26 PM
@Josh Elser I tried both; I am getting the same error.

```
jdbc:phoenix:zk-host-1,zk-host-2,zk-host-3:2181:/hbase-secure:user1@EXAMPLE.COM:/Users/user1/user1.headless.keytab
```
						
					
    
	
		
		
07-28-2017 02:20 AM · 1 Kudo
					
Getting the below error:

```
java.util.concurrent.TimeoutException
	at java.util.concurrent.FutureTask.get(FutureTask.java:205)
	at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.awaitConnection(OpenConnectionCommand.java:132)
	at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand.access$100(OpenConnectionCommand.java:45)
	at net.sourceforge.squirrel_sql.client.mainframe.action.OpenConnectionCommand$2.run(OpenConnectionCommand.java:115)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
```

I am using the below URL for the connection:

```
jdbc:phoenix:zk-host-1,zk-host-2,zk-host-3:2181:/hbase-secure:/Users/user1/user1.headless.keytab:user1@EXAMPLE.COM
```

Please help.
						
					
Labels: Apache HBase, Apache Phoenix
			
    
	
		
		
07-25-2017 11:29 AM · 5 Kudos
		
					
@ed day Hey, you don't need to worry about the admin user stuff, as I can see you have the "/user/admin" directory present in HDFS with owner "admin". Just log in as the hdfs user and change the ownership of the directory for 'ed':

```
# su hdfs
# hdfs dfs -chown ed:hdfs /user/ed
```

Let me know if this helps.
						
					