04-26-2017 03:01 PM
For Spotfire 7.6 and HDP 2.6, these are the required jars:
- phoenix-core-4.7.0.2.6.0.3-8.jar
- netty-all-4.0.23.Final.jar
- hadoop-common.jar
- hadoop-auth.jar
- zookeeper.jar
- htrace-core-3.1.0-incubating.jar
- commons-collections-3.2.2.jar
- hbase-server.jar
- commons-configuration-1.6.jar
- hbase-common.jar
- hbase-protocol.jar
- hbase-client.jar
- twill-discovery-api-0.6.0-incubating.jar
- twill-zookeeper-0.6.0-incubating.jar
- tephra-api-0.6.0.jar
- tephra-core-0.6.0.jar
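To stage those jars for Spotfire, a small helper like the following can copy them from an HDP node into one folder. This is a sketch, not from the original post: the source path (e.g. /usr/hdp/current/phoenix-client/lib) and destination are assumptions to verify on your cluster.

```shell
# Sketch: gather the required client jars into one directory for Spotfire.
# Paths in the example call below are placeholders.
collect_jars() {
  src="$1"; dest="$2"
  mkdir -p "$dest"
  # Jar base names taken from the list above; versions may differ per cluster.
  for jar in phoenix-core netty-all hadoop-common hadoop-auth zookeeper \
             htrace-core commons-collections hbase-server commons-configuration \
             hbase-common hbase-protocol hbase-client twill-discovery-api \
             twill-zookeeper tephra-api tephra-core; do
    cp "$src"/${jar}*.jar "$dest"/ 2>/dev/null || echo "missing: $jar"
  done
}

# Example (hypothetical paths):
# collect_jars /usr/hdp/current/phoenix-client/lib /opt/spotfire/phoenix-jars
```

Any jar reported as "missing" has to be located by hand (hadoop-common, zookeeper, etc. live under their own component directories on HDP).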
04-26-2017 03:01 PM
Tibco/Spotfire JDBC Connection to Phoenix on a Kerberized Cluster

Versions:
- Tibco Spotfire: 7.5
- Phoenix: 4.4
- HDP: 2.4.x

Issue: The Phoenix client jars from the HDP cluster have dependency conflicts with the Spotfire jars.

Step 1: Use the phoenix-core jar with the following dependencies to avoid the conflicts:
1) hbase-client
2) phoenix-core
3) netty-all
4) zookeeper
5) hbase-protocol
6) commons-lang
7) commons-configuration
8) protobuf-java
9) hbase-server
10) guava
11) hadoop-auth
12) hadoop-common
13) commons-collections
14) htrace-core
15) hbase-common

Step 2: Copy the krb5.conf file from the cluster into JDK_HOME/jre/lib/security/ of the JDK that Spotfire uses. This JDK may be the one embedded in Spotfire's Tomcat, or an external one.

Step 3: Copy hbase-site.xml, core-site.xml, and hdfs-site.xml into a folder on the Spotfire server, and add that folder to the classpath in the Tomcat startup script. On Windows, this is the Tomcat service .bat file.

Step 4: On Windows, the service has to be removed and reinstalled for the change to take effect. From the command line, go to <TIBCO_HOME>\tss\7.6.0\tomcat\bin and run:
a) service remove
b) service install

Step 5: In Spotfire, add the data source using the JDBC URL below and save the connection:
jdbc:phoenix:<zookeeper_quorum>:2181:/hbase-secure:<principal>:<local_path_to_keytab_file>/hbase.headless.keytab
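The steps above can be sketched as a short shell outline. This is a hedged sketch, not from the original post: the hostnames, realm, and keytab path in the example call are placeholders, and the Windows-specific commands are shown as comments.

```shell
# Assembles the Phoenix JDBC URL from Step 5; all arguments are placeholders.
build_phoenix_url() {
  quorum="$1"; principal="$2"; keytab="$3"
  echo "jdbc:phoenix:${quorum}:2181:/hbase-secure:${principal}:${keytab}"
}

# Step 2: krb5.conf into the JRE Spotfire uses (Windows copy shown as comment)
#   copy krb5.conf %JDK_HOME%\jre\lib\security\
# Step 3: hbase-site.xml, core-site.xml, hdfs-site.xml in a folder that is
#   added to the classpath in the Tomcat service script
# Step 4: reinstall the Windows service so the new classpath is picked up:
#   cd %TIBCO_HOME%\tss\7.6.0\tomcat\bin && service remove && service install

# Step 5: the URL to paste into the Spotfire data source (placeholder values):
build_phoenix_url "zk1.example.com,zk2.example.com,zk3.example.com" \
  "user@EXAMPLE.COM" "C:/keytabs/hbase.headless.keytab"
```

Note the URL's separator after the port is a colon followed by the znode path (/hbase-secure on a kerberized HDP cluster), then the principal and the local keytab path.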
05-04-2016 10:16 PM
Hive is UTF-8 by default, so I would recommend doing a Sqoop import directly into a Hive table: define the table DDL first, then import into that table. And yes, you can use string for nvarchar.
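An import along those lines might look like the sketch below. The connection string, credentials, and table names are placeholders, not from the original thread; adjust for your source database.

```shell
# Hedged sketch: Sqoop import straight into a Hive table.
# The nvarchar column is mapped explicitly to a Hive string
# via --map-column-hive; all names here are hypothetical.
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;database=sales" \
  --username sqoop_user -P \
  --table customers \
  --hive-import \
  --hive-table default.customers \
  --map-column-hive notes=string \
  --num-mappers 4
```

If the Hive table was created beforehand with its own DDL (as suggested above), Sqoop loads into that existing definition instead of generating one.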
05-02-2016 02:19 PM
Hive doesn't have a limit on the maximum row count. You said it's an external partitioned table; did you add the partitions using
MSCK REPAIR TABLE (or ALTER TABLE ... RECOVER PARTITIONS)?
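For partition directories that were written to HDFS outside of Hive, registering them might look like this sketch (table name and paths are placeholders; run it from a node with the Hive CLI or adapt it for beeline):

```shell
# Hedged sketch: directories such as /data/events/dt=2016-05-01 exist on
# HDFS, but Hive's metastore doesn't know about them until they are
# registered. "events" is a hypothetical table name.
hive -e "
  MSCK REPAIR TABLE events;
  SHOW PARTITIONS events;
"
```

Rows in unregistered partition directories are invisible to queries, which can look like a "missing rows" problem even though no row-count limit is involved.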