Member since 04-05-2019 · 8 Posts · 0 Kudos Received · 0 Solutions
08-09-2019 07:46 AM
Please see the link below: https://medium.com/rahasak/kafka-zookeeper-cluster-on-kubernetes-43a4aaf27dbb
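As an aside not in the original reply: once the manifests from the linked article are applied, a quick health check with the official Kubernetes Python client might look like the sketch below. The "kafka" namespace is an assumption based on the article, not something stated in this thread.

    # Minimal sketch using the official Kubernetes Python client
    # (`pip install kubernetes`); assumes kubectl access is already configured.
    # The "kafka" namespace is an assumption, not stated in this thread.
    from kubernetes import client, config

    config.load_kube_config()  # reads credentials from ~/.kube/config
    v1 = client.CoreV1Api()
    for pod in v1.list_namespaced_pod(namespace="kafka").items:
        print(pod.metadata.name, pod.status.phase)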
08-09-2019 07:33 AM
I am trying to connect to Hive services on an HDP-3.1.0 Kerberized cluster through the Superset application, using the Hive connection string hive://hive@xx.xx.xx.xx:10000/test?auth=KERBEROS&kerberos_service_name=hive

We have tried adding the conf files and keytab files under /etc/ and /etc/security/keytab on the master as well as the worker node, but the issue still persists. We tried adding the same files in the container, but we don't have root access to add them there.

Below is the Kubernetes cluster version information (Ubuntu):

    Client Version: version.Info{Major:"1", Minor:"13", GitVersion:"v1.13.2", GitCommit:"cff46ab41ff0bb44d8584413b598ad8360ec1def", GitTreeState:"clean", BuildDate:"2019-01-10T23:35:51Z", GoVersion:"go1.11.4", Compiler:"gc", Platform:"linux/amd64"}
    Server Version: version.Info{Major:"1", Minor:"13", GitVersion:"v1.13.3", GitCommit:"721bfa751924da8d1680787490c54b9179b1fed0", GitTreeState:"clean", BuildDate:"2019-02-01T20:00:57Z", GoVersion:"go1.11.5", Compiler:"gc", Platform:"linux/amd64"}

We are facing the issue below:

    ERROR: {"error": "Connection failed!\n\nThe error message returned was:\nCould not start SASL: b'Error in sasl_client_start (-1) SASL(-1): generic failure: GSSAPI Error: Unspecified GSS failure. Minor code may provide more information (No Kerberos credentials available (default cache: FILE:/tmp/krb5cc_1000))'"}
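Not part of the original post, but a minimal sketch of the same connection made outside Superset, useful for isolating the Kerberos side. The GSSAPI error above means no TGT was found in the default ticket cache, so the sketch assumes pyhive with SASL support is installed and that a TGT has already been obtained with kinit beforehand (the keytab path and principal used for kinit are assumptions, not from this thread):

    # Minimal sketch, assuming `pip install pyhive[hive] thrift-sasl` and a valid
    # Kerberos TGT in the default ticket cache (obtained with kinit beforehand).
    # Host, database, and service name mirror the connection string in the post.
    from pyhive import hive

    conn = hive.Connection(
        host="xx.xx.xx.xx",           # HiveServer2 host (placeholder from the post)
        port=10000,
        database="test",
        auth="KERBEROS",
        kerberos_service_name="hive",  # service part of the HiveServer2 principal
    )
    cursor = conn.cursor()
    cursor.execute("SHOW TABLES")
    print(cursor.fetchall())
    conn.close()

If this standalone connection fails with the same GSSAPI error, the problem is the missing ticket cache in the pod rather than Superset itself.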
04-09-2019 05:48 AM
Data Collector has a stage library for HDP 3.1, but not specifically for HDP 3.0; see the documentation on additional stage libraries.
04-06-2019 06:44 PM
We have an HDP 3.0.1 Kerberized cluster. I am trying to ingest data from JDBC to Hive through StreamSets Data Collector. I am getting the error below:

    com.streamsets.pipeline.api.base.OnRecordErrorException: HIVE_17 - Information type missing or invalid in the metadata record: Record[headers='HeaderImpl[select * from shipping WHERE event_id >${offset} ORDER by event_id ;::rowCount:0:1]' data='Field[MAP:{event_id=Field[INTEGER:1], order_id=Field[INTEGER:123], event_type=Field[STRING:SHIPPED]}]'] at com.streamsets.pipeline.stage.destination.hive.HiveMetastoreTarget.write(HiveMetastoreTarget.java:200)

Does an HDP 3.0.1 Kerberized cluster support StreamSets Data Collector 3.8.0, and is it compatible with Hive 3.1? If yes, which part am I missing?
04-05-2019 10:07 AM
Hi, we are trying to build a pipeline that reads data from JDBC (origin) into Hive Metastore (destination). In the General tab > Stage Library setting we chose Hive 2.1 - HDP 2.6.2 1-1 (as there is no version matching ours).

We have the following configuration:
1) HDP: 3.0.1
2) Hive: 3.1
3) SDC: 3.8.0

There is just a single record in the JDBC table. In preview mode, or after running the pipeline, we get the error below:

    com.streamsets.pipeline.api.base.OnRecordErrorException: HIVE_17 - Information type missing or invalid in the metadata record: Record[headers='HeaderImpl[select * from shipping WHERE event_id >${offset} ORDER by event_id ;::rowCount:0:1]' data='Field[LIST_MAP:{event_id=Field[INTEGER:1], order_id=Field[INTEGER:123], event_type=Field[STRING:SHIPPED]}]'] at com.streamsets.pipeline.stage.destination.hive.HiveMetastoreTarget.write(HiveMetastoreTarget.java:200)
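As an aside not in the original post: a quick way to sanity-check the origin query and the ${offset} semantics outside Data Collector. The sketch assumes a MySQL source reachable with mysql-connector-python; the connection parameters are placeholders, while the shipping table and event_id column come from the query in the error above:

    # Minimal sketch, assuming a MySQL source database and
    # mysql-connector-python (`pip install mysql-connector-python`).
    # Host/user/password/database below are placeholders, not from the post.
    import mysql.connector

    conn = mysql.connector.connect(
        host="db-host", user="user", password="password", database="test"
    )
    cursor = conn.cursor(dictionary=True)

    offset = 0  # Data Collector substitutes ${offset} with the last saved value
    cursor.execute(
        "SELECT * FROM shipping WHERE event_id > %s ORDER BY event_id", (offset,)
    )
    for row in cursor:
        print(row)  # e.g. {'event_id': 1, 'order_id': 123, 'event_type': 'SHIPPED'}
        offset = row["event_id"]  # the next incremental run resumes from here
    conn.close()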