Member since: 05-28-2019
Posts: 46
Kudos Received: 16
Solutions: 2

My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 6728 | 01-04-2017 07:03 PM |
|  | 1396 | 01-04-2017 06:00 PM |
01-04-2017
07:03 PM
3 Kudos
Use RegexSerDe to create the external table like below:

CREATE EXTERNAL TABLE customers (userid STRING, fb_id STRING, twitter_id STRING, status STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
WITH SERDEPROPERTIES ("input.regex" = "(.{10})(.{10})(.{10})(.{2})")
LOCATION 'path/to/data';

Reference: http://www.confusedcoders.com/bigdata/hive/use-hive-serde-for-fixed-length-index-based-strings
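Once some fixed-width data is in place, the parsing can be sanity-checked with a quick query (a sketch using the hive CLI; the column names come from the table definition above):

# Sketch: each fixed-width slice should land in its own column.
hive -e "SELECT userid, fb_id, twitter_id, status FROM customers LIMIT 5"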
01-04-2017
06:00 PM
1 Kudo
Did you run the import and initialize the Hive context like below?

import org.apache.spark.sql.hive.orc._
import org.apache.spark.sql._
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

If you still face issues, check whether Hive is running.
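A quick way to verify the context from the command line (a sketch; the spark-shell invocation and the SHOW TABLES query are illustrative and not part of the original answer):

# Sketch: run the same statements in spark-shell and confirm the Hive metastore is reachable.
spark-shell <<'EOF'
import org.apache.spark.sql.hive.orc._
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
hiveContext.sql("SHOW TABLES").show()
EOF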
01-03-2017
06:39 PM
This has been accepted as a bug by the support team and is tracked in an internal JIRA.
12-16-2016
04:16 PM
One of the column datatypes is VARCHAR(15) with the LATIN character set. Connection string:

jdbc:teradata://<host>/database=<db>,charset=UTF8

We notice special characters in that column after it is imported into Hive.
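For context, the import was run along these lines (a sketch, not the exact command from the post; the driver class, credentials, and table name are placeholders, and the connection string is the one quoted above):

# Sketch of the sqoop import; credentials and table name are placeholders.
sqoop import \
  --connect "jdbc:teradata://<host>/database=<db>,charset=UTF8" \
  --driver com.teradata.jdbc.TeraDriver \
  --username <user> -P \
  --table <table> \
  --hive-import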
Labels:
- Apache Hive
- Apache Sqoop
12-16-2016
04:04 PM
Yes, I tried it and that doesn't help.
12-15-2016
08:20 PM
Example Oracle RAW datatype value: "175D78D86FAFE6D19C631AF3BFC246EB". Sqoop converts that data into a string, and the data has spaces when I check the HDFS file using "hive --orcfiledump -d <hdfsfile>". The RAW datatype is expressed in hexadecimal on the Oracle side, and sqoop adds spaces in the HDFS file, like "17 5D 78 D8 6F AF E6 D1 9C 63 1A F3 BF C2 46 EB". Has anyone faced this? The RAW datatype is listed as supported in the sqoop documentation - https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html. I see that the BFILE and LONG RAW datatypes are not supported in the sqoop documentation. Do we have a workaround for this?
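One workaround sometimes used (a sketch, not from the original thread; connection details, table, and column names are placeholders) is to force sqoop to map the RAW column to a Java String:

# Sketch: map the Oracle RAW column explicitly to a Java String during import.
# Connection string, credentials, table, and column names are placeholders.
sqoop import \
  --connect jdbc:oracle:thin:@//<host>:1521/<service> \
  --username <user> -P \
  --table <table> \
  --map-column-java <raw_column>=String \
  --target-dir /tmp/<table>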
Labels:
- Apache Hive
- Apache Sqoop
04-13-2016
01:34 AM
2 Kudos
TASK [metron_pcapservice : Check for hbase-site] *******************************
ok: [node1]

TASK [metron_pcapservice : include] ********************************************
skipping: [node1]

TASK [metron_pcapservice : include] ********************************************
included: /Users/nbalaji-elangovan/Downloads/incubator-metron-Metron_0.1BETA_rc7/deployment/roles/metron_pcapservice/tasks/pcapservice.yml for node1

TASK [metron_pcapservice : Create Metron streaming directories] ****************
ok: [node1] => (item={u'name': u'lib'})
ok: [node1] => (item={u'name': u'config'})

TASK [metron_pcapservice : Copy Metron pcapservice jar] ************************
fatal: [node1]: FAILED! => {"changed": false, "failed": true, "msg": "could not find src=/Users/nbalaji-elangovan/Downloads/incubator-metron-Metron_0.1BETA_rc7/metron-streaming/Metron-Pcap_Service/target/Metron-Pcap_Service-0.1BETA-jar-with-dependencies.jar"}

PLAY RECAP *********************************************************************
node1 : ok=262 changed=140 unreachable=0 failed=1

Ansible failed to complete successfully. Any error output should be
visible above. Please fix these errors and try again.
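The failing task copies a jar from metron-streaming/Metron-Pcap_Service/target, a directory that only exists after the streaming modules have been built locally. A sketch of the build step that would produce it (assuming Maven is installed and the commands are run from the unpacked release root; not part of the original output):

# Sketch: build the Metron streaming modules so the jar-with-dependencies is created.
cd incubator-metron-Metron_0.1BETA_rc7/metron-streaming
mvn clean package -DskipTests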
Labels:
- Apache Metron
01-07-2016
04:21 AM
1 Kudo
12-08-2015
03:22 PM
It doesn't support creation and deletion of topics. I assume that adding a user to a Ranger policy for Kafka will provide producer and consumer authorization for Kafka. If that is not the case, I need to enable native ACLs.
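For reference, native ACLs for a producer and a consumer would be granted with the kafka-acls.sh tool along these lines (a sketch; the principal, topic, consumer group, and ZooKeeper quorum are placeholders):

# Sketch: grant producer and consumer ACLs natively, outside Ranger.
# Principal, topic, group, and ZooKeeper address are placeholders.
bin/kafka-acls.sh --authorizer-properties zookeeper.connect=<zk-host>:2181 \
  --add --allow-principal User:<user> --producer --topic <topic>
bin/kafka-acls.sh --authorizer-properties zookeeper.connect=<zk-host>:2181 \
  --add --allow-principal User:<user> --consumer --topic <topic> --group <group>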
Labels:
- Apache Kafka
- Apache Ranger