Member since: 07-08-2016
Posts: 260
Kudos Received: 44
Solutions: 10
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2474 | 05-02-2018 06:03 PM |
| | 5023 | 10-18-2017 04:02 PM |
| | 1614 | 08-25-2017 08:59 PM |
| | 2192 | 07-21-2017 08:13 PM |
| | 8817 | 04-06-2017 09:54 PM |
06-09-2017
06:44 PM
1 Kudo
@Saikrishna Tarapareddy @Sushanth Sowmyan One way to handle this is to use another JSON SerDe that supports skipping malformed records. I ran into exactly the same situation and worked around it with org.openx.data.jsonserde.JsonSerDe. This SerDe can handle malformed JSON. You need to create the table with the following table property: 'ignore.malformed.json'='true'. Also make sure the SerDe is available to Hive, either by running an ADD JAR command before using it or by adding it to your Hive JAR path.
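A sketch of the setup described above (the jar path and table schema are hypothetical; adjust both to your environment):

```sql
-- Make the openx SerDe available to the Hive session (example path)
ADD JAR /path/to/json-serde-with-dependencies.jar;

-- Create the table with the openx JsonSerDe; malformed JSON records
-- are skipped instead of failing the query
CREATE TABLE tweets (
  id   BIGINT,
  text STRING
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
TBLPROPERTIES ('ignore.malformed.json' = 'true');
```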
04-18-2017
11:30 AM
This is a longer regex; it assumes the log_entry contains two IP addresses.
04-06-2017
05:50 PM
5 Kudos
@Saikrishna Tarapareddy - I think you accidentally asked the same question twice: https://community.hortonworks.com/questions/93488/help-with-hive-regex-extract.html Can you please close this one?
04-06-2017
09:54 PM
I think I found the answer. It looks like we need to use double backslashes in Hive. This works when I replace log_entry with the text from above:

select
  regexp_extract(log_entry, '[A-Z][a-z]+\\s\\d+\\s\\d+:\\d+:\\d+', 0),
  regexp_extract(log_entry, '\\d+\\.\\d+\\.\\d\\.\\d', 0),
  regexp_extract(log_entry, '%ASA-6-106100', 0),
  regexp_extract(log_entry, '\\w+-\\w+\\s+\\w+-\\w+', 0)
04-04-2017
03:41 PM
It looks like only the NameNodes are open for connectivity to/from the HDF server; the DataNodes are not. We are working on fixing that and will test afterward.
03-29-2017
02:29 PM
Hi @Matt Burgess, I know NiFi uses port 8080; I was wondering if Jetty itself runs on a different port. Thank you.
03-09-2017
06:51 PM
@Saikrishna Tarapareddy
In addition to the answer submitted by @Matt Clarke, ExecuteProcess and ExecuteStreamCommand should work as well. However, you'll want to move the arguments you're passing to kinit into the "Command Arguments" property of the respective processor. The "Command" property should be set to "kinit" (or the full path to the executable, such as "/usr/bin/kinit"). The "Command Arguments" property should be set to: -k -t /etc/security/keytabs/nifi.keytab nifi/server@domain.COM. "Argument Delimiter" should be set to the space character, since none of your arguments contain embedded spaces. Alternatively, you can use the ";" character, for instance; in that case, "Command Arguments" should be set to: -k;-t;/etc/security/keytabs/nifi.keytab;nifi/server@domain.COM
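Summarizing the above, a sketch of the ExecuteStreamCommand property values using the ";" delimiter variant (the keytab path and principal are taken from the post; adjust them for your environment):

```
Command:            kinit
Command Arguments:  -k;-t;/etc/security/keytabs/nifi.keytab;nifi/server@domain.COM
Argument Delimiter: ;
```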
01-25-2017
08:42 PM
Correct. SelectHiveQL is for statements that return ResultSets (like SELECT *); those results are converted to Avro records. PutHiveQL is for executing statements that do not return results (excluding callable statements such as stored procedures), such as your ALTER TABLE example.
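To make the split concrete, here is a sketch of which kind of statement goes to which processor (table and partition names are hypothetical):

```sql
-- SelectHiveQL: returns a ResultSet, which is converted to Avro records
SELECT * FROM web_logs WHERE dt = '2017-01-25';

-- PutHiveQL: executes a statement that returns no results
ALTER TABLE web_logs ADD PARTITION (dt = '2017-01-26');
```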
02-01-2017
07:06 PM
Hi @Saikrishna Tarapareddy, can you provide the output of the following?
1. The table definition: hive> show create table test_display
2. The files in the HDFS location: hadoop fs -ls <table_display table hdfs location>
12-08-2016
06:50 PM
The documents at the link above are for Apache NiFi 1.1.0, but HDF 2.0.0 was built with NiFi 1.0.0. The ability to append was added in NiFi 1.1.0 under NIFI-1322, so it will likely be available in an upcoming version of HDF. The docs at that site are always for the latest version of Apache NiFi; it is recommended that you use the docs that ship with your version of HDF/NiFi, via the Help option in the top-right hamburger menu of your running instance.