Member since: 07-17-2017
Posts: 43
Kudos Received: 6
Solutions: 8
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 2368 | 03-24-2019 05:54 PM |
| 3288 | 03-16-2019 04:51 PM |
| 3288 | 03-16-2019 04:15 AM |
| 1348 | 08-04-2018 12:44 PM |
| 2177 | 07-23-2018 01:35 PM |
10-27-2017
12:18 PM
1 Kudo
@Paras Mehta I've been dealing with the same issue for a couple of weeks. The workaround I found was to use the Phoenix Query Server and the JDBC thin client instead; it doesn't require any of the Hadoop resources. However, there does appear to be a performance penalty for large numbers of inserts. I'm still trying to track down whether it's possible to add the hbase-site.xml to the NiFi classpath, as hinted at in the Hive Connection Pool, but that wouldn't work if you're working with multiple Hadoop clusters. Based on my research over the last couple of weeks, the NiFi community seems to be pretty anti-Phoenix anyway, so expect to have to fight with all of the processors due to the slight changes in syntax.
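To make that syntax difference concrete, here is a minimal sketch of the adjustment that typically trips up Hive-oriented processors; the events table and its columns are made up purely for illustration:
-- Hive-style statement a processor such as PutHiveQL would normally carry
-- (hypothetical table and columns, for illustration only):
insert into table events values (1, 'example payload');
-- Phoenix has no INSERT statement; through the Query Server / thin client the
-- equivalent is UPSERT, which inserts or updates the row for the given primary key:
upsert into events values (1, 'example payload');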
08-22-2017
05:40 PM
What does the query look like in the "db.view"? How many joins, what formats are the underlying tables, what where clauses, etc.? If you're just doing a straight "select * from table limit 10" I'd expect it to return almost immediately, but any kind of processing will take a few seconds. Have you tried LLAP/interactive queries?
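If it would help narrow things down, looking at the plan shows how many joins and scans the view really hides; this is only a sketch, with your_db.your_view standing in for the actual names:
-- the operator tree will show the joins, filters, and table scans behind the view
explain select * from your_db.your_view limit 10;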
08-15-2017
05:24 PM
If it works with a test file then it has to be a syntax error in your real data. Unfortunately, without seeing the real statements there isn't any way for us to troubleshoot. I'd try running some of the SQL statements from the flowfile in beeline and see if you get a better error.
08-15-2017
03:31 PM
1 Kudo
@Bala Vignesh N V This is a perfect use case for the Hive regular expression SerDe. Below is an example external table that could read this. If you're not familiar with regular expressions, take a look here. With this SerDe each capture group becomes a column, and the regex basically looks for any run of characters other than the | or , delimiters, so the four groups feed the four columns. For additional information about the Hive Regex SerDe see here.
create external table example_table (
  col1 string,
  col2 string,
  col3 string,
  col4 string
)
row format serde 'org.apache.hadoop.hive.serde2.RegexSerDe'
with serdeproperties ('input.regex'='^([^,\|]+),([^,\|]+),([^,\|]+)\|([^,\|]+)$')
stored as textfile;
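As a quick sanity check of the layout (assuming the underlying file contains lines shaped like a,b,c|d, which is what the regex above expects), the four groups should come back as four columns:
-- for an input line 'a,b,c|d' this should return a, b, c and d in the four columns
select col1, col2, col3, col4 from example_table limit 10;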
08-15-2017
03:19 PM
There has to be an error in your SQL statement. The insert statement is in your flowfile, correct? Can you create a dummy text file with insert statements that demonstrates the error? If not, then there has to be a syntax error in the actual insert statement. I'll also need a screenshot of your PutHiveQL configuration.
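For a minimal reproduction, something along these lines is usually enough; the table name and values here are made up and only stand in for whatever the flowfile actually carries:
-- hypothetical throwaway table just to exercise PutHiveQL
create table if not exists puthiveql_test (id int, name string);
-- statements shaped like the ones in the flowfile; running them in beeline
-- first usually gives a much clearer parser error than the processor does
insert into table puthiveql_test values (1, 'first row');
insert into table puthiveql_test values (2, 'second row');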
08-14-2017
06:29 PM
This won't actually work, as the Hive and Ambari database structures don't support group replication. Several tables are missing primary keys, which will lead to problems in replication. See Group Replication Requirements.
08-14-2017
02:46 PM
There are actually several tables missing primary keys, all related to transactions. I've submitted HIVE-17306 to address this, as I currently don't know the impact of adding primary keys to all of these tables. On HDP 2.6 it appears that the following tables don't have primary keys:
- completed_txn_components
- next_compaction_queue_id
- next_lock_id
- next_txn_id
- txn_components
- write_set
On another note, I'm not sure you can add surrogate keys, as it looks like we have some unqualified inserts in the transactional metastore code that don't explicitly list which columns are being inserted. That prevents just adding an extra column.
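As a rough way to reproduce that list, the metastore's backing database can be asked which tables lack a primary key; this sketch assumes a MySQL metastore whose schema is named hive, so adjust the schema name to your installation:
-- list metastore tables with no PRIMARY KEY constraint (MySQL information_schema)
select t.table_name
from information_schema.tables t
left join information_schema.table_constraints c
  on c.table_schema = t.table_schema
 and c.table_name = t.table_name
 and c.constraint_type = 'PRIMARY KEY'
where t.table_schema = 'hive'
  and t.table_type = 'BASE TABLE'
  and c.constraint_name is null
order by t.table_name;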
08-12-2017
01:00 PM
1 Kudo
It turns out this was caused by updating OpenJDK while NiFi was running. I didn't notice that one of the admins had run updates earlier, and a restart of NiFi made the issue go away.
08-10-2017
04:02 PM
I'm using the current build of NiFi 1.3 with an HDP 2.6.1.0 development cluster without Kerberos, and the Hive Streaming processor is returning an error about no jaas_unix in java.library.path when it tries to connect to the Hive Thrift server. I'm able to connect to Hive otherwise in NiFi with the PutHiveQL processor, so I'm not sure what's going on. Here is the full error: nifi-error.txt
07-27-2017
02:15 PM
According to this post, HDP uses UTC as the default, but a simple Hive statement like the one below, along with multiple JIRA issues, proves that isn't true.
select concat('Example: ', cast(cast('2017-03-12 02:00:00' as timestamp) as string));
Example: 2017-03-12 03:00:00
Can someone provide guidance on how to set the JVM's timezone?
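For anyone hitting the same thing: the one-hour jump above is what you'd expect from a JVM running in a US timezone, because 2017-03-12 02:00 falls inside the spring-forward DST gap. A quick way to confirm that only times in the gap shift (just a diagnostic sketch, not a fix):
-- 02:00 on 2017-03-12 doesn't exist in US timezones (clocks jump from 02:00 to 03:00),
-- so a non-UTC JVM normalizes it forward an hour:
select cast(cast('2017-03-12 02:00:00' as timestamp) as string);
-- a time outside the gap should round-trip unchanged if this is the cause:
select cast(cast('2017-03-12 04:00:00' as timestamp) as string);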