Member since: 08-15-2016
Posts: 189
Kudos Received: 63
Solutions: 22
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
| | 5662 | 01-02-2018 09:11 AM |
| | 3001 | 12-04-2017 11:37 AM |
| | 2146 | 10-03-2017 11:52 AM |
| | 21565 | 09-20-2017 09:35 PM |
| | 1601 | 09-12-2017 06:50 PM |
10-24-2024
12:46 AM
Is it possible to enable SSO and MFA in Ambari 2.6.5 without Knox being enabled? Any pointers would be appreciated.
09-16-2024
05:15 AM
I have configured one consumer group_id shared across multiple (around 10) topics; the services listen and produce messages to each other. Sometimes a consumer does not receive messages even though the producer reports that they were produced successfully. Note: the failure is random; sometimes service1 stops receiving, sometimes service1 works and service2 stops receiving, and so on.
03-25-2020
05:31 AM
Is it possible to define a STRUCT element whose name begins with an @ sign, e.g. "@site" : "Los Angeles"? We can live with the column showing up as site rather than @site. If it can't be done in HiveQL syntax, we will have to preprocess the JSON to remove the @ sign, which would be annoying but doable.
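If it does come down to preprocessing, a minimal sketch of the idea is below; the key names are assumptions based on the example above, and this just walks the parsed JSON and drops any leading @ from object keys before the data is handed to Hive.

```python
import json

def strip_at_signs(node):
    """Recursively drop a leading '@' from every object key."""
    if isinstance(node, dict):
        return {k.lstrip("@"): strip_at_signs(v) for k, v in node.items()}
    if isinstance(node, list):
        return [strip_at_signs(v) for v in node]
    return node

raw = '{"@site": "Los Angeles", "tags": [{"@id": 1}]}'
cleaned = strip_at_signs(json.loads(raw))
print(json.dumps(cleaned))  # {"site": "Los Angeles", "tags": [{"id": 1}]}
```

Run once over each JSON file before loading, and the struct fields can then be declared without the @ sign.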
01-06-2020
06:55 AM
Is there a way to set up a Kafka container without Ambari and then add the Kafka broker to Ambari?
11-21-2019
04:32 AM
Another +1 from me for the response. I spent a couple of hours investigating and comparing the configuration with a working cluster before I removed the path. The worst part is that I could not find any indication in the Kafka/ZooKeeper logs that something was wrong.
11-22-2018
11:07 AM
I created /etc/hive/conf/beeline-hs2-connection.xml and it worked. Thanks
10-11-2018
08:13 AM
I have the same problem on NiFi 1.5 and would be very interested in a solution that gets NiFi's Encrypt/Decrypt processor working with PGP. In the meantime I turned to another approach: using the ExecuteStreamCommand processor to outsource the decryption to the CLI, which is verified to work. Just be aware that you have to import the public and private keys into the nifi user's /home/nifi/.gnupg folder, since that is the user executing the stream command. So you might have to run these commands (on every NiFi node!) first:

gpg --import < pub_keys_armor.pgp
gpg --import < priv_key_armor.pgp
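For reference, the ExecuteStreamCommand properties for such a setup would look roughly like this. This is a sketch, not the exact configuration from the post: the gpg path is an assumption, and if the private key is passphrase-protected you would additionally need passphrase handling (e.g. gpg's --passphrase-fd option).

```
Command Path:        /usr/bin/gpg
Command Arguments:   --batch;--yes;--decrypt
Argument Delimiter:  ;
```

The processor streams the FlowFile content to the command's stdin and captures stdout, so gpg reads the encrypted payload and the decrypted result flows on to the next processor.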
06-20-2018
04:24 AM
@Jasper Hi, as far as I know the verifiable consumer is designed for system testing, and it emits consumer events as JSON objects. Also, --group-id is a mandatory option, and if you check the code it uses the subscribe method to subscribe to topics, so I don't think we can specify a particular partition. Thank you!
06-20-2018
04:29 AM
@Jasper Hi, it looks like you are hitting https://issues.apache.org/jira/browse/KAFKA-6130, which is fixed in Kafka 1.1.0. As you are using HDF 3.1.1, which ships with Kafka 1.0.0, you are affected. Thank you!
03-28-2019
02:01 PM
@Matt Burgess It works fine if there is just one object in the input tree, but if there are more it merges them into arrays rather than separate records, like:

{
"agent_submit_time" : [ -1, -1 ],
"agent_end_time" : [ 123445, 123445 ],
"agent_name" : [ "Marie Bayer-Smith", "Marie Bayer-Smith" ]
}

I would like it to be something like:

[
{
"agent_submit_time" : -1,
"agent_end_time" : 123445,
"agent_name" : "Marie Bayer-Smith"
},
{
"agent_submit_time" : -1,
"agent_end_time" : 123445,
"agent_name" : "Marie Bayer-Smith"
}
]

How can I do that? I tried, but I couldn't get it working: replacing "*": "&" with "@": "[&]" makes the records separate, but then the transformation of - to _ no longer takes place.
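Since the exact input isn't shown, here is an untested sketch of a JOLT shift spec that would produce the desired fan-out, assuming the input is a single object whose top-level keys (record ids) each hold one record with dashed field names. The `[#2]` on the right-hand side is the standard JOLT idiom for turning each matched top-level entry into its own element of the output array, and the explicit field mappings do the dash-to-underscore renames at the same time:

```json
[
  {
    "operation": "shift",
    "spec": {
      "*": {
        "agent-submit-time": "[#2].agent_submit_time",
        "agent-end-time": "[#2].agent_end_time",
        "agent-name": "[#2].agent_name"
      }
    }
  }
]
```

If the real input nests the records differently, the `"*"` level would need to be adjusted to sit at whatever level iterates over the individual records.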