Member since: 04-18-2016
Posts: 28
Kudos Received: 3
Solutions: 1
My Accepted Solutions

Title | Views | Posted
---|---|---
 | 997 | 07-11-2016 01:47 AM
10-30-2018
06:20 AM
You can build the custom processor on top of NiFi 1.2 and keep the NAR file with the same name; it will work the same way. I have done this in my own code.
09-05-2018
08:21 PM
Adding a few more details to the query. In the current scenario, the data is encrypted using AWS KMS-managed keys (SSE-KMS) with client-side encryption. The client (NiFi) needs to download the encrypted object from Amazon S3 along with the cipher blob version of the data encryption key, which is stored as object metadata. The client then sends the cipher blob to AWS KMS to get the plaintext version of the key so that it can decrypt the object data. Please have a look at Option 1 in the link below: https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingClientSideEncryption.html I am not sure whether this can be done with the existing NiFi processor plus some customization, or whether we need to create a new processor altogether. Please suggest.
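To make the envelope-decryption flow described above concrete, here is a minimal Python sketch of the pattern (not the NiFi processor itself, and not the real AWS SDK). The KMS client is mocked with a dictionary, and a simple XOR stream stands in for the AES content cipher that the real AWS encryption client would use; the `x-amz-key-v2` metadata name mirrors what the AWS SDK stores for KMS envelope encryption, but everything else here is illustrative.

```python
# Sketch of the S3 client-side envelope decryption flow: fetch the
# encrypted object plus the encrypted data key from metadata, ask KMS
# for the plaintext data key, then decrypt the object body with it.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Stand-in symmetric cipher (XOR); a real client uses AES-GCM/CBC."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

class MockKMS:
    """Pretend KMS: maps an encrypted-key blob back to the plaintext data key."""
    def __init__(self):
        self._store = {}

    def encrypt_data_key(self, plaintext_key: bytes) -> bytes:
        blob = b"cipherblob:" + plaintext_key[::-1]  # toy "encryption"
        self._store[blob] = plaintext_key
        return blob

    def decrypt(self, cipher_blob: bytes) -> bytes:
        return self._store[cipher_blob]

def decrypt_s3_object(encrypted_body: bytes, metadata: dict, kms: MockKMS) -> bytes:
    # 1. Read the encrypted data key (cipher blob) from the object's metadata.
    cipher_blob = metadata["x-amz-key-v2"]
    # 2. Send the cipher blob to KMS to recover the plaintext data key.
    data_key = kms.decrypt(cipher_blob)
    # 3. Use the plaintext data key to decrypt the object body locally.
    return xor_cipher(encrypted_body, data_key)

if __name__ == "__main__":
    kms = MockKMS()
    data_key = b"0123456789abcdef"
    blob = kms.encrypt_data_key(data_key)
    body = xor_cipher(b"hello, encrypted world", data_key)
    print(decrypt_s3_object(body, {"x-amz-key-v2": blob}, kms))
```

The real FetchS3Object processor does not perform step 2 and 3, which is why some customization (or a scripted processor wrapping the AWS encryption client) would be needed.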
09-04-2018
10:51 AM
Currently I have created a process group that gets events from SQS via the GetSQS processor and retrieves the S3 data with the FetchS3Object processor. However, the data is now encrypted before being placed in S3, and NiFi needs to decrypt the collected file using an AWS key that changes every three months, so it should refresh the key by making an API call to AWS KMS. I didn't find any documentation covering this scenario. Please share any hint or processor that could handle this situation.
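One way to handle a key that rotates every few months is a small refresh-on-expiry cache around the key-fetching API call, e.g. inside an ExecuteScript or InvokeScriptedProcessor step. Below is a minimal Python sketch of that pattern; `fetch_key_from_kms` is a hypothetical stand-in for the real AWS KMS / Secrets Manager call, stubbed here so the example is self-contained.

```python
# Refresh-on-expiry cache for a rotating decryption key: callers always
# use get(), and the key is re-fetched once its TTL has elapsed.
import time

class RotatingKeyCache:
    def __init__(self, fetch_key, ttl_seconds: float):
        self._fetch_key = fetch_key   # callable returning the current key
        self._ttl = ttl_seconds
        self._key = None
        self._expires_at = 0.0

    def get(self):
        """Return the cached key, refreshing it once the TTL has elapsed."""
        now = time.monotonic()
        if self._key is None or now >= self._expires_at:
            self._key = self._fetch_key()
            self._expires_at = now + self._ttl
        return self._key

# Usage with a stubbed fetcher that counts how often it is called:
calls = []
def fetch_key_from_kms():
    calls.append(1)                    # stand-in for the real KMS API call
    return f"key-version-{len(calls)}"

cache = RotatingKeyCache(fetch_key_from_kms, ttl_seconds=0.05)
assert cache.get() == cache.get()      # second call hits the cache
time.sleep(0.06)
cache.get()                            # TTL elapsed -> key refreshed
assert len(calls) == 2
```

In production the TTL would be set comfortably below the 3-month rotation period (or the fetch retried on a decryption failure), so a stale key never outlives the rotation.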
Labels:
- Apache NiFi
08-25-2018
09:29 AM
This is the current NiFi flow in which the above-mentioned error is thrown by the GetSQS processor. Thanks, Bandhu Gupta
08-25-2018
09:23 AM
I am facing a connectivity issue from Apache NiFi to Amazon SQS & S3. I have given the details in the link below: https://community.hortonworks.com/questions/214811/connection-issue-in-integrating-apache-nifi-with-a.html Thanks in advance. Please suggest.
08-25-2018
07:36 AM
Facing an issue when trying to connect to SQS from NiFi using the GetSQS and FetchS3Object processors. Once all the details are entered and the processor is started, it throws the following error:

04:04:13 AEST - nifi-$nodename:9091 - ERROR GetSQS[id=ef15f694-0164-1000-0000-00002ea13324] Failed to receive messages from Amazon SQS due to com.amazonaws.SdkClientException: Unable to execute HTTP request: Connection reset: com.amazonaws.SdkClientException: Unable to execute HTTP request: Connection reset
Labels:
- Apache Hadoop
- Apache NiFi
03-24-2017
04:28 AM
I am using the upper and lower functions in my Hive query, so I just wanted to know whether that affects performance.
Tags:
- Hadoop Core
- HDFS
- performance
- Upgrade to HDP 2.5.3: ConcurrentModificationException When Executing Insert Overwrite: Hive
Labels:
- Apache Hadoop
02-02-2017
08:59 AM
Is there a way to add the gear (configuration) button on Ambari 2.2.2.0 as well?
02-02-2017
07:27 AM
I am currently using Ambari 2.2.2.0 and don't want to upgrade, so is there a way to add this option in the current Ambari version?
07-11-2016
01:47 AM
Yes, I ignored the warning and used /var for all the remaining operations, and it worked without any issue.
07-05-2016
04:31 AM
Any responses?
07-01-2016
01:29 PM
I need to integrate Qlik Sense with the HiveServer2 running in my cluster.
Labels:
- Apache Hive
06-29-2016
06:00 AM
I am giving access to my users but don't want to allow access to the Interpreter and Configuration tabs.
Labels:
- Apache Hadoop
- Apache Zeppelin
06-10-2016
12:41 AM
We are facing the above error while using the query below:

sqoop import --connect jdbc:oracle:thin:@XX:1521/DATABASENAME --username USER --password PWD --table SCHEMANAME.TABLENAME --hive-import --hive-table TABLENAME --hive-overwrite --num-mappers 1 --as-parquetfile

It is only an issue when we use Parquet and try to ingest the data into Hive; if we do the ingestion into HDFS with Parquet, it completes.
06-09-2016
11:04 AM
We only get the above error when using Parquet; otherwise the table gets pulled into Hive easily. Please keep in mind that we installed HDP without internet access, so it's quite possible that we missed something.
06-09-2016
10:34 AM
sqoop import --connect jdbc:oracle:thin:@XXX:XXXX/YYYY --username YYYYY --password YYYYY --table A.BBBB --hive-import --hive-database default --hive-table test15 --as-parquetfile -m 1

Job job_1465371735536_0055 failed with state FAILED due to: Job commit failed: java.lang.IllegalArgumentException: Wrong FS: file:/tmp/default/.temp/job_1465371735536_0055/mr/job_1465371735536_0055/b402d4ba-1a16-46bc-92c6-91fe141070d2.parquet, expected: hdfs://lxapp5524.dc.corp.telstra.com:8020
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:646)
06-08-2016
01:11 PM
I am getting this issue when using Sqoop with Parquet.
Labels:
- Apache Hadoop
- Apache Sqoop
06-08-2016
01:36 AM
This machine is for a POC where we are creating a single-node cluster, so we installed Ambari, and when we add our machine as a host, it gets added with the above warnings. Please suggest.
06-08-2016
01:14 AM
1 Kudo
I have attached the snapshots (error.png, issue-sandpit.png); please suggest, as we are currently installing HDP 2.4.0.0 on a test environment and Ambari shows this warning.
Labels:
- Apache Ambari
- Apache Hadoop
01-05-2016
10:49 AM
1 Kudo
I have data in a SQL Server RDBMS. The data is in French, and I need to save it on HDFS. I also need the data translated into English.
Labels:
- Apache Hadoop