Member since: 07-08-2016
Posts: 260
Kudos Received: 44
Solutions: 10

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3129 | 05-02-2018 06:03 PM |
| | 6273 | 10-18-2017 04:02 PM |
| | 2051 | 08-25-2017 08:59 PM |
| | 2807 | 07-21-2017 08:13 PM |
| | 10748 | 04-06-2017 09:54 PM |
03-09-2017
02:36 AM
@Matt Clarke I will try that. Could you edit your post and remove our server name? I had included it in my original post by mistake and have since removed it.
03-09-2017
02:36 AM
Hi,

Since the Kerberos ticket for my "nifi" service account expires every day (24 hours), my PutHDFS processor keeps failing, and I have to stop and start it as I mentioned in another post. To fix that, I am planning to regenerate the ticket every 12 hours so that it never expires. I am planning to use ExecuteScript or ExecuteStreamCommand for this, but I am getting errors. Since this is my first time using these processors, I feel like I am doing something wrong, so I am asking for your help. Here are the different approaches I tried; please let me know if this can be done.

ExecuteProcess: only the Command property is set, with the full kinit command: kinit -k -t /etc/security/keytabs/nifi.keytab nifi/server@domain.COM

20:26:58 UTC ERROR
aec3331b-015a-1000-c1e0-4a71379a015b ExecuteProcess[id=aec3331b-015a-1000-c1e0-4a71379a015b] Failed to create process due to java.io.IOException: Cannot run program "kinit -k -t /etc/security/keytabs/nifi.keytab nifi/server@domain.COM": error=2, No such file or directory: java.io.IOException: Cannot run program "kinit -k -t /etc/security/keytabs/nifi.keytab nifi/server@domain.COM": error=2, No such file or directory

ExecuteStreamCommand: only the Command Path property is set, with the full kinit command: kinit -k -t /etc/security/keytabs/nifi.keytab nifi/server@domain.COM

20:29:24 UTC ERROR
aed90433-015a-1000-527f-016723eb1b0c ExecuteStreamCommand[id=aed90433-015a-1000-527f-016723eb1b0c] ExecuteStreamCommand[id=aed90433-015a-1000-527f-016723eb1b0c] failed to process due to org.apache.nifi.processor.exception.ProcessException: java.io.IOException: Cannot run program "kinit -k -t /etc/security/keytabs/nifi.keytab nifi/server@domain.COM": error=2, No such file or directory; rolling back session: org.apache.nifi.processor.exception.ProcessException: java.io.IOException: Cannot run program "kinit -k -t /etc/security/keytabs/nifi.keytab nifi/server@domain.COM": error=2, No such file or directory
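A likely cause of the "error=2, No such file or directory" failure is that the entire string, arguments included, is being treated as the name of the executable; these processors generally expect the binary path (ideally the full path, e.g. /usr/bin/kinit) in one property and the arguments supplied separately. As an alternative, a minimal shell sketch like the one below could refresh the ticket on a schedule outside NiFi; the keytab and principal are taken from the post, while the kinit path and cron schedule are assumptions:

```bash
#!/usr/bin/env bash
# renew_nifi_ticket.sh -- hypothetical helper; the /usr/bin/kinit path is an assumption.
# Re-obtains a fresh Kerberos ticket for the nifi service account from its keytab.
KEYTAB=/etc/security/keytabs/nifi.keytab
PRINCIPAL="nifi/server@domain.COM"

/usr/bin/kinit -kt "$KEYTAB" "$PRINCIPAL"
```

Scheduled from cron, e.g. `0 */12 * * * /path/to/renew_nifi_ticket.sh`, this would renew the ticket twice a day; inside NiFi the same split applies, with only the kinit executable in Command Path and the keytab and principal passed as arguments.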
Labels:
- Apache NiFi
01-25-2017
07:55 PM
1 Kudo
Never mind... PutHiveQL fixed it, but I need to use a ReplaceText processor to produce a FlowFile containing the ALTER TABLE command.
01-25-2017
07:42 PM
Hi,

I created an external table in Hive with partitions pointing to an HDFS location. I am using NiFi to load CSV files into that location with PutHDFS, and then using SelectHiveQL to execute the ALTER TABLE command. It successfully loads the partitions, and I can query them from Hive, but the flow routes to failure because the ALTER TABLE does not produce any output. Here is the error:

SelectHiveQL[id=7c2b4a39-f772-1ea7-7e5b-48be29a34850] Unable to execute HiveQL select query ALTER TABLE hdf_moat.tbl_moat_display_sai ADD PARTITION (MOATDATE=20150412) LOCATION '/user/putarapasa/MOAT_Daily/20150412/' for StandardFlowFileRecord[uuid=9781e831-a494-4512-8121-7bed28908763,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1485365330956-237, container=default, section=237], offset=0, length=50994078],offset=0,name=GroupM_Nestle_Daily_20150412.csv,size=50994078] due to org.apache.nifi.processor.exception.ProcessException: java.sql.SQLException: The query did not generate a result set!; routing to failure: org.apache.nifi.processor.exception.ProcessException: java.sql.SQLException: The query did not generate a result set!

How can we handle this?

Regards,
Sai
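As the post above notes, PutHiveQL is the processor for statements that return no result set; SelectHiveQL expects rows back, and ALTER TABLE is DDL. A quick way to sanity-check the DDL itself outside NiFi is a beeline call like the sketch below; the JDBC URL is a placeholder assumption, while the table, partition, and location come from the error message:

```bash
# Minimal sketch: run the same DDL directly against HiveServer2.
# jdbc:hive2://hiveserver:10000 is a placeholder URL for this cluster.
beeline -u "jdbc:hive2://hiveserver:10000/hdf_moat" -e "
ALTER TABLE tbl_moat_display_sai
  ADD IF NOT EXISTS PARTITION (MOATDATE=20150412)
  LOCATION '/user/putarapasa/MOAT_Daily/20150412/';
"
```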
Labels:
- Apache Hive
- Apache NiFi
01-11-2017
07:58 PM
Hi @aawasthi, suppose I have an external table "test_display" which points to '/user/test/display', which already has 30 files, one for each day, and I can query the table. When I created the table I did not specify any partitions. Now, as you mentioned above, I tried to execute the command below:

ALTER TABLE test_display ADD PARTITION (moat_date='2017-01-05') LOCATION '/user/test/display'

Error occurred executing hive query: Error while compiling statement: FAILED: ValidationFailureSemanticException table is not partitioned but partition spec exists: {moat_date=2017-01-05}

Do I have to manually create the folder 2017-01-05 under '/user/test/display'?
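The ValidationFailureSemanticException here is about the table definition rather than the HDFS layout: ADD PARTITION only works on a table declared with PARTITIONED BY, so an unpartitioned external table has to be recreated (or a partitioned copy created) first. A rough beeline sketch, where the two data columns, the row format, and the new location are hypothetical placeholders for the real schema:

```bash
# Sketch only: column names and paths below are placeholders, not the real schema.
HS2="jdbc:hive2://hiveserver:10000/default"   # placeholder HiveServer2 URL

# 1) Create a partitioned copy of the table (the original was not declared PARTITIONED BY).
beeline -u "$HS2" -e "
CREATE EXTERNAL TABLE test_display_part (
  imp_id    STRING,
  imp_count BIGINT
)
PARTITIONED BY (moat_date STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/test/display_partitioned';
"

# 2) Now ADD PARTITION is valid, pointing at a per-day folder.
beeline -u "$HS2" -e "
ALTER TABLE test_display_part
  ADD PARTITION (moat_date='2017-01-05')
  LOCATION '/user/test/display_partitioned/moat_date=2017-01-05';
"
```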
01-11-2017
03:37 PM
Hi,

I created an external table in Hive with 150 columns. I have a .csv file for each day, and eventually I will have to load data for 4 years. I just loaded one month's worth of files, which turned into 2 million rows. I would now like to partition the table by date (which is the first column in the table and in the file).

1. How can I partition the existing table?
2. Can I do it dynamically when I load the other files?

Regards,
Sai
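One common pattern for both questions is to keep the existing unpartitioned table as a staging table and move its rows into a partitioned table with a dynamic-partition insert; the same INSERT can be rerun as new daily files land. A hedged sketch, where the table and column names are hypothetical stand-ins for the 150-column schema and the JDBC URL is an assumption:

```bash
# Sketch: move rows from the existing unpartitioned table into a partitioned one.
# test_display / test_display_part and the column names are placeholders.
beeline -u "jdbc:hive2://hiveserver:10000/default" \
  --hiveconf hive.exec.dynamic.partition=true \
  --hiveconf hive.exec.dynamic.partition.mode=nonstrict \
  -e "
-- The partition column must be selected last; Hive derives each row's partition from it.
INSERT INTO TABLE test_display_part PARTITION (moat_date)
SELECT imp_id, imp_count, moat_date
FROM test_display;
"
```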
Labels:
- Apache Hive
12-20-2016
04:10 PM
1 Kudo
@D'Andre McDonald Here is what I did, which is basically what is described above. I have 80 files going into 17 folders on a different server (assuming the NiFi user has access to the remote folders).

ListFile and FetchFile will be your GetFile. In my first UpdateAttribute I extract the relevant part of the filename, since my filename contains other information I am not interested in, and the destination folder on the remote system depends on the filename. If you can route your files based on the incoming filename as-is, you can skip this step. The incoming filename to the first UpdateAttribute is export-20150218000000-20150219000000-FLGFUP04_LIQ_DA_ASP_VLV_DDR_F_CV and the outgoing filename is FLGFUP04_LIQ_DA_ASP_VLV_DDR_F_CV.

My second UpdateAttribute processor is where I set the folder based on the filename. Here I used the Advanced tab, but you can use properties as well; basically, I wrote all my rules there, setting the destination folder based on the filename. Finally, in your PutFile you can use that attribute and have the folder created dynamically (see the sketch below).

Hope this helps...
Sai
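Purely as an illustration of the filename-to-folder logic those UpdateAttribute rules encode, and not the NiFi configuration itself, the same idea in a small shell sketch; the mapping rules and destination paths are made up for the example:

```bash
#!/usr/bin/env bash
# Illustration only: the filename-to-folder mapping the UpdateAttribute rules express.
incoming="export-20150218000000-20150219000000-FLGFUP04_LIQ_DA_ASP_VLV_DDR_F_CV"

# Strip the "export-<start>-<end>-" prefix to get the tag name.
tag="${incoming#export-*-*-}"            # -> FLGFUP04_LIQ_DA_ASP_VLV_DDR_F_CV

# Pick a destination folder based on the tag (rules here are hypothetical).
case "$tag" in
  FLGFUP04_*) dest="/data/flgfup04" ;;
  *)          dest="/data/other"    ;;
esac

echo "$tag -> $dest"
```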
12-08-2016
06:47 PM
Hi, I see in the documentation at the link below that there is an option to append: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.hadoop.PutHDFS/ but in my PutHDFS I don't see that option. I am on version HDF-2.0.0.0-579. Any idea why?
Labels:
- Apache Hadoop
- Apache NiFi
12-07-2016
04:21 PM
@kkawamura, Thank you, I will try that. So with the above example, it zips all the files under source-files into one zip file? Also, is there any way I can process this at the subfolder level? In my case, under a folder "January" I will have 31 subfolders, 01012016 to 01312016 (a folder for each day, with 8000 files in each folder). If I point the above command at the folder "January", it will try to zip all 31 subfolders into one archive, which may come to nearly 80 GB and may be difficult for me to transfer to HDFS through site-to-site. I was looking for one zip per subfolder, so in this case I would end up with 31 zip files.

Thanks again,
Sai
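One way to get one archive per subfolder is to loop over the day folders and archive each one separately, whether that runs from an ExecuteStreamCommand or a plain cron job. A minimal sketch, with the source and output paths as assumptions, using tar+gzip (zip would work the same way):

```bash
#!/usr/bin/env bash
# Sketch: one archive per daily subfolder; paths are assumptions.
SRC=/data/source-files/January
OUT=/data/archives/January
mkdir -p "$OUT"

for day in "$SRC"/*/; do               # e.g. 01012016/ ... 01312016/
  name=$(basename "$day")
  # -C keeps the archive paths relative to January, so the folder layout stays intact.
  tar -czf "$OUT/${name}.tar.gz" -C "$SRC" "$name"
done
```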
12-06-2016
09:52 PM
Hi, we have a use case where we need to keep the source files exactly as they are. For each day we will have a folder with 1000 files; we will process those 1000 files and send the required 100 files to HDFS, but we also want to compress and save the whole folder for future reference (we would like to preserve the source at the folder level, not at the file level). Is there any way I can do this using NiFi?

Regards,
Sai
Labels:
- Apache NiFi