Member since: 05-16-2016
Posts: 270
Kudos Received: 18
Solutions: 4
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1717 | 07-23-2016 11:36 AM |
| | 3053 | 07-23-2016 11:35 AM |
| | 1563 | 06-05-2016 10:41 AM |
| | 1157 | 06-05-2016 10:37 AM |
07-22-2017
02:07 AM
Checked through the Oozie dashboard. That's the same and only error I get there too.
07-04-2017
07:12 PM
@Simran Kaur I believe Oozie does not currently support tagging jobs with priorities. Check the following links for more details: Oozie Workflow Functional Spec, OOZIE-2892.
06-16-2017
08:07 AM
Hi @Simran Kaur, if you want to use this within the script, you can do the following:

set hivevar:DATE=current_date;
INSERT OVERWRITE DIRECTORY '/user/xyz/reports/oos_table_sales/${DATE}'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT * FROM outputs.oos_table_sale;

Cheers, Sagar
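One caveat worth noting (my observation, not part of the original answer): Hive variable substitution is purely textual, so ${DATE} inside the quoted path is replaced by the literal text of the variable's value, not an evaluated expression. A common alternative is to compute the date in the shell and pass it in with --hivevar. A minimal sketch, where the script name oos_table_sales.hql is hypothetical:

```shell
# Compute the partition date in the shell (assumed format YYYY-MM-DD) and
# pass it to Hive as a substitution variable; ${DATE} in the script is then
# replaced textually with the actual date string before execution.
DATE=$(date +%Y-%m-%d)
echo "Report directory: /user/xyz/reports/oos_table_sales/${DATE}"
# hive --hivevar DATE="${DATE}" -f oos_table_sales.hql
```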
06-02-2017
12:13 PM
2 Kudos
You're right, ListenHTTP cannot be used, as I said; I was thinking of HandleHttpRequest and HandleHttpResponse. With ListenHTTP you cannot actually serve multiple URLs. An example could be: https://pierrevillard.com/2017/01/31/nifi-and-oauth-2-0-to-request-wordpress-api/ Hope this helps.
06-01-2017
11:21 AM
1 Kudo
Simran, you can merge single JSON objects into a larger file before you put it to HDFS. There is a dedicated processor for this: MergeContent (https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.2.0/org.apache.nifi.processors.standard.MergeContent/index.html).

The processor also lets you configure how many JSON objects you want merged into one single file via the 'Minimum Number of Entries' property.

As a side note, when you have a processor on your canvas, you can right-click on it and go to 'Usage' to display the documentation of the processor. Hope that helps.
06-01-2017
11:23 AM
@Simran Kaur I had a feeling your issue was related to a missing config. Glad to hear you got it working. If this answer addressed your original question, please mark it as accepted. As far as your other question goes, I see you already started a new question (https://community.hortonworks.com/questions/105720/nifi-stream-using-listenhttp-processor-creates-too.html). That is the correct approach in this forum, we want to avoid asking unrelated questions in the same post. I will have a look at that post as well. Thank you, Matt
05-31-2017
01:51 PM
3 Kudos
@Simran Kaur All ports below 1024 are considered reserved, privileged ports and can only be bound to by processes run as the root user. NiFi can use these ports if it is running as root. The alternative is to set up your ListenHTTP processor to run on a non-privileged port and then add a port-forwarding rule to your iptables to redirect incoming requests on the privileged port to the non-privileged port you are using in NiFi:

iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-ports 8081

Thanks,
04-20-2017
05:56 PM
Hi @Simran Kaur, there is no way that first column can be considered the column name. If the structure changes, it is better to load the data into Hive as an Avro or Parquet file: even when the structure changes, there is no need to modify the old data, and new data can be inserted into the same Hive table.

Points to be noted:
1. An external table has to be used.
2. You might need a stage table before loading into the external Hive table, which should be in Avro/Parquet format.

Steps:
1. Create an external table, stored as Avro/Parquet, with the columns you have.
2. Load the CSV into the stage table, then load the stage data into the external Hive table.
3. If the columns change, drop the external table and re-create it with the additional fields.
4. Insert the new file by following steps 1-2.

This way there is no manual work needed to modify the existing data, as Avro by default shows NULL for columns that are present in the table but not in the file. The only manual work is to drop and re-create the table DDL. Let me know if you need any details, and if you feel it answers your question, please accept the answer.
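The steps above could be sketched roughly as follows; table names, column names, and paths are made up for illustration, not from the original post:

```sql
-- 1. Stage table holding the raw CSV (hypothetical names and paths)
CREATE TABLE sales_stage (id INT, amount DOUBLE, region STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

-- 2. Load the CSV into the stage table, then copy into the external table
LOAD DATA INPATH '/user/xyz/incoming/sales.csv' INTO TABLE sales_stage;

CREATE EXTERNAL TABLE sales (id INT, amount DOUBLE, region STRING)
STORED AS AVRO
LOCATION '/user/xyz/warehouse/sales';

INSERT INTO TABLE sales SELECT * FROM sales_stage;

-- 3. When the schema grows, drop and re-create the external table with the
--    extra column; the existing Avro files simply return NULL for new_col.
-- DROP TABLE sales;
-- CREATE EXTERNAL TABLE sales (id INT, amount DOUBLE, region STRING, new_col STRING)
--   STORED AS AVRO LOCATION '/user/xyz/warehouse/sales';
```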
09-06-2017
10:58 AM
1 Kudo
Try locking the table first and then unlocking it:

LOCK TABLE tablename SHARED;
UNLOCK TABLE tablename;