Member since
03-03-2017
74
Posts
9
Kudos Received
2
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2577 | 06-13-2018 12:02 PM
 | 4634 | 11-28-2017 10:32 AM
11-21-2017
03:19 PM
Hi, I am doing a Sqoop import of Sybase database tables into ORC Hive tables. Can I specify the location where the ORC file is created in this command? I have tried --hcatalog-home /bla/bla and --location /bla/bla, but neither works.

#!/bin/bash
sqoop import "-Dorg.apache.sqoop.splitter.allow_text_splitter=true" \
--driver net.sourceforge.jtds.jdbc.Driver \
--connect jdbc:jtds:sybase://10.9.179.5:4000/db1 \
--username foo \
-P \
--query "select * from table1 where CAST(csrp_koersel_dto as DATE)>'2017-11-18' AND \$CONDITIONS" \
--split-by cpr_nr --hcatalog-database default --hcatalog-table sqoop_hccatalog_example_new1 --create-hcatalog-table --hcatalog-storage-stanza "stored as orcfile"
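Neither --location nor --hcatalog-home controls where the imported data lands. One approach (a sketch, not verified against this setup) is to pre-create the Hive table at the desired HDFS path and drop --create-hcatalog-table from the Sqoop command, since the HCatalog import then writes into the existing table's location. The column list below is a hypothetical placeholder for the real table1 schema:

```sql
-- Hypothetical pre-created table; replace the column list with the real schema.
CREATE TABLE default.sqoop_hccatalog_example_new1 (
  cpr_nr STRING
  -- ... remaining table1 columns ...
)
STORED AS ORC
LOCATION '/bla/bla';
```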
Labels:
- Apache Sqoop
10-30-2017
03:14 PM
Hi, is there a way in NiFi to determine the charset of a flowfile and store the value in an attribute? We want to make sure that all our files are stored on Hadoop with a certain charset. Best regards
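NiFi has no built-in processor that reports a flowfile's charset, but an ExecuteScript processor could approximate one by trial decoding. A minimal sketch of the idea in plain Python (the `detect_charset` helper is hypothetical, not a NiFi API):

```python
def detect_charset(data: bytes) -> str:
    """Guess a charset by trial decoding, most specific first."""
    if data.startswith(b"\xef\xbb\xbf"):
        return "utf-8-sig"  # explicit UTF-8 byte-order mark
    for encoding in ("ascii", "utf-8", "iso-8859-1"):
        try:
            data.decode(encoding)
            # iso-8859-1 accepts any byte sequence, so it acts as the fallback
            return encoding
        except UnicodeDecodeError:
            continue
    return "unknown"  # unreachable in practice, kept for safety
```

The result could then be written to a flowfile attribute from the script. Note that trial decoding can only prove a file is *valid* in an encoding, not which encoding the producer intended.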
Labels:
- Apache NiFi
10-12-2017
01:11 PM
1 Kudo
@Shu Thank you so very much for that really good answer.
10-12-2017
12:33 PM
1 Kudo
Hello, I am ingesting files with names like I_XXX_20171002024210_LU2016DKABCDE12345-2017-10-02-TESTDAC2-000003_xxxxxxx.json and I_SSSSS_20171003024653_US2016US3661159602CC8FC7-4963-4304-A211-D8BCDA152625_xxxxxxx.json. I want to update/set a flowfile attribute using a regular expression on the filename. Is there a NiFi Expression Language function that supports that, i.e. ${filename:<regexec:()>}? This regular expression extracts the date yyyymmdd into capture group #2 from the filename: \_([A-Za-z]{3,5})\_([0-9]{8}).*xxxxxxx.json
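For reference, the capture groups of that regular expression can be checked outside NiFi; a quick sketch using the filenames from the question (with the literal dot in `.json` escaped):

```python
import re

PATTERN = re.compile(r"_([A-Za-z]{3,5})_([0-9]{8}).*xxxxxxx\.json")

def extract_date(filename: str) -> str:
    """Return the yyyymmdd value from capture group #2, or '' if no match."""
    match = PATTERN.search(filename)
    return match.group(2) if match else ""
```

In NiFi Expression Language itself, `replaceAll` with a back-reference is one way to pull out a capture group, e.g. `${filename:replaceAll('.*?_([A-Za-z]{3,5})_([0-9]{8}).*', '$2')}`.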
Labels:
- Apache NiFi
10-10-2017
02:15 PM
@Abdelkrim Hadjidj This was exactly what I was looking for. Thank you very much for this beautiful and simple approach.
10-04-2017
01:01 PM
Hi, when we ingest data we receive a control file for each data file. The control file is a JSON file containing an MD5 value. That MD5 hash value should match the file we are currently ingesting; otherwise we will not ingest it. So far I have done the following:
1. Fetched data1.xml and data1_control.json from an SFTP server.
2. Used RouteOnAttribute to split the flow in two: one route for the data1.xml file and one for the control file.
3. Used HashContent to get the MD5 hash value from the data1.xml file.
4. Used EvaluateJsonPath to get the MD5 tag into a flowfile attribute.
Now I am stuck: I tried to put my control file's MD5 value into PutDistributedMapCache and used DetectDuplicate, but it wouldn't work. How can this be solved?
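Outside NiFi, the comparison step itself is just "hash the data file and compare to the value from the control JSON". A minimal sketch in plain Python (the `md5` key name is an assumption about the control file's layout):

```python
import hashlib
import json

def should_ingest(data_bytes: bytes, control_json: str) -> bool:
    """Compare the MD5 of the data file with the value in the control file."""
    expected = json.loads(control_json)["md5"]
    actual = hashlib.md5(data_bytes).hexdigest()
    return actual == expected
```

In NiFi terms this is the same check RouteOnAttribute can make once the HashContent result and the EvaluateJsonPath result are attributes on the same flowfile, e.g. after bringing the pair back together with MergeContent on a shared correlation attribute.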
Labels:
- Apache NiFi
09-22-2017
09:32 AM
This data is transferred from CSV to Avro and from Avro to JSON in order to fetch one column's value into a flowfile attribute and route on that value. But my EvaluateJsonPath processor cannot recognize the path. Log output:

2017-09-22 11:10:32,402 WARN [Timer-Driven Process Thread-9] o.a.n.p.standard.EvaluateJsonPath EvaluateJsonPath[id=d7e53856-262e-12a7-b651-7eff7b6fd820] FlowFile 6908917 could not find path $['zip'] for attribute key zip.: com.jayway.jsonpath.PathNotFoundException: Property ['zip'] not found in path

Avro schema attribute (inferred.avro.schema) and example data in JSON format:
{ "type" : "record", "name" : "CSRPrecord", "doc" : "Schema generated by Kite", "fields" : [ { "name" : "name", "type" : "string", "doc" : "Type inferred from 'Simon jespersen'" }, { "name" : "adresse", "type" : "string", "doc" : "Type inferred from 'Brejning S.ndetrgade 66'" }, { "name" : "zip", "type" : "long", "doc" : "Type inferred from '7080'" }, { "name" : "by", "type" : "string", "doc" : "Type inferred from 'B.rkop'" } ] }
Example data (content of the flowfile):
[{"name": "else", "adresse": "route 66", "zip": 7000, "by": "Hortownworks City"},{"name": "Karina", "adresse": "Route 66", "zip": 7001, "by": "Hadoop City"},{"name": "Luke Skyewalker", "adresse": "some where in the universe", "zip": 1111, "by": "SinCity"},{"name": "Superman", "adresse": "Krypto street 1", "zip": 0001, "by": "Metroplitan"}]
I cannot understand why I cannot evaluate the JSON path. See the attached picture for my EvaluateJsonPath configuration.
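One likely cause: the flowfile content is a JSON *array*, so `$['zip']` looks for a `zip` property on the array itself and fails; the property lives on each element, reachable as `$[0]['zip']` or after splitting the array with SplitJson. A quick sketch of the structure (a trimmed version of the example data, with the leading zero dropped since `0001` is not valid JSON):

```python
import json

flowfile = '[{"name": "else", "zip": 7000}, {"name": "Karina", "zip": 7001}]'
records = json.loads(flowfile)

# records is a list: there is no records["zip"], only records[i]["zip"]
zips = [record["zip"] for record in records]
```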
Labels:
- Apache NiFi
09-22-2017
06:00 AM
Hi, I have a large CSV file where fields are separated by tabs, with approximately 150 columns. I need to filter this file and split it into two routes based on a column value: status = 6 or status = 7. I have the data schema, so I know every field. What would be the best approach? I was thinking of:
1. Converting it to Avro with InferAvroSchema -> ConvertCSVToAvro, adding the status value to an attribute, and routing with RouteOnAttribute; but I cannot find a processor that parses Avro fields into attributes.
2. Converting it to JSON and parsing values into attributes, but this seems complex to configure.
I am running NiFi - Version 1.1.0.2.1.2.0-10. Is there a best practice for handling this?
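Whatever the NiFi route, the underlying split is simple; a sketch in plain Python using the tab delimiter and a hypothetical `status` column name:

```python
import csv
import io

def split_by_status(tsv_text: str):
    """Split tab-separated rows into two lists on status == '6' or '7'."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    six, seven = [], []
    for row in reader:
        if row["status"] == "6":
            six.append(row)
        elif row["status"] == "7":
            seven.append(row)
    return six, seven
```

In NiFi 1.1 the RouteText processor can do the same line by line (matching on a regular expression or column position) without the Avro/JSON round trip.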
Labels:
- Apache NiFi
09-15-2017
08:07 AM
Hi @Wynner, thanks for the reply. I created the JAAS file (jaas-client.config, KafkaClient) on my NiFi host:

[root@sktudv01hdf01 nifi]# cat /etc/nifi/jaas-client.config
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
storeKey=true
keyTab="/etc/security/keytabs/kafka.service.keytab"
principal="kafka/_HOST@CCTA.DK";
};
[root@sktudv01hdf01 nifi]#

My Kafka configuration and my NiFi publisher are configured as shown in the attached screenshots. The log file looks like this:

2017-09-15 10:03:34,878 WARN [Timer-Driven Process Thread-10] o.a.n.p.kafka.pubsub.PublishKafka PublishKafka[id=7a067740-015e-1000-ffff-ffffaeaac0ec] Processor Administratively Yielded for 1 sec due to processing failure
2017-09-15 10:03:34,878 WARN [Timer-Driven Process Thread-10] o.a.n.c.t.ContinuallyRunProcessorTask Administratively Yielding PublishKafka[id=7a067740-015e-1000-ffff-ffffaeaac0ec] due to uncaught Exception: org.apache.kafka.common.KafkaException: Failed to construct kafka producer
org.apache.kafka.common.KafkaException: Failed to construct kafka producer
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:321) ~[na:na]
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:156) ~[na:na]
at org.apache.nifi.processors.kafka.pubsub.PublisherPool.createLease(PublisherPool.java:61) ~[na:na]
at org.apache.nifi.processors.kafka.pubsub.PublisherPool.obtainPublisher(PublisherPool.java:56) ~[na:na]
at org.apache.nifi.processors.kafka.pubsub.PublishKafka.onTrigger(PublishKafka.java:312) ~[na:na]
Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. Make sure -Djava.security.auth.login.config property passed to JVM and the client is configured to use a ticket cache (using the JAAS configuration setting 'useTicketCache=true)'. Make sure you are using FQDN of the Kafka broker you are trying to connect to. not available to garner authentication information from the user
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:74) ~[na:na]
at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:60) ~[na:na]
at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:79) ~[na:na]
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:271) ~[na:na]
at org.apache.kafka.common.security.kerberos.Login.login(Login.java:298) ~[na:na]
at org.apache.kafka.common.security.kerberos.Login.<init>(Login.java:104) ~[na:na]
at org.apache.kafka.common.security.kerberos.LoginManager.<init>(LoginManager.java:44) ~[na:na]
at org.apache.kafka.common.security.kerberos.LoginManager.acquireLoginManager(LoginManager.java:85) ~[na:na]
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:55) ~[na:na]
It seems that my JAAS file doesn't work properly.
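The exception itself points at the likely cause: the JVM was never told where the JAAS file is ("Make sure -Djava.security.auth.login.config property passed to JVM"). In NiFi that is done in conf/bootstrap.conf with an extra java.arg line; the number 15 below is arbitrary, it just must not collide with an existing java.arg entry:

```
# conf/bootstrap.conf
java.arg.15=-Djava.security.auth.login.config=/etc/nifi/jaas-client.config
```

NiFi needs a restart for bootstrap.conf changes to take effect.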
09-14-2017
01:50 PM
@Wynner, we are on Apache NiFi - Version 1.1.0.2.1.2.0-10.