Member since: 05-16-2016
Posts: 270
Kudos Received: 18
Solutions: 4
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1717 | 07-23-2016 11:36 AM |
| | 3057 | 07-23-2016 11:35 AM |
| | 1567 | 06-05-2016 10:41 AM |
| | 1162 | 06-05-2016 10:37 AM |
04-03-2018
06:17 PM
I copied hdfs-site.xml and core-site.xml from the Hadoop master node. However, they contain private IPs and local file system references. I assume I need to replace the private FQDN with the public one? Apart from that, what about the local file system paths they reference? Can I have sample hdfs-site.xml and core-site.xml files that I can use in the PutHDFS processor for a remote HDFS server?

Edit: I have replaced the private FQDN with the public one and now I get:

ERROR [StandardProcessScheduler Thread-6] o.apache.nifi.processors.hadoop.PutHDFS PutHDFS[id=487275f5-3155-3e96-6742-77d854d67d43] HDFS Configuration error - org.apache.hadoop.net.ConnectTimeoutException: 1000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=publicfqdn.compute.amazonaws.com/172.31.x.x:8020]: {}
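For illustration, a minimal pair of client-side config snippets of the kind PutHDFS needs. The hostname below is a placeholder, not an actual cluster value, and `dfs.client.use.datanode.hostname` is one setting commonly suggested when a cluster advertises private IPs to external clients; treat this as a sketch to adapt, not a verified fix:

```xml
<!-- core-site.xml: namenode.example.com is a placeholder for the
     NameNode's public FQDN -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>

<!-- hdfs-site.xml: ask the HDFS client to connect to DataNodes by
     hostname rather than by the (private) IP the NameNode reports -->
<configuration>
  <property>
    <name>dfs.client.use.datanode.hostname</name>
    <value>true</value>
  </property>
</configuration>
```

Note the error above shows the public FQDN still resolving to a private 172.31.x.x address, so DNS resolution from the NiFi host also matters here.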
Labels:
- Apache NiFi
09-26-2017
08:58 AM
I understand I need to use ExtractText, but it does not output anything to the ExecuteSQL processor. How do I do that? I have not added anything additional in the ExtractText processor's configuration, but I have routed both the matched and unmatched relationships to the ExecuteSQL processor. What do I pass as the SQL select query in the ExecuteSQL processor? Please suggest.
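ExtractText works by applying user-defined regular expressions (added as dynamic properties) to the flowfile content and storing capture group 1 in an attribute named after the property, which ExecuteSQL can then reference via expression language. A rough Python illustration of that capture-group mechanic, where the property name `user.id` and the content format are hypothetical examples:

```python
import re

# ExtractText-style behavior: apply a regex to flowfile content and
# store capture group 1 under the dynamic property's name.
# "user.id" is a hypothetical property name for this sketch.
content = "record id=12345 status=OK"
attributes = {}
match = re.search(r"id=(\d+)", content)
if match:
    attributes["user.id"] = match.group(1)  # referenced as ${user.id} in NiFi

# The SQL Select Query property could then interpolate the attribute:
query = f"SELECT * FROM users WHERE id = {attributes['user.id']}"
```

If no regex property is added, ExtractText has nothing to match, which would explain empty output downstream.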
08-18-2017
07:31 AM
Thanks Bryan. I now get the query parameters in the exact same format, but I would rather have them saved in a JSON structure in the files. Is there a way to do that as well? @Bryan Bende
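One common pattern for this (an assumption on my part, not something confirmed in the thread) is an AttributesToJSON-style step: collect the extracted parameter attributes into a JSON document before writing to HDFS. A minimal Python sketch of that transformation, with example attribute names taken from the webhook URL in the earlier post:

```python
import json

# Attribute map as HandleHttpRequest/ExtractText might populate it;
# the keys here are examples from the webhook query string.
attrs = {
    "status": "SUCCESS",
    "phoneNo": "919977545965",
    "errCode": "000",
}

# Serialize the attribute map into the flowfile content as JSON,
# which is what an AttributesToJSON-style step produces.
flowfile_content = json.dumps(attrs, sort_keys=True)
```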
08-11-2017
12:55 PM
I need a URL that acts as a listener for a webhook. I will register the URL with them, and they will send GET requests that look like this:

Fri Aug 11 17:24:13 IST 2017:Code:405, URL:http://serverURL:port/gushupsmslistener?externalId=3380112592180823371-525731073514300237&deliveredTS=1502452450000&status=SUCCESS&cause=SUCCESS&phoneNo=919977545965&grpName=&errCode=000, timetaken:577, EntityId:1014907793, CauseId:3380112592180823371, InternalId:525731073514300237, PhoneNo:919971295965, globalErrCode:000

What is the best way in NiFi to get data from a GET request?

Edit: I've set up a HandleHttpRequest processor and started it, giving it an allowed path and the port. Now I need to get whatever values are in the query parameters and store them in HDFS. How do I extract and store them? If I point HandleHttpRequest directly at PutHDFS, it creates an empty file. Besides, what exactly is the upstream connection in HandleHttpResponse, and how do I set it up?
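The empty-file behavior is consistent with a GET request carrying its data in the query string rather than the body. HandleHttpRequest exposes query parameters as flowfile attributes, so the values have to be pulled from the query string, not the content. A Python sketch of that query-string parsing, using a placeholder host/port and the parameter names from the webhook above:

```python
from urllib.parse import urlparse, parse_qs

# Placeholder host/port; query parameters copied from the webhook URL.
url = ("http://server.example.com:8080/gushupsmslistener"
       "?externalId=3380112592180823371-525731073514300237"
       "&deliveredTS=1502452450000&status=SUCCESS&phoneNo=919977545965")

# parse_qs maps each parameter to a list of values; take the first of
# each, mirroring how single-valued query params behave.
params = {k: v[0] for k, v in parse_qs(urlparse(url).query).items()}
```

These per-parameter values are what you would then serialize into the flowfile content (e.g. as JSON) before PutHDFS, so the file is no longer empty.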
Labels:
- Apache NiFi
07-22-2017
02:07 AM
Checked through the Oozie dashboard. That's the same and only error I get there too.
07-22-2017
02:00 AM
@Viswa: I get "Log aggregation has not completed or is not enabled."
07-21-2017
10:32 AM
Starting yesterday, I have faced this problem multiple times with different queries. What seems wrong here? I cannot figure anything out, yet it is coming up in queries that used to work just fine. I have checked that the temp directory has the required space, and my nodes are healthy too. In stderr, I get:

INFO : Hadoop job information for Stage-11: number of mappers: 1; number of reducers: 1
INFO : 2017-07-21 15:52:16,549 Stage-11 map = 0%, reduce = 0%
INFO : 2017-07-21 15:52:23,783 Stage-11 map = 100%, reduce = 0%, Cumulative CPU 4.23 sec
INFO : 2017-07-21 15:52:33,181 Stage-11 map = 100%, reduce = 100%, Cumulative CPU 6.89 sec
INFO : MapReduce Total cumulative CPU time: 6 seconds 890 msec
INFO : Ended Job = job_1499692338187_45811
INFO : Starting task [Stage-20:MAPREDLOCAL] in serial mode
ERROR : Execution failed with exit status: 1
ERROR : Obtaining error information
ERROR :
Task failed!
Task ID:
Stage-20
Logs:
ERROR : /var/log/hive/hadoop-cmf-CD-HIVE-XCVXskZf-HIVESERVER2-ip-172-31-4-192.ap-south-1.compute.internal.log.out
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapredLocalTask
INFO : MapReduce Jobs Launched:
INFO : Stage-Stage-1: Map: 1 Reduce: 1 Cumulative CPU: 5.59 sec HDFS Read: 3487128 HDFS Write: 192 SUCCESS
INFO : Stage-Stage-7: Map: 1 Reduce: 1 Cumulative CPU: 7.18 sec HDFS Read: 3484698 HDFS Write: 1572 SUCCESS
INFO : Stage-Stage-11: Map: 1 Reduce: 1 Cumulative CPU: 6.89 sec HDFS Read: 3484445 HDFS Write: 1568 SUCCESS
INFO : Total MapReduce CPU Time Spent: 19 seconds 660 msec
INFO : Completed executing command(queryId=hive_20170721155151_da599ee2-d378-46ac-a2b3-a9dfd9824850); Time taken: 70.153 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapredLocalTask (state=08S01,code=1)
Closing: 0: jdbc:hive2://ip-172-31-4-192.ap-south-1.compute.internal:10000/default
Intercepting System.exit(2)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.Hive2Main], exit code [2]
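One observation on the log: the failing stage (Stage-20) is a MAPREDLOCAL task, which Hive generates when it converts a join to a map-side join and builds the hash table locally. A commonly suggested workaround to try (an assumption here, not a confirmed diagnosis for this job) is to disable that conversion so the join runs as a regular MapReduce job:

```sql
-- Sketch of a common workaround for MapredLocalTask failures:
-- disable automatic map-join conversion so Hive skips the local task.
SET hive.auto.convert.join=false;
```

The real root cause would be in the HiveServer2 log file the error message points to.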
Labels:
- Apache Hive
07-19-2017
05:12 AM
@gvenkatesh: Here:
http://www.yourtechchick.com/hadoop/hive/step-step-guide-sqoop-incremental-imports/
http://www.yourtechchick.com/sqoop/run-sqoop-jobs-from-oozie/
07-12-2017
06:05 AM
@gvenkatesh: Here:
http://www.yourtechchick.com/sqoop/run-sqoop-jobs-from-oozie/
07-04-2017
12:11 PM
I did. Nothing changed.