Member since: 06-29-2017
12 Posts · 0 Kudos Received · 0 Solutions
08-28-2018
06:59 PM
Were you able to resolve this?
08-08-2018
08:08 PM
I am facing the same issue. Were you able to resolve this, @Anji Raju?
05-23-2018
07:35 PM
I am trying the following command:

sqoop import --connect "jdbc:jtds:sqlserver://jxx/tmwus;useNTLMv2=true;domain=CROWLEY" --table expedite_audit_tbl --username xx --password xx --incremental lastmodified --check-column updated_dt --hcatalog-database tmwus --hcatalog-table expedite_audit_tbl --create-hcatalog-table --hcatalog-storage-stanza "stored as orc" --last-value '2018-05-22 00:00:00' --split-by ord_hdrnumber --num-mappers 1

However, I receive the following error:

18/05/23 15:26:12 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [expedite_audit_tbl] AS t WHERE 1=0
18/05/23 15:26:12 ERROR tool.ImportTool: Imported Failed: There is no column found in the target table expedite_audit_tbl. Please ensure that your table name is correct.

I know the table exists, because the following query-based import works:

sqoop import --connect "jdbc:jtds:sqlserver://xx;useNTLMv2=true;domain=CROWLEY" --query "select * from tmwus.dbo.expedite_audit_tbl WHERE updated_dt<='2018-05-22 00:00:00' AND \$CONDITIONS" --username xx --password xx --hcatalog-database tmwus --hcatalog-table expedite_audit_tbl --create-hcatalog-table --hcatalog-storage-stanza "stored as orc" --split-by ord_hdrnumber --num-mappers 12
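A guess at the cause, not a confirmed fix: the failing command passes a bare table name to --table, while the working one names the schema explicitly (tmwus.dbo.expedite_audit_tbl), so the column lookup may be running against the login's default schema. If switching from jTDS to Microsoft's JDBC driver is an option, Sqoop's SQL Server connector documents a --schema extra argument after the `--` separator; a sketch with placeholder host and credentials:

```shell
# Sketch only: with a jdbc:sqlserver:// URL Sqoop selects its SQL Server
# connector, which accepts connector-specific arguments after "--".
sqoop import \
  --connect "jdbc:sqlserver://host;databaseName=tmwus" \
  --username xx --password xx \
  --table expedite_audit_tbl \
  --incremental lastmodified --check-column updated_dt \
  --last-value '2018-05-22 00:00:00' \
  --hcatalog-database tmwus --hcatalog-table expedite_audit_tbl \
  --create-hcatalog-table --hcatalog-storage-stanza "stored as orc" \
  --split-by ord_hdrnumber --num-mappers 1 \
  -- --schema dbo   # qualify the table as dbo.expedite_audit_tbl
```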
11-14-2017
08:44 PM
I am trying to run the following query, where I need to delete the records that match on the same 3 columns. However, Hive will not accept the query below. Does anyone have an idea how I can accomplish this?

delete from cargowise.shipmentdetails
where consolid,shipmentid,branch in (select distinct(consolid,shipmentid,branch) from cargowise.shipmentdetails_temp);
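Hive's WHERE clause does not accept a multi-column tuple with IN. One way to express the same delete, assuming cargowise.shipmentdetails is a transactional (ACID) table on Hive 2.2 or later, is a MERGE; a sketch, not a tested solution:

```sql
-- Sketch: requires a transactional (ACID) target table and Hive 2.2+.
merge into cargowise.shipmentdetails d
using (select distinct consolid, shipmentid, branch
         from cargowise.shipmentdetails_temp) t
on d.consolid = t.consolid
   and d.shipmentid = t.shipmentid
   and d.branch = t.branch
when matched then delete;
```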
10-30-2017
06:28 PM
I have an external table "shipmentprofile_temp" which holds the list of keys that need to be deleted from my transactional table "shipmentprofile". The query below starts executing but never completes:

delete from cargowise.shipmentprofile where transcnno in (select transcnno from cargowise.shipmentprofile_temp);
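When a subquery DELETE stalls, a common Hive workaround is to rebuild the table without the unwanted rows using an anti-join style INSERT OVERWRITE instead of a row-level delete. A sketch under assumptions (the target must be overwritable, and selecting p.* assumes no partition-column complications):

```sql
-- Sketch: keep only rows whose key does NOT appear in the temp table,
-- then overwrite the target with the survivors.
insert overwrite table cargowise.shipmentprofile
select p.*
from cargowise.shipmentprofile p
left join cargowise.shipmentprofile_temp t
  on p.transcnno = t.transcnno
where t.transcnno is null;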
09-22-2017
07:22 PM
We do not have access to the vendor code to see how the file is really generated. There are options to generate an .xls or an .xlsx file, so I believe it should be .xlsx. I do not know what version of Excel they use either; the application is a black box.
09-21-2017
06:17 PM
Were you able to see what the difference in the files is? That file is automatically generated by a vendor app. I need to be able to convert it to CSV so I can use it in Hive.
09-21-2017
04:53 PM
I saw this post and one other; however, neither of them has a resolution. I am using an .xlsx file. Please see the attached jax-shipment-profile-report-monday-18-september-20.zip sample file.
09-21-2017
02:05 PM
I am using the ConvertExcelToCSVProcessor in NiFi to convert an .xlsx file to CSV. However, the processor is throwing the following error. I have attached an image of my flow as well.

ConvertExcelToCSVProcessor[id=ba4c3f67-dd21-1af9-95a3-1887accf18ba] failed to process session due to org.apache.nifi.processor.exception.FlowFileHandlingException: StandardFlowFileRecord[uuid=9ef1d06a-c2a4-4f7a-826f-ebfc33f3eef0,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1506002021780-183, container=default, section=183], offset=0, length=912278],offset=0,name=4289436879924664,size=912278] transfer relationship not specified: {}
org.apache.nifi.processor.exception.FlowFileHandlingException: StandardFlowFileRecord[uuid=9ef1d06a-c2a4-4f7a-826f-ebfc33f3eef0,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1506002021780-183, container=default, section=183], offset=0, length=912278],offset=0,name=4289436879924664,size=912278] transfer relationship not specified
at org.apache.nifi.controller.repository.StandardProcessSession.checkpoint(StandardProcessSession.java:248)
at org.apache.nifi.controller.repository.StandardProcessSession.commit(StandardProcessSession.java:318)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:28)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
09-18-2017
01:28 PM
Thanks Matt, that worked. I was curious why it does not work with the "Entire Text" evaluation mode, though.
09-14-2017
03:29 PM
I have line feed (LF) characters in my CSV file that I would like to replace with a pipe character (|); the ASCII value of LF is 10. How can I do that using the ReplaceText processor?
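In NiFi's ReplaceText this is typically a regex replacement (Search Value \n, Replacement Value |). The same substitution can be sanity-checked on the shell first; the sample data below is made up:

```shell
# Replace every line feed (ASCII 10) with a pipe, the same substitution
# ReplaceText would perform with Search Value "\n" and Replacement Value "|".
printf 'a,b\nc,d\ne,f' | tr '\n' '|'
# prints: a,b|c,d|e,f
```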
09-06-2017
08:27 PM
@Aruna dadi, how did you resolve this issue?