Created 08-23-2017 04:41 AM
I have configured an ExecuteProcess processor with the following settings:
1. Command = /home/hadoop/software/sqoop-1.4.6.bin__hadoop-0.23/bin/sqoop-export (the full path to the sqoop-export script)
2. Command Arguments = --connect jdbc:mysql://localhost/test --username root --password root --table test --export-dir /home/hadoop/input.csv (using a MySQL database)
Sqoop execution begins, but I see the errors below:
NiFi logs: SplitAvro : IO Exception : Not a data file
Hadoop NameNode logs: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:dr
Has anyone tried running Sqoop via NiFi?
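Before wiring this into NiFi, it can help to run the same export by hand on the NiFi host to separate Sqoop problems from NiFi problems. This is a sketch using the paths and credentials from the post above; note that Sqoop's `--export-dir` expects an HDFS directory, not a local path like /home/hadoop/input.csv, so `/user/hadoop/input` below is a hypothetical HDFS path to substitute with your own:

```shell
# Run the same export manually; failures here are Sqoop/cluster issues,
# not NiFi issues.
/home/hadoop/software/sqoop-1.4.6.bin__hadoop-0.23/bin/sqoop-export \
  --connect jdbc:mysql://localhost/test \
  --username root --password root \
  --table test \
  --export-dir /user/hadoop/input   # hypothetical HDFS dir holding the CSV
```

Running this as the same OS user that NiFi runs as will also surface the HDFS permission error directly.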
Created 08-23-2017 07:59 AM
It looks like you are running the command as the root user, so Sqoop is checking for that user's home directory in HDFS.
The error Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:dr means it is trying to WRITE under the /user directory, for which your user lacks permission. You can solve this in two ways:
1) Run the command as the hdfs user (if you have the permissions).
2) Have a home directory created for your user (root in this case) with the required permissions, and run the command.
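For option 2, a minimal sketch of creating the missing HDFS home directory, assuming you can use sudo to act as the hdfs superuser (the user and path follow the error message above):

```shell
# Create a home directory for root in HDFS and hand ownership to root.
sudo -u hdfs hdfs dfs -mkdir -p /user/root
sudo -u hdfs hdfs dfs -chown root:root /user/root

# Verify the new directory and its permissions.
hdfs dfs -ls /user
```

After this, Sqoop run as root has a writable staging area under /user/root.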
Created 08-23-2017 08:28 AM
Thanks @Venkata Sudheer Kumar M. You rightly pointed out the issue: the command was executing as the root user by default.
Resolution:
I updated the following property in bootstrap.conf under the NiFi conf folder:
run.as=hadoop
The command then executed as the hadoop user and ran successfully.
Sqoop export can be invoked via the NiFi ExecuteProcess processor.
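For reference, a sketch of the relevant fragment of conf/bootstrap.conf (the run.as property is standard NiFi bootstrap configuration; the change only takes effect after a restart):

```shell
# conf/bootstrap.conf
# Run NiFi (and therefore ExecuteProcess child processes) as this OS user.
run.as=hadoop
```

After saving the change, restart NiFi (bin/nifi.sh restart) so the bootstrap picks up the new user.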