
Error when I try to import a table from Postgres to Hive

When I try to execute a shell job with Sqoop (or a Sqoop job) in Workflow Manager, I get an error.

I am trying to import tables from Postgres into Hive:

2017-11-06 13:58:46,622  WARN ShellActionExecutor:523 - SERVER[ambari.4linux.com.br] USER[admin] GROUP[-] TOKEN[] APP[Fluxo_BigDatinha] JOB[0000003-171106133302301-oozie-oozi-W] ACTION[0000003-171106133302301-oozie-oozi-W@Verifica_Sujeira_HDFS] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.ShellMain], exit code [1]

My Shell Script:

sqoop import --connect jdbc:postgresql://10.0.0.78:5433/cr4prd --username ambari --password xxxxxxx --table tb_con_contatos --target-dir /user/admin/teste2

My Cluster Configuration:

42475-screenshot-2017-11-6-ambari-bigdatinha.png

I am running it in /user/admin/sqoop, and inside this directory there is a folder named lib:

42476-screenshot-2017-11-6-ambari-bigdatinha1.png

1 ACCEPTED SOLUTION

Super Mentor

@Guilherme Colla

There is not much logging in the provided WARN message to identify the root cause of the issue.

But there are a few things we can check: is the Postgres JDBC driver jar installed in the Oozie sharelib, and is the Sqoop client installed properly on all the nodes?


17 REPLIES

Super Mentor

@Guilherme Colla

There is not much logging in the provided WARN message to identify the root cause of the issue.

But there are a few things we can check: is the Postgres JDBC driver jar installed in the Oozie sharelib, and is the Sqoop client installed properly on all the nodes?

@Jay Kumar SenSharma

My sharelib list is:

oozie admin -oozie http://ambari.4linux.com.br:11000/oozie -shareliblist
[Available ShareLib]
hive
distcp
mapreduce-streaming
spark
oozie
hcatalog
hive2
sqoop
pig
spark_orig

But I do have postgres.jar inside the sqoop folder:

42477-screenshot-2017-11-6-ambari-bigdatinha2.png

When I try to execute any command as the oozie user, I get this error:

ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: org.postgresql.Driver
java.lang.RuntimeException: Could not load db driver class: org.postgresql.Driver
    at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:875)
    at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
    at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
    at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
    at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
    at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:328)
    at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1853)
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1653)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:243)

But when I execute any command as root, it finishes successfully.

Super Mentor

@Guilherme Colla

As mentioned earlier, the symptoms indicate that you have not placed the PostgreSQL driver inside your Oozie sharelib.

Can you please try that? You can put the PostgreSQL driver jar inside "/user/oozie/share/lib/lib_*/sqoop" on HDFS.

Then check:

# hdfs dfs -ls /user/oozie/share/lib/lib_*/sqoop/
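If the driver is missing there, a sequence like the following should install it. This is a sketch: the sharelib timestamp directory and the local jar path are examples, so adjust them to your cluster.

```shell
# Copy the local PostgreSQL JDBC jar into the Oozie sharelib on HDFS
hdfs dfs -put /usr/share/java/postgresql-jdbc.jar /user/oozie/share/lib/lib_20171030154340/sqoop/

# Tell the running Oozie server to pick up the new sharelib contents
oozie admin -oozie http://ambari.4linux.com.br:11000/oozie -sharelibupdate

# Verify the jar is now visible in the sharelib
hdfs dfs -ls /user/oozie/share/lib/lib_*/sqoop/ | grep postgresql
```

Without the sharelibupdate step the Oozie server keeps serving the old sharelib, so the launcher job never sees the new jar.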


@Jay Kumar SenSharma

I have the Postgres jar inside /user/oozie/share/lib/*/sqoop:

hdfs dfs -ls /user/oozie/share/lib/*/sqoop | grep sql
-rw-r--r--   3 root  hdfs       1208 2017-11-01 13:54 /user/oozie/share/lib/lib_20171030154340/sqoop/db.hsqldb.properties
-rw-r--r--   3 root  hdfs       1081 2017-11-01 13:55 /user/oozie/share/lib/lib_20171030154340/sqoop/db.hsqldb.script
-rw-r--r--   3 oozie hdfs     643727 2017-10-30 15:46 /user/oozie/share/lib/lib_20171030154340/sqoop/hsqldb-1.8.0.7.jar
-rw-r--r--   3 root  hdfs     883898 2017-11-01 12:56 /user/oozie/share/lib/lib_20171030154340/sqoop/mysql-connector-java.jar
-rw-r--r--   3 root  hdfs     579785 2017-10-31 15:40 /user/oozie/share/lib/lib_20171030154340/sqoop/postgresql-9.2-1002.jdbc4.jar
-rw-r--r--   3 root  hdfs     446067 2017-11-06 16:01 /user/oozie/share/lib/lib_20171030154340/sqoop/postgresql-jdbc.jar

Super Mentor

@Guilherme Colla

After putting the jar at "/user/oozie/share/lib/lib_20171030154340/sqoop/postgresql-9.2-1002.jdbc4.jar", did you run the "sharelibupdate" command?

# oozie admin -oozie http://$HOSTNAME:11000/oozie -sharelibupdate 


@Jay Kumar SenSharma

Yes, after I put the jar there, I ran sharelibupdate.

Super Mentor

@Guilherme Colla

Please check where your "workflow.xml" is on HDFS, create a "lib" directory there, and put the PostgreSQL JDBC driver jar inside that directory (on HDFS).
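As a sketch, assuming the workflow application lives in /user/admin/sqoop (the directory mentioned earlier in the thread) and the driver jar sits at an example local path:

```shell
# Create a lib directory next to workflow.xml and drop the driver jar in it;
# Oozie adds jars from this directory to the action classpath automatically
hdfs dfs -mkdir -p /user/admin/sqoop/lib
hdfs dfs -put /usr/share/java/postgresql-jdbc.jar /user/admin/sqoop/lib/
hdfs dfs -ls /user/admin/sqoop/lib/
```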


Super Mentor

@Guilherme Colla

Can you also check whether your "job.properties" includes the sharelib setting?

oozie.action.sharelib.for.sqoop=hive,hcatalog,sqoop


@Jay Kumar SenSharma

My "job.properties" didn't include that setting. I added it and the job still failed.

😞

@Jay Kumar SenSharma

I created a Sqoop job in Workflow Manager and it works. Now, when I try to execute this command, the job fails. Can you help me?

sqoop import --connect jdbc:postgresql://10.0.0.78:5433/cr4prd --username ambari --password xxxxxx --table tb_con_contatos --hive-database cr4prd --hive-import -m1

Super Mentor

@Guilherme Colla

sqoop import --connect jdbc:postgresql://10.0.0.78:5433/cr4prd --username ambari --password xxxxxx --table tb_con_contatos --hive-database cr4prd --hive-import -m1

If your standalone Sqoop command above is failing with the missing-driver error, then you will need to place your PostgreSQL JDBC driver inside "/usr/hdp/current/sqoop-client/lib/".


Super Mentor

@Guilherme Colla

I just tried this at my end, and it worked with the following command set.

# cp /usr/lib/ambari-server/postgresql-9.3-1101-jdbc4.jar  /usr/hdp/current/sqoop-client/lib/


# su -l sqoop -c "sqoop  import --connect jdbc:postgresql://amb25101.example.com:5432/ambari --username ambari --password bigdata --table hosts  --hive-database default --hive-table ambari_hosts_hive --hive-import -m 1 --target-dir /apps/hive/warehouse/ambari_hosts_hive.db/ambari_hosts_hive"


Is your PostgreSQL jar correct? I mean, is it corrupted by any chance?

# grep 'org.postgresql.Driver' /usr/lib/ambari-server/postgresql-9.3-1101-jdbc4.jar

Binary file /usr/lib/ambari-server/postgresql-9.3-1101-jdbc4.jar matches


@Jay Kumar SenSharma

All commands work fine, but when I try to execute a Sqoop job in Workflow Manager I get an error.

My Job Configuration:

42479-screenshot-2017-11-6-ambari-bigdatinha3.png

42480-screenshot-2017-11-6-ambari-bigdatinha4.png

42481-screenshot-2017-11-6-ambari-bigdatinha5.png

@Jay Kumar SenSharma

I found this error in the log:


2017-11-06 18:09:51,792 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: Calling handler for JobFinishedEvent
2017-11-06 18:09:51,792 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl: job_1509998820430_0003Job Transitioned from COMMITTING to SUCCEEDED
2017-11-06 18:09:51,796 INFO [Thread-68] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: We are finishing cleanly so this is the last retry
2017-11-06 18:09:51,796 INFO [Thread-68] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Notify RMCommunicator isAMLastRetry: true
2017-11-06 18:09:51,796 INFO [Thread-68] org.apache.hadoop.mapreduce.v2.app.rm.RMCommunicator: RMCommunicator notified that shouldUnregistered is: true
2017-11-06 18:09:51,796 INFO [Thread-68] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Notify JHEH isAMLastRetry: true
2017-11-06 18:09:51,796 INFO [Thread-68] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: JobHistoryEventHandler notified that forceJobCompletion is true
2017-11-06 18:09:51,796 INFO [Thread-68] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Calling stop for all the services
2017-11-06 18:09:51,810 INFO [Thread-68] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Stopping JobHistoryEventHandler. Size of the outstanding queue size is 0
2017-11-06 18:09:52,093 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copying hdfs://ambari.4linux.com.br:8020/user/admin/.staging/job_1509998820430_0003/job_1509998820430_0003_1.jhist to hdfs://ambari.4linux.com.br:8020/mr-history/tmp/admin/job_1509998820430_0003-1509998963150-admin-oozie%3Alauncher%3AT%3Dsqoop%3AW%3DFluxo_BigDatinh-1509998991790-1-0-SUCCEEDED-default-1509998978047.jhist_tmp
2017-11-06 18:09:52,227 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before Scheduling: PendingReds:0 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:1 CompletedReds:0 ContAlloc:1 ContRel:0 HostLocal:0 RackLocal:0
2017-11-06 18:09:52,235 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received completed container container_e29_1509998820430_0003_01_000002
2017-11-06 18:09:52,235 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:0 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:0 AssignedReds:0 CompletedMaps:1 CompletedReds:0 ContAlloc:1 ContRel:0 HostLocal:0 RackLocal:0
2017-11-06 18:09:52,237 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1509998820430_0003_m_000000_0: Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

@Jay Kumar SenSharma

My path "/usr/hdp/current/sqoop-client/lib/" didn't have the Postgres driver. I put it inside that path and ran the job, but it still failed.

@Jay Kumar SenSharma

Thanks for your help.

I solved this problem as follows:

  • Create a Sqoop job to extract the data and load it into HDFS
  • Use a Hive job to execute a "LOAD DATA INPATH" statement to load the data into Hive
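The two-step workaround above can be sketched as follows. The staging directory, database, and table names are taken from earlier in the thread, but treat them as examples for your own environment.

```shell
# Step 1 (Sqoop job): extract from Postgres into a staging directory on HDFS
sqoop import --connect jdbc:postgresql://10.0.0.78:5433/cr4prd --username ambari \
    --password xxxxxx --table tb_con_contatos --target-dir /user/admin/teste2 -m 1

# Step 2 (Hive job): move the staged files into the Hive table
# (assumes the table cr4prd.tb_con_contatos already exists in Hive)
hive -e "LOAD DATA INPATH '/user/admin/teste2' INTO TABLE cr4prd.tb_con_contatos;"
```

Note that LOAD DATA INPATH moves (not copies) the files out of the staging directory, so the directory is empty after the Hive step.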