
Sqoop Query returns Backend I/O Exception

Explorer

Hi, I have tried both importing a table and importing from a query, and both fail at the same step with an I/O error. Does anyone know why this happens? Image attached.

[screenshot of the error attached: 5521-gfba2.png]

1 ACCEPTED SOLUTION

Super Collaborator

Hi @ghost k,

Step 1:

# Edit the pg_hba.conf file in /var/lib/pgsql/data.

# Add the following line as the first line of pg_hba.conf. It allows access to all databases for all users with an md5-encrypted password:

host all all 0.0.0.0/0 md5
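
A minimal sketch of that edit from the shell, assuming the default data directory above and a service-managed PostgreSQL (the sed one-liner assumes GNU sed, and the restart is needed for the new rule to take effect):

# Back up pg_hba.conf, insert the rule as the first line, then restart PostgreSQL
sudo cp /var/lib/pgsql/data/pg_hba.conf /var/lib/pgsql/data/pg_hba.conf.bak
sudo sed -i '1i host all all 0.0.0.0/0 md5' /var/lib/pgsql/data/pg_hba.conf
sudo service postgresql restart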

On the postgres end (a non-interactive equivalent is sketched below):

  1. su postgres
  2. psql
  3. \c ambari
  4. \dt ambari.*   # list all tables
  5. select * from ambari.hosts;
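
The same checks can be run without an interactive psql session; a sketch, assuming sudo access to the postgres account:

# List the tables, then sample the hosts table, straight from the shell
sudo -u postgres psql -d ambari -c '\dt ambari.*'
sudo -u postgres psql -d ambari -c 'select * from ambari.hosts;'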

Step 2:

Download the PostgreSQL JDBC driver jar:

curl -L 'http://jdbc.postgresql.org/download/postgresql-9.2-1002.jdbc4.jar' -o postgresql-9.2-1002.jdbc4.jar
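
Sqoop picks JDBC drivers up from its lib directory, so copy the jar there; the path below is an assumption for an HDP sqoop-client install and may differ on your system:

# Make the driver visible to Sqoop (adjust the lib path for your install)
cp postgresql-9.2-1002.jdbc4.jar /usr/hdp/current/sqoop-client/lib/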

[screenshot attached: 5671-screen-shot-2016-07-08-at-125003-pm.png]

Step 3:

sqoop list-tables --connect jdbc:postgresql://127.0.0.1/ambari --username ambari --password bigdata

[screenshot attached: 5672-screen-shot-2016-07-08-at-125448-pm.png]

Step 4: Sqoop the Ambari hosts table into HDFS (when prompted by -P, the password is bigdata):

sqoop import --connect jdbc:postgresql://127.0.0.1/ambari --username ambari -P --table hosts --target-dir /user/guest/ambari_hosts_table
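
To verify the import, list the target directory and peek at the output; part-m-00000 is the typical name of the first map task's output file:

# Confirm the files landed, then sample the first few rows
hdfs dfs -ls /user/guest/ambari_hosts_table
hdfs dfs -cat /user/guest/ambari_hosts_table/part-m-00000 | head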

[screenshot attached: 5673-screen-shot-2016-07-08-at-125843-pm.png]

Hope this helps...

Thanks,

Sujitha



Expert Contributor

Hi, please check whether the sqoop list-tables option works against the same connection details.

If it does not, there is a connectivity issue from the Sqoop client machine to the database server.

The query failing here is not the one Sqoop uses to extract data, but the one it uses to collect metadata from the remote database; Sqoop then uses that metadata to create the Hadoop writers. So this appears to be a connectivity issue, which can be confirmed by running the list-tables variant of sqoop first, as sketched below.
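
One way to run that check from the client machine, sketched with placeholder host, database, and user names (the TCP probe just confirms the port is reachable before involving Sqoop at all):

# Check the database port is reachable, then try the metadata call itself
nc -vz <db-host> 5432
sqoop list-tables --verbose --connect jdbc:postgresql://<db-host>:5432/<dbname> --username <user> -P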

Super Collaborator

I agree with Bala; this error occurs due to a connectivity issue with the remote database.

Thanks,

Sujitha

Explorer

Hi,

Thanks for the feedback. list-tables works but doesn't actually give any output. The following are the only logs I get. Should I be investigating connectivity?

Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/07/05 12:55:16 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.2.0-258
16/07/05 12:55:16 INFO manager.SqlManager: Using default fetchSize of 1000

Super Collaborator

Hi @ghost k,

Can you please share the queries you used for importing the tables and creating the tables, and also the complete logs from:

/var/log/sqoop

When sqoop list-tables runs successfully, it should show the list of tables in the database server, in this case PostgreSQL.
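
If nothing shows up under /var/log/sqoop, the console output is usually all there is; a hypothetical way to capture it in full for sharing:

# Capture stdout and stderr to a file while still seeing them on screen
sqoop list-tables --verbose --connect jdbc:postgresql://<db-host>:5432/<dbname> --username <user> -P 2>&1 | tee /tmp/sqoop-list-tables.log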

Thanks,

Sujitha

Explorer

Hi @sujitha sanku,

Creating the tables: this is an already existing database, so I didn't have to create any tables. I didn't create any tables on Hadoop either; I assumed Sqoop would do that automatically. I also seem to have no log files within the sqoop directory. Is there something I need to do to enable logging?

The query was :

sqoop list-tables --connect 'jdbc:postgresql://xx.xx.xxx.xx:yyyyy/dbname'

Do let me know if you have any ideas.

Thank You, Kaushy


Super Collaborator

Hi @ghost k,

If this resolved your problem, can you please accept the best answer?

Thanks,

Sujitha