Support Questions

Find answers, ask questions, and share your expertise

Exercise 1 - Error on my first attempt

New Contributor

Hello everyone. I have installed the QuickStart VM, and in the "Getting Started" tutorial, Exercise 1 has me launch this Sqoop job:

sqoop import-all-tables \
    -m 1 \
    --connect jdbc:mysql://quickstart:3306/retail_db \
    --username=retail_dba \
    --password=cloudera \
    --compression-codec=snappy \
    --as-parquetfile \
    --warehouse-dir=/user/hive/warehouse \
    --hive-import


I received the error below. I just copied and pasted the command, so I don't know what went wrong. Can someone help me, please?


16/06/12 11:17:51 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.5.0
16/06/12 11:17:51 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/06/12 11:17:51 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
16/06/12 11:17:51 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
16/06/12 11:17:51 WARN tool.BaseSqoopTool: It seems that you're doing hive import directly into default
16/06/12 11:17:51 WARN tool.BaseSqoopTool: hive warehouse directory which is not supported. Sqoop is
16/06/12 11:17:51 WARN tool.BaseSqoopTool: firstly importing data into separate directory and then
16/06/12 11:17:51 WARN tool.BaseSqoopTool: inserting data into hive. Please consider removing
16/06/12 11:17:51 WARN tool.BaseSqoopTool: --target-dir or --warehouse-dir into /user/hive/warehouse in
16/06/12 11:17:51 WARN tool.BaseSqoopTool: case that you will detect any issues.
16/06/12 11:17:54 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/06/12 11:17:56 INFO tool.CodeGenTool: Beginning code generation
16/06/12 11:17:56 INFO tool.CodeGenTool: Will generate java class as codegen_categories
16/06/12 11:17:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/06/12 11:17:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/06/12 11:17:56 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-cloudera/compile/bd802e707083c6dbfbf8bb4cc1a87868/codegen_categories.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/06/12 11:18:05 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/bd802e707083c6dbfbf8bb4cc1a87868/codegen_categories.jar
16/06/12 11:18:05 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/06/12 11:18:05 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/06/12 11:18:05 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/06/12 11:18:05 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/06/12 11:18:05 INFO mapreduce.ImportJobBase: Beginning import of categories
16/06/12 11:18:07 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/06/12 11:18:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/06/12 11:18:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/06/12 11:18:18 INFO hive.metastore: Trying to connect to metastore with URI thrift://quickstart.cloudera:9083
16/06/12 11:18:19 WARN hive.metastore: Failed to connect to the MetaStore Server...
16/06/12 11:18:19 INFO hive.metastore: Waiting 1 seconds before next connection attempt.
16/06/12 11:18:20 INFO hive.metastore: Trying to connect to metastore with URI thrift://quickstart.cloudera:9083
16/06/12 11:18:20 WARN hive.metastore: Failed to connect to the MetaStore Server...
16/06/12 11:18:20 INFO hive.metastore: Waiting 1 seconds before next connection attempt.
16/06/12 11:18:21 INFO hive.metastore: Trying to connect to metastore with URI thrift://quickstart.cloudera:9083
16/06/12 11:18:21 WARN hive.metastore: Failed to connect to the MetaStore Server...
16/06/12 11:18:21 INFO hive.metastore: Waiting 1 seconds before next connection attempt.
16/06/12 11:18:22 ERROR sqoop.Sqoop: Got exception running Sqoop: org.kitesdk.data.DatasetOperationException: Hive metastore exception
org.kitesdk.data.DatasetOperationException: Hive metastore exception
    at org.kitesdk.data.spi.hive.MetaStoreUtil.<init>(MetaStoreUtil.java:135)
    at org.kitesdk.data.spi.hive.MetaStoreUtil.get(MetaStoreUtil.java:101)
    at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.getMetaStoreUtil(HiveAbstractMetadataProvider.java:63)
    at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)
    at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)
    at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.exists(HiveAbstractMetadataProvider.java:159)
    at org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.exists(FileSystemDatasetRepository.java:262)
    at org.kitesdk.data.Datasets.exists(Datasets.java:629)
    at org.kitesdk.data.Datasets.exists(Datasets.java:646)
    at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:117)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:260)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:111)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
    at org.apache.thrift.transport.TSocket.open(TSocket.java:187)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:419)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:234)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:179)
    at org.kitesdk.data.spi.hive.MetaStoreUtil.<init>(MetaStoreUtil.java:133)
    at org.kitesdk.data.spi.hive.MetaStoreUtil.get(MetaStoreUtil.java:101)
    at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.getMetaStoreUtil(HiveAbstractMetadataProvider.java:63)
    at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)
    at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)
    at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.exists(HiveAbstractMetadataProvider.java:159)
    at org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.exists(FileSystemDatasetRepository.java:262)
    at org.kitesdk.data.Datasets.exists(Datasets.java:629)
    at org.kitesdk.data.Datasets.exists(Datasets.java:646)
    at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:117)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:260)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:111)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:579)
    at org.apache.thrift.transport.TSocket.open(TSocket.java:182)
    ... 24 more
)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:466)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:234)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:179)
    at org.kitesdk.data.spi.hive.MetaStoreUtil.<init>(MetaStoreUtil.java:133)
    ... 20 more


1 ACCEPTED SOLUTION

Mentor
Ensure your Hive MetaStore service is up and running.

If you use packages:

service hive-metastore status

If you use Cloudera Manager: CM -> Hive -> Instances -> Hive MetaStore Server page
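If the service reports as running but Sqoop still cannot reach it, a quick connectivity probe can confirm that the Thrift port is actually listening. This is just a sketch; the host and port are taken from the `thrift://quickstart.cloudera:9083` URI in the log above:

```shell
# Probe the Hive Metastore Thrift port using bash's /dev/tcp pseudo-device.
# Host and port come from the thrift://quickstart.cloudera:9083 URI in the log.
port_open() {
  timeout 2 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

if port_open quickstart.cloudera 9083; then
  echo "Metastore port is reachable"
else
  echo "Metastore port is NOT reachable - start hive-metastore first"
fi
```

A "Connection refused" in the stack trace means nothing is listening on that port, which almost always means the metastore process is down.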


5 Replies


New Contributor

Hello Harsh J,

It worked! It was that simple. Thank you.

Cheers,

Explorer

I've started the service, but I still get the same error.

Any ideas?

ERROR sqoop.Sqoop: Got exception running Sqoop: org.kitesdk.data.DatasetOperationException: Hive MetaStore exception
org.kitesdk.data.DatasetOperationException: Hive MetaStore exception
    at org.kitesdk.data.spi.hive.MetaStoreUtil.createTable(MetaStoreUtil.java:252)
    at org.kitesdk.data.spi.hive.HiveManagedMetadataProvider.create(HiveManagedMetadataProvider.java:87)
    at org.kitesdk.data.spi.hive.HiveManagedDatasetRepository.create(HiveManagedDatasetRepository.java:81)
    at org.kitesdk.data.Datasets.create(Datasets.java:239)
    at org.kitesdk.data.Datasets.create(Datasets.java:307)
    at org.apache.sqoop.mapreduce.ParquetJob.createDataset(ParquetJob.java:141)
    at org.apache.sqoop.mapreduce.ParquetJob.configureImportJob(ParquetJob.java:119)
    at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:130)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:267)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)
    at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:111)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException Cannot create directory /user/hive/warehouse/categories. Name node is in safe mode.
The reported blocks 0 needs additional 921 blocks to reach the threshold 0.9990 of total blocks 921.
The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
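Note that this second stack trace is a different problem from the original post: the metastore is reachable, but HDFS itself is refusing the write because the NameNode is in safe mode with zero live DataNodes. A sketch of how to check this (the commented-out service name is an assumption for a package-based QuickStart install):

```shell
check_hdfs_health() {
  # No-op off the cluster so this sketch can run anywhere.
  if ! command -v hdfs >/dev/null 2>&1; then
    echo "hdfs CLI not found; run this on the QuickStart VM"
    return 0
  fi
  hdfs dfsadmin -safemode get           # prints "Safe mode is ON" or "... OFF"
  hdfs dfsadmin -report | head -n 20    # shows how many DataNodes are live
  # Safe mode normally lifts itself once DataNodes report their blocks, so
  # start the DataNode first (service name assumes a package install):
  #   sudo service hadoop-hdfs-datanode start
  # Forcing safe mode off while 0 DataNodes are live will not help:
  #   hdfs dfsadmin -safemode leave
}
check_hdfs_health
```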

New Contributor

I have the same issue. I keep restarting the Hive Metastore service, but whenever I run the Sqoop command I get the connection exception and the service goes back into "Bad Health" status.

Any ideas?
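If the metastore flips back to "Bad Health" every time a job hits it, the reason is usually in its own log. The path below is an assumption for a package-based install; under Cloudera Manager, check CM -> Hive -> Hive Metastore Server -> Log Files instead:

```shell
show_metastore_log() {
  # Assumed log location for a package-based CDH install; adjust if
  # Cloudera Manager placed logs elsewhere.
  log=/var/log/hive/hive-metastore.log
  if [ -r "$log" ]; then
    tail -n 50 "$log"
  else
    echo "No readable log at $log; check the log path in Cloudera Manager"
  fi
}
show_metastore_log
```

Common culprits in that log are a metastore database that is down or out of connections, or the JVM running out of memory on the small QuickStart VM.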

New Contributor

Thank you. I had the same issue. Should some content be added to the tutorial indicating that these services need to be started?