<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Exercise 1 - Faced error on my first trying in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/48827#M1934</link>
    <description>&lt;P&gt;I've started the service but still I get the same error.&lt;/P&gt;&lt;P&gt;Any ideas?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;ERROR sqoop.Sqoop: Got exception running Sqoop: org.kitesdk.data.DatasetOperationException: Hive MetaStore exception&lt;/STRONG&gt;&lt;BR /&gt;org.kitesdk.data.DatasetOperationException: Hive MetaStore exception&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.MetaStoreUtil.createTable(MetaStoreUtil.java:252)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveManagedMetadataProvider.create(HiveManagedMetadataProvider.java:87)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveManagedDatasetRepository.create(HiveManagedDatasetRepository.java:81)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.Datasets.create(Datasets.java:239)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.Datasets.create(Datasets.java:307)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.mapreduce.ParquetJob.createDataset(ParquetJob.java:141)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.mapreduce.ParquetJob.configureImportJob(ParquetJob.java:119)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:130)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:267)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:111)&lt;BR 
/&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.run(Sqoop.java:143)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.main(Sqoop.java:236)&lt;BR /&gt;Caused by: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException Cannot create directory /user/hive/warehouse/categories. Name node is in safe mode.&lt;BR /&gt;The reported blocks 0 needs additional 921 blocks to reach the threshold 0.9990 of total blocks 921.&lt;BR /&gt;The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.&lt;/P&gt;</description>
    <pubDate>Tue, 27 Dec 2016 10:03:25 GMT</pubDate>
    <dc:creator>TomerD123</dc:creator>
    <dc:date>2016-12-27T10:03:25Z</dc:date>
    <item>
      <title>Exercise 1 - Faced error on my first trying</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/41917#M1931</link>
      <description>&lt;P&gt;Hello everyone. I have installed the QuickStart VM, and in Exercise 1 of the "Getting Started" tutorial, when I launch the Sqoop job:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;sqoop import-all-tables \
    -m 1 \
    --connect jdbc:mysql://quickstart:3306/retail_db \
    --username=retail_dba \
    --password=cloudera \
    --compression-codec=snappy \
    --as-parquetfile \
    --warehouse-dir=/user/hive/warehouse \
    --hive-import&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I received the following error. I just copied and pasted the command, so I don't know what happened. Can someone help me, please?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;16/06/12 11:17:51 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.5.0&lt;BR /&gt;16/06/12 11:17:51 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.&lt;BR /&gt;16/06/12 11:17:51 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override&lt;BR /&gt;16/06/12 11:17:51 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.&lt;BR /&gt;16/06/12 11:17:51 WARN tool.BaseSqoopTool: It seems that you're doing hive import directly into default&lt;BR /&gt;16/06/12 11:17:51 WARN tool.BaseSqoopTool: hive warehouse directory which is not supported. Sqoop is&lt;BR /&gt;16/06/12 11:17:51 WARN tool.BaseSqoopTool: firstly importing data into separate directory and then&lt;BR /&gt;16/06/12 11:17:51 WARN tool.BaseSqoopTool: inserting data into hive. 
Please consider removing&lt;BR /&gt;16/06/12 11:17:51 WARN tool.BaseSqoopTool: --target-dir or --warehouse-dir into /user/hive/warehouse in&lt;BR /&gt;16/06/12 11:17:51 WARN tool.BaseSqoopTool: case that you will detect any issues.&lt;BR /&gt;16/06/12 11:17:54 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.&lt;BR /&gt;16/06/12 11:17:56 INFO tool.CodeGenTool: Beginning code generation&lt;BR /&gt;16/06/12 11:17:56 INFO tool.CodeGenTool: Will generate java class as codegen_categories&lt;BR /&gt;16/06/12 11:17:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;16/06/12 11:17:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;16/06/12 11:17:56 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce&lt;BR /&gt;Note: /tmp/sqoop-cloudera/compile/bd802e707083c6dbfbf8bb4cc1a87868/codegen_categories.java uses or overrides a deprecated API.&lt;BR /&gt;Note: Recompile with -Xlint:deprecation for details.&lt;BR /&gt;16/06/12 11:18:05 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/bd802e707083c6dbfbf8bb4cc1a87868/codegen_categories.jar&lt;BR /&gt;16/06/12 11:18:05 WARN manager.MySQLManager: It looks like you are importing from mysql.&lt;BR /&gt;16/06/12 11:18:05 WARN manager.MySQLManager: This transfer can be faster! Use the --direct&lt;BR /&gt;16/06/12 11:18:05 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.&lt;BR /&gt;16/06/12 11:18:05 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)&lt;BR /&gt;16/06/12 11:18:05 INFO mapreduce.ImportJobBase: Beginning import of categories&lt;BR /&gt;16/06/12 11:18:07 INFO Configuration.deprecation: mapred.jar is deprecated. 
Instead, use mapreduce.job.jar&lt;BR /&gt;16/06/12 11:18:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;16/06/12 11:18:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;16/06/12 11:18:18 INFO hive.metastore: Trying to connect to metastore with URI thrift://quickstart.cloudera:9083&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;16/06/12 11:18:19 WARN hive.metastore: Failed to connect to the MetaStore Server...&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;16/06/12 11:18:19 INFO hive.metastore: Waiting 1 seconds before next connection attempt.&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;16/06/12 11:18:20 INFO hive.metastore: Trying to connect to metastore with URI thrift://quickstart.cloudera:9083&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;16/06/12 11:18:20 WARN hive.metastore: Failed to connect to the MetaStore Server...&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;16/06/12 11:18:20 INFO hive.metastore: Waiting 1 seconds before next connection attempt.&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;16/06/12 11:18:21 INFO hive.metastore: Trying to connect to metastore with URI thrift://quickstart.cloudera:9083&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;16/06/12 11:18:21 WARN hive.metastore: Failed to connect to the MetaStore Server...&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;16/06/12 11:18:21 INFO hive.metastore: Waiting 1 seconds before next connection attempt.&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;16/06/12 11:18:22 ERROR sqoop.Sqoop: Got exception running Sqoop: org.kitesdk.data.DatasetOperationException: Hive metastore exception&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;org.kitesdk.data.DatasetOperationException: Hive metastore exception&lt;/STRONG&gt;&lt;/U&gt;&lt;BR 
/&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.MetaStoreUtil.&amp;lt;init&amp;gt;(MetaStoreUtil.java:135)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.MetaStoreUtil.get(MetaStoreUtil.java:101)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.getMetaStoreUtil(HiveAbstractMetadataProvider.java:63)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.exists(HiveAbstractMetadataProvider.java:159)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.exists(FileSystemDatasetRepository.java:262)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.Datasets.exists(Datasets.java:629)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.Datasets.exists(Datasets.java:646)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:117)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:260)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR 
/&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:111)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.run(Sqoop.java:143)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.main(Sqoop.java:236)&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&lt;U&gt;&lt;STRONG&gt;Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. 
Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused&lt;/STRONG&gt;&lt;/U&gt;&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.thrift.transport.TSocket.open(TSocket.java:187)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:419)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:234)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:179)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.MetaStoreUtil.&amp;lt;init&amp;gt;(MetaStoreUtil.java:133)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.MetaStoreUtil.get(MetaStoreUtil.java:101)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.getMetaStoreUtil(HiveAbstractMetadataProvider.java:63)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.exists(HiveAbstractMetadataProvider.java:159)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.exists(FileSystemDatasetRepository.java:262)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.Datasets.exists(Datasets.java:629)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.Datasets.exists(Datasets.java:646)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at 
org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:117)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:260)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:111)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.run(Sqoop.java:143)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.main(Sqoop.java:236)&lt;BR /&gt;Caused by: java.net.ConnectException: Connection refused&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.net.PlainSocketImpl.socketConnect(Native Method)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.net.Socket.connect(Socket.java:579)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; 
&amp;nbsp;at org.apache.thrift.transport.TSocket.open(TSocket.java:182)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;... 24 more&lt;BR /&gt;)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:466)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:234)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:179)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.MetaStoreUtil.&amp;lt;init&amp;gt;(MetaStoreUtil.java:133)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;... 20 more&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 10:24:39 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/41917#M1931</guid>
      <dc:creator>madasi</dc:creator>
      <dc:date>2022-09-16T10:24:39Z</dc:date>
    </item>
    <item>
      <title>Re: Exercise 1 - Faced error on my first trying</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/41929#M1932</link>
      <description>Ensure your Hive MetaStore service is up and running.&lt;BR /&gt;&lt;BR /&gt;If you use packages, check it with:&lt;BR /&gt;&lt;BR /&gt;service hive-metastore status&lt;BR /&gt;&lt;BR /&gt;If you use Cloudera Manager, check it on the CM -&amp;gt; Hive -&amp;gt; Instances -&amp;gt; Hive MetaStore Server page.</description>
      <pubDate>Mon, 13 Jun 2016 07:56:30 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/41929#M1932</guid>
      <dc:creator>Harsh J</dc:creator>
      <dc:date>2016-06-13T07:56:30Z</dc:date>
    </item>
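    <!--
    The status check in the reply above can be sketched as shell commands. This assumes a package-based install with the CDH service name "hive-metastore" (as on the QuickStart VM); adjust if your install differs. It is a sketch, not the tutorial's official procedure.

    ```shell
    # Check whether the Hive Metastore service is running (package install).
    sudo service hive-metastore status

    # If it is stopped, start it, then re-run the Sqoop import.
    sudo service hive-metastore start

    # Optionally confirm the metastore Thrift port (9083 by default) is listening.
    netstat -tlnp | grep 9083
    ```

    Once the Thrift port answers, the "Failed to connect to the MetaStore Server" retries in the log should stop.
    -->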
    <item>
      <title>Re: Exercise 1 - Faced error on my first trying</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/41939#M1933</link>
      <description>&lt;P&gt;Hello Harsh J,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;It worked! It was simple. Thank you&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Cheers,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jun 2016 10:16:40 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/41939#M1933</guid>
      <dc:creator>madasi</dc:creator>
      <dc:date>2016-06-13T10:16:40Z</dc:date>
    </item>
    <item>
      <title>Re: Exercise 1 - Faced error on my first trying</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/48827#M1934</link>
      <description>&lt;P&gt;I've started the service but still I get the same error.&lt;/P&gt;&lt;P&gt;Any ideas?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;ERROR sqoop.Sqoop: Got exception running Sqoop: org.kitesdk.data.DatasetOperationException: Hive MetaStore exception&lt;/STRONG&gt;&lt;BR /&gt;org.kitesdk.data.DatasetOperationException: Hive MetaStore exception&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.MetaStoreUtil.createTable(MetaStoreUtil.java:252)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveManagedMetadataProvider.create(HiveManagedMetadataProvider.java:87)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.spi.hive.HiveManagedDatasetRepository.create(HiveManagedDatasetRepository.java:81)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.Datasets.create(Datasets.java:239)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.kitesdk.data.Datasets.create(Datasets.java:307)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.mapreduce.ParquetJob.createDataset(ParquetJob.java:141)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.mapreduce.ParquetJob.configureImportJob(ParquetJob.java:119)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:130)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:267)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:111)&lt;BR 
/&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.run(Sqoop.java:143)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.sqoop.Sqoop.main(Sqoop.java:236)&lt;BR /&gt;Caused by: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException Cannot create directory /user/hive/warehouse/categories. Name node is in safe mode.&lt;BR /&gt;The reported blocks 0 needs additional 921 blocks to reach the threshold 0.9990 of total blocks 921.&lt;BR /&gt;The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.&lt;/P&gt;</description>
      <pubDate>Tue, 27 Dec 2016 10:03:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/48827#M1934</guid>
      <dc:creator>TomerD123</dc:creator>
      <dc:date>2016-12-27T10:03:25Z</dc:date>
    </item>
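    <!--
    The "Name node is in safe mode" message in the post above means HDFS has not yet received enough block reports to leave read-only startup mode, and "live datanodes 0" suggests the DataNode service itself is down. A minimal sketch of the checks, assuming the packaged CDH service names on the QuickStart VM:

    ```shell
    # Report safe-mode status; "Safe mode is ON" confirms the cause of the error.
    sudo -u hdfs hdfs dfsadmin -safemode get

    # With zero live datanodes reported, start the DataNode so blocks get reported.
    sudo service hadoop-hdfs-datanode start

    # Safe mode normally exits on its own once the block threshold is met;
    # forcing it off is a last resort and risks acting on missing blocks:
    sudo -u hdfs hdfs dfsadmin -safemode leave
    ```
    -->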
    <item>
      <title>Re: Exercise 1 - Faced error on my first trying</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/53388#M1935</link>
      <description>&lt;P&gt;I have the same issue. I keep restarting the Hive Metastore service, but whenever I run the Sqoop command I get the connection exception and the service goes back into "Bad Health" status.&lt;/P&gt;&lt;P&gt;Any ideas?&lt;/P&gt;</description>
      <pubDate>Sun, 09 Apr 2017 13:56:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/53388#M1935</guid>
      <dc:creator>gbandeira</dc:creator>
      <dc:date>2017-04-09T13:56:46Z</dc:date>
    </item>
    <item>
      <title>Re: Exercise 1 - Faced error on my first trying</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/88840#M1936</link>
      <description>&lt;P&gt;Thank you.&amp;nbsp; I had the same issue.&amp;nbsp; Should some content be added to the tutorial indicating that these services need to be started?&lt;/P&gt;</description>
      <pubDate>Mon, 08 Apr 2019 17:15:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exercise-1-Faced-error-on-my-first-trying/m-p/88840#M1936</guid>
      <dc:creator>AlbertAlbert</dc:creator>
      <dc:date>2019-04-08T17:15:06Z</dc:date>
    </item>
  </channel>
</rss>

