New Contributor
Posts: 4
Registered: 12-27-2016

Hive MetaStore exception

Hello everyone. I have installed the QuickStart VM, and in the "Getting Started" tutorial, on Exercise 1, when I launch the sqoop job:

 

sqoop import-all-tables \
-m 1 \
--connect jdbc:mysql://quickstart:3306/retail_db \
--username=retail_dba \
--password=cloudera \
--compression-codec=snappy \
--as-parquetfile \
--warehouse-dir=/user/hive/warehouse \
--hive-import

I received the following error. I just copied and pasted the command, so I don't know what went wrong. Can someone help me, please?
I've also started the Hive MetaStore service and confirmed its status is OK, but I still get the same error.

Any ideas?

 

 

ERROR sqoop.Sqoop: Got exception running Sqoop: org.kitesdk.data.DatasetOperationException: Hive MetaStore exception
org.kitesdk.data.DatasetOperationException: Hive MetaStore exception
at org.kitesdk.data.spi.hive.MetaStoreUtil.createTable(MetaStoreUtil.java:252)
at org.kitesdk.data.spi.hive.HiveManagedMetadataProvider.create(HiveManagedMetadataProvider.java:87)
at org.kitesdk.data.spi.hive.HiveManagedDatasetRepository.create(HiveManagedDatasetRepository.java:81)
at org.kitesdk.data.Datasets.create(Datasets.java:239)
at org.kitesdk.data.Datasets.create(Datasets.java:307)
at org.apache.sqoop.mapreduce.ParquetJob.createDataset(ParquetJob.java:141)
at org.apache.sqoop.mapreduce.ParquetJob.configureImportJob(ParquetJob.java:119)
at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:130)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:267)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)
at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:111)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException Cannot create directory /user/hive/warehouse/categories. Name node is in safe mode.
The reported blocks 0 needs additional 921 blocks to reach the threshold 0.9990 of total blocks 921.
The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
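
From those last lines it looks like HDFS is still in safe mode and no DataNodes are reporting in. In case it helps with diagnosis, the HDFS state can be checked with something like the following (a rough sketch, assuming the stock QuickStart VM service names):

# Is the NameNode still in safe mode?
hdfs dfsadmin -safemode get

# Is the DataNode service actually running?
sudo service hadoop-hdfs-datanode status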

New Contributor
Posts: 4
Registered: 12-27-2016

Re: Hive MetaStore exception

Anyone?

New Contributor
Posts: 4
Registered: 03-09-2017

Re: Hive MetaStore exception

Hi @TomerD123

 

I am getting the same error running the VM in VirtualBox. Did you figure out how to fix it?

 

Regards,

Zeeshan

Former Member
Posts: 0

Re: Hive MetaStore exception

Restart the NameNode and try again:

 

sudo service hadoop-hdfs-namenode restart
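
If the NameNode restart alone doesn't clear it, note that the error also reports 0 live DataNodes, so the DataNode service probably needs a restart too, and as a last resort you can take HDFS out of safe mode manually. A sketch, assuming the stock QuickStart service names:

# Restart the DataNode as well (the error reports 0 live datanodes)
sudo service hadoop-hdfs-datanode restart

# Check safe mode, and leave it manually only as a last resort
sudo -u hdfs hdfs dfsadmin -safemode get
sudo -u hdfs hdfs dfsadmin -safemode leave

Once safe mode is off, re-run the sqoop import.
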
New Contributor
Posts: 4
Registered: 03-09-2017

Re: Hive MetaStore exception

Did not work :-(