
Trouble with Tutorial Exercise 1 [Metastore connection issue]

New Contributor

Hi Cloudera Community!

 

I'm trying to start working through the tutorial but cannot get past the following problem:

 

[cloudera@quickstart java]$ sqoop import-all-tables -m 1 --connect jdbc:mysql://quickstart:3306/retail_db --username=retail_dba --password=cloudera --compression-codec=snappy --as-parquetfile --warehouse-dir=/user/hive/warehouse --hive-import
19/10/04 01:46:39 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.16.2
19/10/04 01:46:39 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/10/04 01:46:39 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
19/10/04 01:46:39 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
19/10/04 01:46:39 WARN tool.BaseSqoopTool: It seems that you're doing hive import directly into default
19/10/04 01:46:39 WARN tool.BaseSqoopTool: hive warehouse directory which is not supported. Sqoop is
19/10/04 01:46:39 WARN tool.BaseSqoopTool: firstly importing data into separate directory and then
19/10/04 01:46:39 WARN tool.BaseSqoopTool: inserting data into hive. Please consider removing
19/10/04 01:46:39 WARN tool.BaseSqoopTool: --target-dir or --warehouse-dir into /user/hive/warehouse in
19/10/04 01:46:39 WARN tool.BaseSqoopTool: case that you will detect any issues.
19/10/04 01:46:39 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/10/04 01:46:40 INFO tool.CodeGenTool: Beginning code generation
19/10/04 01:46:40 INFO tool.CodeGenTool: Will generate java class as codegen_categories
19/10/04 01:46:40 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
19/10/04 01:46:40 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
19/10/04 01:46:40 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-cloudera/compile/79ae3f67fb6eadb1c64cd69baf6f38c8/codegen_categories.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/10/04 01:46:43 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/79ae3f67fb6eadb1c64cd69baf6f38c8/codegen_categories.jar
19/10/04 01:46:43 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/10/04 01:46:43 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/10/04 01:46:43 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/10/04 01:46:43 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/10/04 01:46:43 INFO mapreduce.ImportJobBase: Beginning import of categories
19/10/04 01:46:43 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
19/10/04 01:46:43 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
19/10/04 01:46:45 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
19/10/04 01:46:45 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
19/10/04 01:46:46 INFO hive.metastore: Trying to connect to metastore with URI thrift://127.0.0.1:9083
19/10/04 01:46:46 INFO hive.metastore: Opened a connection to metastore, current connections: 1
19/10/04 01:46:46 INFO hive.metastore: Connected to metastore.
19/10/04 01:47:06 INFO hive.metastore: Closed a connection to metastore, current connections: 0
19/10/04 01:47:06 INFO hive.metastore: Trying to connect to metastore with URI thrift://127.0.0.1:9083
19/10/04 01:47:06 INFO hive.metastore: Opened a connection to metastore, current connections: 1
19/10/04 01:47:06 INFO hive.metastore: Connected to metastore.
19/10/04 01:47:26 ERROR sqoop.Sqoop: Got exception running Sqoop: org.kitesdk.data.DatasetOperationException: Hive MetaStore exception
org.kitesdk.data.DatasetOperationException: Hive MetaStore exception
at org.kitesdk.data.spi.hive.MetaStoreUtil.tableExists(MetaStoreUtil.java:190)
at org.kitesdk.data.spi.hive.MetaStoreUtil.exists(MetaStoreUtil.java:396)
at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)
at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)
at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.exists(HiveAbstractMetadataProvider.java:159)
at org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.exists(FileSystemDatasetRepository.java:262)
at org.kitesdk.data.Datasets.exists(Datasets.java:629)
at org.kitesdk.data.Datasets.exists(Datasets.java:646)
at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:117)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:267)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:691)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:513)
at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:105)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: MetaException(message:Exception thrown when executing query)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:37244)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:37221)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:37152)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1294)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1280)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.tableExists(HiveMetaStoreClient.java:1359)
at org.kitesdk.data.spi.hive.MetaStoreUtil$2.call(MetaStoreUtil.java:181)
at org.kitesdk.data.spi.hive.MetaStoreUtil$2.call(MetaStoreUtil.java:178)
at org.kitesdk.data.spi.hive.MetaStoreUtil.doWithRetry(MetaStoreUtil.java:70)
at org.kitesdk.data.spi.hive.MetaStoreUtil.tableExists(MetaStoreUtil.java:186)
... 19 more

 

And in /var/log/hive/hive-metastore.log:

 

2019-10-04 01:47:24,583 ERROR [pool-4-thread-10]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invokeInternal(207)) - Retrying HMSHandler after 2000 ms (attempt 10 of 10) with error: javax.jdo.JDOException: Exception thrown when executing query
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:275)
at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:1217)
at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:1024)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)
at com.sun.proxy.$Proxy8.getTable(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1950)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1905)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:140)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:99)
at com.sun.proxy.$Proxy10.get_table(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10128)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10112)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
NestedThrowablesStackTrace:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'A0.OWNER_TYPE' in 'field list'
at sun.reflect.GeneratedConstructorAccessor16.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
at com.mysql.jdbc.Util.getInstance(Util.java:360)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:978)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2435)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2530)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1907)
at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2030)
at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:174)
at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:381)
at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:504)
at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:651)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:266)
at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:1217)
at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:1024)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)
at com.sun.proxy.$Proxy8.getTable(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1950)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1905)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:140)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:99)
at com.sun.proxy.$Proxy10.get_table(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10128)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10112)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

2019-10-04 01:47:26,585 ERROR [pool-4-thread-10]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invokeInternal(199)) - HMSHandler Fatal error: javax.jdo.JDOException: Exception thrown when executing query
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:275)
at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:1217)
at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:1024)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)
at com.sun.proxy.$Proxy8.getTable(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1950)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1905)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:140)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:99)
at com.sun.proxy.$Proxy10.get_table(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10128)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10112)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
NestedThrowablesStackTrace:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'A0.OWNER_TYPE' in 'field list'
at sun.reflect.GeneratedConstructorAccessor16.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
at com.mysql.jdbc.Util.getInstance(Util.java:360)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:978)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2435)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2530)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1907)
at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2030)
at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:174)
at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:381)
at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:504)
at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:651)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:266)
at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:1217)
at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:1024)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)
at com.sun.proxy.$Proxy8.getTable(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1950)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1905)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:140)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:99)
at com.sun.proxy.$Proxy10.get_table(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10128)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10112)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

 

 

Please help me! What is the problem and how can I fix it?

1 ACCEPTED SOLUTION

Guru

@Alieer ,

 

Did you upgrade CM/CDH on your cluster recently?

 

From the error snippet below:

com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'A0.OWNER_TYPE' in 'field list'

It looks like the Hive Metastore database schema may not have been upgraded correctly.
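To confirm this before changing anything, you can check which schema version the metastore database actually contains. On the QuickStart VM the metastore normally lives in the local MySQL database named metastore; the database name and the root password used below are assumptions, so adjust them to your environment:

mysql -u root -pcloudera -e "SELECT SCHEMA_VERSION, VERSION_COMMENT FROM metastore.VERSION;"
mysql -u root -pcloudera -e "SHOW COLUMNS FROM metastore.TBLS LIKE 'OWNER_TYPE';"

If SCHEMA_VERSION is older than the Hive release bundled with CDH 5.16.2, or the second query returns no row (TBLS has no OWNER_TYPE column, which is exactly what the exception complains about), the schema is behind the Hive binaries and needs the upgrade described in the steps below.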

 

Please try these steps (a command-line alternative is sketched right after them):

1. From Cloudera Manager, stop Hive service

2. On the UI, Clusters -> Hive -> Actions -> Upgrade Hive Metastore Database Schema

3. Start Hive service

4. Try your test again
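If Cloudera Manager is not running (which turned out to be part of the problem in this thread), the same upgrade can be attempted with Hive's schematool directly on the VM. This is only a sketch, not the documented QuickStart procedure; the service name and the /usr/lib/hive path are assumptions for a package-based install with a MySQL-backed metastore:

sudo service hive-metastore stop
# show the schema version the metastore DB has vs. the one Hive expects
/usr/lib/hive/bin/schematool -dbType mysql -info
# apply the upgrade scripts; back up the metastore database first (e.g. with mysqldump)
/usr/lib/hive/bin/schematool -dbType mysql -upgradeSchema
sudo service hive-metastore start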

 

Thanks and hope this helps,

Li

Li Wang, Technical Solution Manager




4 REPLIES

Master Mentor

@Alieer 

It looks like a whitespace issue in the command. Try splitting it into one option per line:

$ sqoop import-all-tables  \
-m 1  \
--connect jdbc:mysql://quickstart:3306/retail_db  \
--username=retail_dba  \
--password=cloudera  \
--compression-codec=snappy  \
--as-parquetfile  \
--warehouse-dir=/user/hive/warehouse  \
--hive-import 

 

Please try to copy and paste the above command.

 

 

 

New Contributor

Thank you! But that did not help. Still the same error.

Guru

@Alieer ,

 

Did you upgrade CM/CDH on your cluster recently?

 

From the error snippet below:

com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'A0.OWNER_TYPE' in 'field list'

It looks like the Hive Metastore database schema may not have been upgraded correctly.

 

Please try these steps:

1. From Cloudera Manager, stop Hive service

2. On the UI, Clusters -> Hive -> Actions -> Upgrade Hive Metastore Database Schema

3. Start Hive service

4. Try your test again

 

Thanks and hope this helps,

Li

Li Wang, Technical Solution Manager



New Contributor

Thank you very much! Your answer helped me find a way to solve the problem. I did not have Cloudera Manager running due to a lack of resources allocated to the virtual machine. After starting it, I began checking the warnings and configuration errors, and the problems disappeared one by one.

Now I understand how to identify problems and solve them (this is not in the tutorial). Thanks again!
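For anyone who lands here with the same symptom and no Cloudera Manager available, the /var/log/hive/hive-metastore.log file quoted in the question is the key diagnostic. A simple way to watch it while re-running the Sqoop command (the path is the QuickStart VM default already shown above):

sudo tail -f /var/log/hive/hive-metastore.log
# or just pull the most recent errors
sudo grep -i error /var/log/hive/hive-metastore.log | tail -n 20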