<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Troubles with Tutorial Exercise 1 [Metastore connection issue] in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Troubles-with-Tutorial-Exercise-1-Metastore-connection-issue/m-p/279601#M208352</link>
    <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/70015"&gt;@Alieer&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Did you upgrade CM/CDH on your cluster recently?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;From the error snippet below:&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'A0.OWNER_TYPE' in 'field list'&lt;/LI-CODE&gt;
&lt;P&gt;&lt;SPAN&gt;It looks like the Hive Metastore database schema may not have been upgraded correctly.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Please try these steps:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;1. From Cloudera Manager, stop the Hive service.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;2. In the UI, go to Clusters -&amp;gt; Hive -&amp;gt; Actions -&amp;gt; Upgrade Hive Metastore Database Schema.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;3. Start the Hive service.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;4. Run your test again.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Thanks and hope this helps,&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Li&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Mon, 07 Oct 2019 18:17:30 GMT</pubDate>
    <dc:creator>lwang</dc:creator>
    <dc:date>2019-10-07T18:17:30Z</dc:date>
    <item>
      <title>Troubles with Tutorial Exercise 1 [Metastore connection issue]</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Troubles-with-Tutorial-Exercise-1-Metastore-connection-issue/m-p/278819#M208256</link>
      <description>&lt;P&gt;Hi Cloudera Community!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I'm trying to start working through the tutorial but cannot get past the following problem:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;[cloudera@quickstart java]$ sqoop import-all-tables -m 1 --connect jdbc:mysql://quickstart:3306/retail_db --username=retail_dba --password=cloudera --compression-codec=snappy --as-parquetfile --warehouse-dir=/user/hive/warehouse --hive-import&lt;BR /&gt;19/10/04 01:46:39 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.16.2&lt;BR /&gt;19/10/04 01:46:39 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.&lt;BR /&gt;19/10/04 01:46:39 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override&lt;BR /&gt;19/10/04 01:46:39 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.&lt;BR /&gt;19/10/04 01:46:39 WARN tool.BaseSqoopTool: It seems that you're doing hive import directly into default&lt;BR /&gt;19/10/04 01:46:39 WARN tool.BaseSqoopTool: hive warehouse directory which is not supported. Sqoop is&lt;BR /&gt;19/10/04 01:46:39 WARN tool.BaseSqoopTool: firstly importing data into separate directory and then&lt;BR /&gt;19/10/04 01:46:39 WARN tool.BaseSqoopTool: inserting data into hive. 
Please consider removing&lt;BR /&gt;19/10/04 01:46:39 WARN tool.BaseSqoopTool: --target-dir or --warehouse-dir into /user/hive/warehouse in&lt;BR /&gt;19/10/04 01:46:39 WARN tool.BaseSqoopTool: case that you will detect any issues.&lt;BR /&gt;19/10/04 01:46:39 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.&lt;BR /&gt;19/10/04 01:46:40 INFO tool.CodeGenTool: Beginning code generation&lt;BR /&gt;19/10/04 01:46:40 INFO tool.CodeGenTool: Will generate java class as codegen_categories&lt;BR /&gt;19/10/04 01:46:40 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;19/10/04 01:46:40 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;19/10/04 01:46:40 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce&lt;BR /&gt;Note: /tmp/sqoop-cloudera/compile/79ae3f67fb6eadb1c64cd69baf6f38c8/codegen_categories.java uses or overrides a deprecated API.&lt;BR /&gt;Note: Recompile with -Xlint:deprecation for details.&lt;BR /&gt;19/10/04 01:46:43 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/79ae3f67fb6eadb1c64cd69baf6f38c8/codegen_categories.jar&lt;BR /&gt;19/10/04 01:46:43 WARN manager.MySQLManager: It looks like you are importing from mysql.&lt;BR /&gt;19/10/04 01:46:43 WARN manager.MySQLManager: This transfer can be faster! Use the --direct&lt;BR /&gt;19/10/04 01:46:43 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.&lt;BR /&gt;19/10/04 01:46:43 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)&lt;BR /&gt;19/10/04 01:46:43 INFO mapreduce.ImportJobBase: Beginning import of categories&lt;BR /&gt;19/10/04 01:46:43 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address&lt;BR /&gt;19/10/04 01:46:43 INFO Configuration.deprecation: mapred.jar is deprecated. 
Instead, use mapreduce.job.jar&lt;BR /&gt;19/10/04 01:46:45 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;19/10/04 01:46:45 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;19/10/04 01:46:46 INFO hive.metastore: Trying to connect to metastore with URI thrift://127.0.0.1:9083&lt;BR /&gt;19/10/04 01:46:46 INFO hive.metastore: Opened a connection to metastore, current connections: 1&lt;BR /&gt;19/10/04 01:46:46 INFO hive.metastore: Connected to metastore.&lt;BR /&gt;19/10/04 01:47:06 INFO hive.metastore: Closed a connection to metastore, current connections: 0&lt;BR /&gt;19/10/04 01:47:06 INFO hive.metastore: Trying to connect to metastore with URI thrift://127.0.0.1:9083&lt;BR /&gt;19/10/04 01:47:06 INFO hive.metastore: Opened a connection to metastore, current connections: 1&lt;BR /&gt;19/10/04 01:47:06 INFO hive.metastore: Connected to metastore.&lt;BR /&gt;19/10/04 01:47:26 ERROR sqoop.Sqoop: Got exception running Sqoop: org.kitesdk.data.DatasetOperationException: Hive MetaStore exception&lt;BR /&gt;org.kitesdk.data.DatasetOperationException: Hive MetaStore exception&lt;BR /&gt;at org.kitesdk.data.spi.hive.MetaStoreUtil.tableExists(MetaStoreUtil.java:190)&lt;BR /&gt;at org.kitesdk.data.spi.hive.MetaStoreUtil.exists(MetaStoreUtil.java:396)&lt;BR /&gt;at&lt;/P&gt;
&lt;LI-SPOILER&gt;org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)&lt;BR /&gt;at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)&lt;BR /&gt;at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.exists(HiveAbstractMetadataProvider.java:159)&lt;BR /&gt;at org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.exists(FileSystemDatasetRepository.java:262)&lt;BR /&gt;at org.kitesdk.data.Datasets.exists(Datasets.java:629)&lt;BR /&gt;at org.kitesdk.data.Datasets.exists(Datasets.java:646)&lt;BR /&gt;at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:117)&lt;BR /&gt;at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:267)&lt;BR /&gt;at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:691)&lt;BR /&gt;at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)&lt;BR /&gt;at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:513)&lt;BR /&gt;at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:105)&lt;BR /&gt;at org.apache.sqoop.Sqoop.run(Sqoop.java:147)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)&lt;BR /&gt;at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)&lt;BR /&gt;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)&lt;BR /&gt;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)&lt;BR /&gt;at org.apache.sqoop.Sqoop.main(Sqoop.java:252)&lt;BR /&gt;Caused by: MetaException(message:Exception thrown when executing query)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:37244)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:37221)&lt;BR /&gt;at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:37152)&lt;BR /&gt;at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1294)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1280)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.tableExists(HiveMetaStoreClient.java:1359)&lt;BR /&gt;at org.kitesdk.data.spi.hive.MetaStoreUtil$2.call(MetaStoreUtil.java:181)&lt;BR /&gt;at org.kitesdk.data.spi.hive.MetaStoreUtil$2.call(MetaStoreUtil.java:178)&lt;BR /&gt;at org.kitesdk.data.spi.hive.MetaStoreUtil.doWithRetry(MetaStoreUtil.java:70)&lt;BR /&gt;at org.kitesdk.data.spi.hive.MetaStoreUtil.tableExists(MetaStoreUtil.java:186)&lt;BR /&gt;... 19 more&lt;/LI-SPOILER&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;And in /var/log/hive/hive-metastore.log :&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;2019-10-04 01:47:24,583 ERROR [pool-4-thread-10]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invokeInternal(207)) - Retrying HMSHandler after 2000 ms (attempt 10 of 10) with error: javax.jdo.JDOException: Exception thrown when executing query&lt;BR /&gt;at&lt;/P&gt;
&lt;LI-SPOILER&gt;
&lt;P&gt;org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:275)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:1217)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:1024)&lt;BR /&gt;at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)&lt;BR /&gt;at com.sun.proxy.$Proxy8.getTable(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1950)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1905)&lt;BR /&gt;at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:140)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:99)&lt;BR /&gt;at com.sun.proxy.$Proxy10.get_table(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10128)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10112)&lt;BR /&gt;at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)&lt;BR /&gt;at 
org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:415)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)&lt;BR /&gt;at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)&lt;BR /&gt;at java.lang.Thread.run(Thread.java:745)&lt;BR /&gt;NestedThrowablesStackTrace:&lt;BR /&gt;com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'A0.OWNER_TYPE' in 'field list'&lt;BR /&gt;at sun.reflect.GeneratedConstructorAccessor16.newInstance(Unknown Source)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)&lt;BR /&gt;at com.mysql.jdbc.Util.getInstance(Util.java:360)&lt;BR /&gt;at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:978)&lt;BR /&gt;at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)&lt;BR /&gt;at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)&lt;BR /&gt;at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2435)&lt;BR /&gt;at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582)&lt;BR /&gt;at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2530)&lt;BR /&gt;at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1907)&lt;BR /&gt;at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2030)&lt;BR /&gt;at 
com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:174)&lt;BR /&gt;at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:381)&lt;BR /&gt;at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:504)&lt;BR /&gt;at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:651)&lt;BR /&gt;at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)&lt;BR /&gt;at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:266)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:1217)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:1024)&lt;BR /&gt;at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)&lt;BR /&gt;at com.sun.proxy.$Proxy8.getTable(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1950)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1905)&lt;BR /&gt;at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:140)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:99)&lt;BR /&gt;at com.sun.proxy.$Proxy10.get_table(Unknown Source)&lt;BR /&gt;at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10128)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10112)&lt;BR /&gt;at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:415)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)&lt;BR /&gt;at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)&lt;BR /&gt;at java.lang.Thread.run(Thread.java:745)&lt;/P&gt;
&lt;P&gt;2019-10-04 01:47:26,585 ERROR [pool-4-thread-10]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invokeInternal(199)) - HMSHandler Fatal error: javax.jdo.JDOException: Exception thrown when executing query&lt;BR /&gt;at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:275)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:1217)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:1024)&lt;BR /&gt;at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)&lt;BR /&gt;at com.sun.proxy.$Proxy8.getTable(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1950)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1905)&lt;BR /&gt;at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:140)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:99)&lt;BR /&gt;at com.sun.proxy.$Proxy10.get_table(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10128)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10112)&lt;BR /&gt;at 
org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:415)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)&lt;BR /&gt;at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)&lt;BR /&gt;at java.lang.Thread.run(Thread.java:745)&lt;BR /&gt;NestedThrowablesStackTrace:&lt;BR /&gt;com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'A0.OWNER_TYPE' in 'field list'&lt;BR /&gt;at sun.reflect.GeneratedConstructorAccessor16.newInstance(Unknown Source)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)&lt;BR /&gt;at com.mysql.jdbc.Util.getInstance(Util.java:360)&lt;BR /&gt;at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:978)&lt;BR /&gt;at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)&lt;BR /&gt;at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)&lt;BR /&gt;at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2435)&lt;BR /&gt;at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582)&lt;BR /&gt;at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2530)&lt;BR /&gt;at 
com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1907)&lt;BR /&gt;at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2030)&lt;BR /&gt;at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:174)&lt;BR /&gt;at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:381)&lt;BR /&gt;at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:504)&lt;BR /&gt;at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:651)&lt;BR /&gt;at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)&lt;BR /&gt;at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:266)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:1217)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:1024)&lt;BR /&gt;at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)&lt;BR /&gt;at com.sun.proxy.$Proxy8.getTable(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1950)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1905)&lt;BR /&gt;at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:140)&lt;BR /&gt;at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:99)&lt;BR /&gt;at com.sun.proxy.$Proxy10.get_table(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10128)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$get_table.getResult(ThriftHiveMetastore.java:10112)&lt;BR /&gt;at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:415)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)&lt;BR /&gt;at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)&lt;BR /&gt;at java.lang.Thread.run(Thread.java:745)&lt;/P&gt;
&lt;/LI-SPOILER&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Please help me! What is the problem, and how can I fix it?&lt;/P&gt;</description>
      <pubDate>Fri, 04 Oct 2019 14:25:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Troubles-with-Tutorial-Exercise-1-Metastore-connection-issue/m-p/278819#M208256</guid>
      <dc:creator>Alieer</dc:creator>
      <dc:date>2019-10-04T14:25:08Z</dc:date>
    </item>
    <item>
      <title>Re: Troubles with Tutorial Exercise 1 [Metastore connection issue]</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Troubles-with-Tutorial-Exercise-1-Metastore-connection-issue/m-p/278853#M208282</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/70015"&gt;@Alieer&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;It looks likes&amp;nbsp; white&amp;nbsp; space issue between the&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT size="3" color="#FF6600"&gt;$ sqoop import-all-tables&amp;nbsp; \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="3" color="#FF6600"&gt;-m 1&amp;nbsp; \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="3" color="#FF6600"&gt;--connect jdbc:mysql://quickstart:3306/retail_db&amp;nbsp; \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="3" color="#FF6600"&gt;--username=retail_dba&amp;nbsp; \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="3" color="#FF6600"&gt;--password=cloudera&amp;nbsp; \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="3" color="#FF6600"&gt;--compression-codec=snappy&amp;nbsp; \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="3" color="#FF6600"&gt;--as-parquetfile&amp;nbsp; \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="3" color="#FF6600"&gt;--warehouse-dir=/user/hive/warehouse&amp;nbsp; \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="3" color="#FF6600"&gt;--hive-import&amp;nbsp;&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please try to copy and paste the above&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 04 Oct 2019 19:54:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Troubles-with-Tutorial-Exercise-1-Metastore-connection-issue/m-p/278853#M208282</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2019-10-04T19:54:56Z</dc:date>
    </item>
    <item>
      <title>Re: Troubles with Tutorial Exercise 1 [Metastore connection issue]</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Troubles-with-Tutorial-Exercise-1-Metastore-connection-issue/m-p/279361#M208336</link>
      <description>&lt;P&gt;Thank you! But that did not help. Still the&amp;nbsp;same error.&lt;/P&gt;</description>
      <pubDate>Mon, 07 Oct 2019 07:57:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Troubles-with-Tutorial-Exercise-1-Metastore-connection-issue/m-p/279361#M208336</guid>
      <dc:creator>Alieer</dc:creator>
      <dc:date>2019-10-07T07:57:46Z</dc:date>
    </item>
    <item>
      <title>Re: Troubles with Tutorial Exercise 1 [Metastore connection issue]</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Troubles-with-Tutorial-Exercise-1-Metastore-connection-issue/m-p/279601#M208352</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/70015"&gt;@Alieer&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Did you upgrade CM/CDH on your cluster recently?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;From the error snippet below:&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'A0.OWNER_TYPE' in 'field list'&lt;/LI-CODE&gt;
&lt;P&gt;&lt;SPAN&gt;It looks like the Hive Metastore database schema may not have been upgraded correctly.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Please try these steps:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;1. From Cloudera Manager, stop the Hive service.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;2. In the UI, go to Clusters -&amp;gt; Hive -&amp;gt; Actions -&amp;gt; Upgrade Hive Metastore Database Schema.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;3. Start the Hive service.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;4. Run your test again.&lt;/SPAN&gt;&lt;/P&gt;
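&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;If you want to verify the schema state before and after, Hive also ships a schematool utility that reports the schema version recorded in the metastore database. This is a sketch only: the path below assumes a CDH QuickStart-style install, so adjust it for your environment.&lt;/SPAN&gt;&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;# Show the metastore schema version recorded in the backing MySQL database
# (path assumes a CDH QuickStart-style layout; adjust for your install)
/usr/lib/hive/bin/schematool -dbType mysql -info

# If the reported version is older than the installed Hive version, the
# Cloudera Manager action above (or: schematool -dbType mysql -upgradeSchema)
# brings the schema up to date&lt;/LI-CODE&gt;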
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Thanks and hope this helps,&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Li&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 07 Oct 2019 18:17:30 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Troubles-with-Tutorial-Exercise-1-Metastore-connection-issue/m-p/279601#M208352</guid>
      <dc:creator>lwang</dc:creator>
      <dc:date>2019-10-07T18:17:30Z</dc:date>
    </item>
    <item>
      <title>Re: Troubles with Tutorial Exercise 1 [Metastore connection issue]</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Troubles-with-Tutorial-Exercise-1-Metastore-connection-issue/m-p/279636#M208376</link>
      <description>&lt;P&gt;Thank you very much! Your answer helped me find a way to solve the problem. I did not have Cloudera Manager running due to a lack of resources allocated to the virtual machine. After starting, I began to check warnings and configuration errors, and problems began to disappear one by one.&lt;/P&gt;&lt;P&gt;Now I understand how to identify problems and solve them (this is not in the tutorial). Thanks again!&lt;/P&gt;</description>
      <pubDate>Tue, 08 Oct 2019 07:35:33 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Troubles-with-Tutorial-Exercise-1-Metastore-connection-issue/m-p/279636#M208376</guid>
      <dc:creator>Alieer</dc:creator>
      <dc:date>2019-10-08T07:35:33Z</dc:date>
    </item>
  </channel>
</rss>

