<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: create table as select failed, but only when stored as parquet in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/336583#M232310</link>
    <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/95457"&gt;@vladenache&lt;/a&gt;&amp;nbsp;The issue seems to be with this field:&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;FieldSchema(name:80t_lab.fan_glo (q5), type:tinyint, comment:null)], properties:null)&lt;/LI-CODE&gt;&lt;P&gt;Please check the table schema for the above field and correct its name. The attached stack trace shows Hive trying to interpret q5 as a data type and failing because no such type exists.&lt;/P&gt;</description>
    <pubDate>Fri, 18 Feb 2022 08:59:01 GMT</pubDate>
    <dc:creator>tarak271</dc:creator>
    <dc:date>2022-02-18T08:59:01Z</dc:date>
    <item>
      <title>create table as select failed, but only when stored as parquet</title>
      <link>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/334965#M231914</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;A simple CREATE TABLE AS SELECT fails for some tables in my original_db and I don't know why.&lt;/P&gt;&lt;P&gt;The query I use is below. The error Hive returns is very long, but the recurring messages are "Vertex failed", "Error while processing row", and "No enum constant", as you can see below.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;However, when run without the "STORED AS PARQUET" option, the query works fine.&lt;/P&gt;&lt;P&gt;Any ideas what I can try so I can create the tables as compressed Parquet?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;beeline -e "SET parquet.compression=SNAPPY;
SET hive.exec.compress.output=true;
SET parquet.output.codec=snappy;
set hive.tez.container.size = 8192;


DROP table if exists DB.table;
CREATE table DB.table
STORED AS parquet

As


SELECT * FROM ORIGINAL_DB.table;"&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;ERROR : Status: Failed
ERROR : Vertex failed, vertexName=Map 1, vertexId=vertex_1640013186854_27946_1_00, diagnostics=[Task failed, taskId=task_1640013186854_27946_1_00_000004, diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) : attempt_1640013186854_27946_1_00_000004_0:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row

[...]

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: No enum constant org.apache.parquet.schema.OriginalType.q5&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 31 Jan 2022 13:03:48 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/334965#M231914</guid>
      <dc:creator>vladenache</dc:creator>
      <dc:date>2022-01-31T13:03:48Z</dc:date>
    </item>
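The "No enum constant org.apache.parquet.schema.OriginalType.q5" error above comes from the way Parquet parses its textual schema: field declarations are tokenized on whitespace and parentheses, so a column whose name itself contains " (q5)" leaves a stray token that gets looked up as a logical-type annotation. A minimal Python sketch of that mechanism (the `parse_field` helper below is an illustration of the idea, not Parquet's actual `MessageTypeParser` code):

```python
# Illustration of why a field name containing a space breaks Parquet's
# schema parsing. parse_field is a hypothetical stand-in that mimics the
# whitespace/parenthesis tokenization done by MessageTypeParser.
KNOWN_TYPES = {"UTF8", "MAP", "LIST", "DECIMAL", "DATE"}  # subset of OriginalType

def parse_field(declaration):
    """Parse a declaration like 'optional int32 name (ANNOTATION);'."""
    cleaned = declaration.strip().rstrip(";").replace("(", " ( ").replace(")", " ) ")
    tokens = cleaned.split()
    field = {"repetition": tokens[0], "type": tokens[1], "name": tokens[2]}
    if len(tokens) > 3 and tokens[3] == "(":
        annotation = tokens[4]
        if annotation not in KNOWN_TYPES:
            # Mirrors java.lang.Enum.valueOf failing on an unknown constant
            raise ValueError(f"No enum constant OriginalType.{annotation}")
        field["original_type"] = annotation
    return field

print(parse_field("optional int32 fan_glo_q5;"))   # sanitized name: parses fine
try:
    parse_field("optional int32 fan_glo (q5);")    # name with a space
except ValueError as exc:
    print(exc)                                     # No enum constant OriginalType.q5
```

With the space left in the name, "q5" is tokenized separately and looked up as a type annotation, and the lookup fails, which matches the exception in the stack trace.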
    <item>
      <title>Re: create table as select failed, but only when stored as parquet</title>
      <link>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/335138#M231931</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/95457"&gt;@vladenache&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;From the pasted stack trace, we can see that the enum constant q5 does not exist in the code base:&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;org.apache.parquet.schema.OriginalType&lt;/LI-CODE&gt;&lt;P&gt;Please refer to&amp;nbsp;&lt;A href="https://github.com/apache/parquet-mr/blob/master/parquet-column/src/main/java/org/apache/parquet/schema/OriginalType.java" target="_blank" rel="noopener"&gt;https://github.com/apache/parquet-mr/blob/master/parquet-column/src/main/java/org/apache/parquet/schema/OriginalType.java&lt;/A&gt;&lt;/P&gt;&lt;P&gt;So the problem could be one of the following:&amp;nbsp;&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;Incompatible versions of Parquet and Hive in use; please let us know which Hive and Parquet versions you are running (the Hadoop distribution version also helps)&lt;/LI&gt;&lt;LI&gt;A source data format issue; please share the schema of the source table, and sample data would also help&lt;/LI&gt;&lt;LI&gt;Please share the full stack trace so we can understand the code execution path while the query runs&lt;/LI&gt;&lt;/OL&gt;</description>
      <pubDate>Tue, 01 Feb 2022 06:20:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/335138#M231931</guid>
      <dc:creator>tarak271</dc:creator>
      <dc:date>2022-02-01T06:20:27Z</dc:date>
    </item>
    <item>
      <title>Re: create table as select failed, but only when stored as parquet</title>
      <link>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/335148#M231932</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/75567"&gt;@tarak271&lt;/a&gt;&amp;nbsp;, thanks for your answer&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Here are some more details:&lt;/P&gt;&lt;P&gt;- Hadoop 3.1.1.7.1.7.63-1&lt;/P&gt;&lt;P&gt;[ I can't manage to find the Parquet version currently in use ]&lt;/P&gt;&lt;P&gt;- The table comes initially from a Sqoop job, so maybe some invalid data slipped through. It has over 100 columns, so it is difficult to present them here (I can't attach a sample in my response), but there are only three types: int, string, double.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I also thought it might be an invalid value in one of the columns, but because the error message does not include a row number, it is impossible to see where that would be.&lt;/P&gt;</description>
      <pubDate>Tue, 01 Feb 2022 07:25:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/335148#M231932</guid>
      <dc:creator>vladenache</dc:creator>
      <dc:date>2022-02-01T07:25:08Z</dc:date>
    </item>
    <item>
      <title>Re: create table as select failed, but only when stored as parquet</title>
      <link>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/335459#M231949</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/95457"&gt;@vladenache&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you please share the next 50 lines after the line below&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: No enum constant org.apache.parquet.schema.OriginalType.q5&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;or the full output of the beeline console, which includes the full stack trace?&lt;/P&gt;</description>
      <pubDate>Wed, 02 Feb 2022 08:06:07 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/335459#M231949</guid>
      <dc:creator>tarak271</dc:creator>
      <dc:date>2022-02-02T08:06:07Z</dc:date>
    </item>
    <item>
      <title>Re: create table as select failed, but only when stored as parquet</title>
      <link>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/335465#M231951</link>
      <description>&lt;P&gt;It appears several times in the log.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Here are the first lines of the error, which also include that exception:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;ERROR : Vertex failed, vertexName=Map 1, vertexId=vertex_1640013186854_27946_1_00, diagnostics=[Task failed, taskId=task_1640013186854_27946_1_00_000004, diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) : attempt_1640013186854_27946_1_00_000004_0:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:296)
	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:250)
	at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:374)
	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:75)
	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:62)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)
	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:62)
	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:38)
	at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
	at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)
	at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:101)
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:76)
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:437)
	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:267)
	... 16 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:573)
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:92)
	... 19 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: No enum constant org.apache.parquet.schema.OriginalType.q5
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:829)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:1004)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937)
	at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937)
	at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:128)
	at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:152)
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:552)
	... 20 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: No enum constant org.apache.parquet.schema.OriginalType.q5
	at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:282)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:872)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:823)
	... 27 more
Caused by: java.lang.IllegalArgumentException: No enum constant org.apache.parquet.schema.OriginalType.q5
	at java.lang.Enum.valueOf(Enum.java:238)
	at org.apache.parquet.schema.OriginalType.valueOf(OriginalType.java:24)
	at org.apache.parquet.schema.MessageTypeParser.addPrimitiveType(MessageTypeParser.java:182)
	at org.apache.parquet.schema.MessageTypeParser.addType(MessageTypeParser.java:113)
	at org.apache.parquet.schema.MessageTypeParser.addGroupTypeFields(MessageTypeParser.java:101)
	at org.apache.parquet.schema.MessageTypeParser.parse(MessageTypeParser.java:94)
	at org.apache.parquet.schema.MessageTypeParser.parseMessageType(MessageTypeParser.java:84)
	at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriteSupport.getSchema(DataWritableWriteSupport.java:51)
	at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriteSupport.init(DataWritableWriteSupport.java:57)
	at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:418)
	at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:380)
	at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.&amp;lt;init&amp;gt;(ParquetRecordWriterWrapper.java:70)
	at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getParquerRecordWriterWrapper(MapredParquetOutputFormat.java:137)
	at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getHiveRecordWriter(MapredParquetOutputFormat.java:126)
	at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:294)
	at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:279)
	... 29 more
], TaskAttempt 1 failed, info=[Error: Error while running task ( failure ) : attempt_1640013186854_27946_1_00_000004_1:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:296)
	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:250)
	at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:374)
	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:75)
	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:62)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)
	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:62)
	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:38)
	at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
	at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)
	at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:101)
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:76)
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:437)
	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:267)
	... 16 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:573)
	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:92)
	... 19 more&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Also, here is the schema created&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;INFO  : Created Hive schema: Schema(fieldSchemas:[FieldSchema(name:80t_lab.time, type:string, comment:null), FieldSchema(name:80t_lab.trigger, type:int, comment:null), FieldSchema(name:80t_lab.singlstate, type:tinyint, comment:null), FieldSchema(name:80t_lab.temppi_usecfc, type:double, comment:null), FieldSchema(name:80t_lab.humipi_usecfd, type:double, comment:null), FieldSchema(name:80t_lab.locorshtempglo, type:double, comment:null), FieldSchema(name:80t_lab.supply_temp_unfiltered, type:double, comment:null), FieldSchema(name:80t_lab.temp_setpoint, type:double, comment:null), FieldSchema(name:80t_lab.fan_setpoint, type:double, comment:null), FieldSchema(name:80t_lab.temp_local, type:double, comment:null), FieldSchema(name:80t_lab.return_temp, type:double, comment:null), FieldSchema(name:80t_lab.ht_humi, type:double, comment:null), FieldSchema(name:80t_lab.ht_dewpcorr, type:double, comment:null), FieldSchema(name:80t_lab.fc_temp, type:double, comment:null), FieldSchema(name:80t_lab.bara_fan, type:tinyint, comment:null), FieldSchema(name:80t_lab.bara_cool, type:tinyint, comment:null), FieldSchema(name:80t_lab.bara_deh, type:tinyint, comment:null), FieldSchema(name:80t_lab.bara_heate, type:tinyint, comment:null), FieldSchema(name:80t_lab.bara_hum, type:tinyint, comment:null), FieldSchema(name:80t_lab.bara_fc, type:tinyint, comment:null), FieldSchema(name:80t_lab.c1a_onoff, type:tinyint, comment:null), FieldSchema(name:80t_lab.c1b_onoff, type:tinyint, comment:null), FieldSchema(name:80t_lab.c2a_onoff, type:tinyint, comment:null), FieldSchema(name:80t_lab.c2b_onoff, type:tinyint, comment:null), FieldSchema(name:80t_lab.c3a_onoff, type:tinyint, comment:null), FieldSchema(name:80t_lab.c3b_onoff, type:tinyint, comment:null), FieldSchema(name:80t_lab.c4a_onoff, type:tinyint, comment:null), 
FieldSchema(name:80t_lab.c4b_onoff, type:tinyint, comment:null), FieldSchema(name:80t_lab.manscrramp1, type:tinyint, comment:null), FieldSchema(name:80t_lab.manscrramp2, type:tinyint, comment:null), FieldSchema(name:80t_lab.manscrramp3, type:tinyint, comment:null), FieldSchema(name:80t_lab.manscrramp4, type:tinyint, comment:null), FieldSchema(name:80t_lab.sys_pi_temp, type:double, comment:null), FieldSchema(name:80t_lab.sys_pi_fan, type:double, comment:null), FieldSchema(name:80t_lab.sys_pi_humi, type:double, comment:null), FieldSchema(name:80t_lab.c1_scrolldiff, type:double, comment:null), FieldSchema(name:80t_lab.c2_scrolldiff, type:double, comment:null), FieldSchema(name:80t_lab.c3_scrolldiff, type:double, comment:null), FieldSchema(name:80t_lab.c4_scrolldiff, type:double, comment:null), FieldSchema(name:80t_lab.c1_disct, type:double, comment:null), FieldSchema(name:80t_lab.c2_disct, type:double, comment:null), FieldSchema(name:80t_lab.c3_disct, type:double, comment:null), FieldSchema(name:80t_lab.c4_disct, type:double, comment:null), FieldSchema(name:80t_lab.slp1_status, type:string, comment:null), FieldSchema(name:80t_lab.slp2_status, type:string, comment:null), FieldSchema(name:80t_lab.slp3_status, type:string, comment:null), FieldSchema(name:80t_lab.slp4_status, type:string, comment:null), FieldSchema(name:80t_lab.pb1_status, type:string, comment:null), FieldSchema(name:80t_lab.pb2_status, type:string, comment:null), FieldSchema(name:80t_lab.pb3_status, type:string, comment:null), FieldSchema(name:80t_lab.pb4_status, type:string, comment:null), FieldSchema(name:80t_lab.pb1opspeed, type:tinyint, comment:null), FieldSchema(name:80t_lab.pb2opspeed, type:tinyint, comment:null), FieldSchema(name:80t_lab.pb3opspeed, type:tinyint, comment:null), FieldSchema(name:80t_lab.pb4opspeed, type:tinyint, comment:null), FieldSchema(name:80t_lab.gc1sped_f1, type:tinyint, comment:null), FieldSchema(name:80t_lab.gc2sped_f1, type:tinyint, comment:null), 
FieldSchema(name:80t_lab.gc3sped_f1, type:tinyint, comment:null), FieldSchema(name:80t_lab.gc4sped_f1, type:tinyint, comment:null), FieldSchema(name:80t_lab.leadlag_swap, type:string, comment:null), FieldSchema(name:80t_lab.dxqs_status_glo, type:string, comment:null), FieldSchema(name:80t_lab.pb1_nextst, type:string, comment:null), FieldSchema(name:80t_lab.pb2_nextst, type:string, comment:null), FieldSchema(name:80t_lab.pb3_nextst, type:string, comment:null), FieldSchema(name:80t_lab.pb4_nextst, type:string, comment:null), FieldSchema(name:80t_lab.pre1_pump_request_mode, type:string, comment:null), FieldSchema(name:80t_lab.pre1_pumpstartupstatus, type:tinyint, comment:null), FieldSchema(name:80t_lab.pre1_diff_press, type:double, comment:null), FieldSchema(name:80t_lab.pre2_pump_request_mode, type:string, comment:null), FieldSchema(name:80t_lab.pre2_pumpstartupstatus, type:tinyint, comment:null), FieldSchema(name:80t_lab.pre2_diff_press, type:double, comment:null), FieldSchema(name:80t_lab.pre3_diff_press, type:double, comment:null), FieldSchema(name:80t_lab.pre4_diff_press, type:double, comment:null), FieldSchema(name:80t_lab.pre_early_cmprtomixed, type:tinyint, comment:null), FieldSchema(name:80t_lab.pre_early_mixedtopump, type:tinyint, comment:null), FieldSchema(name:80t_lab.pre_early_cmprtopump, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb1_tempamb, type:double, comment:null), FieldSchema(name:80t_lab.gcb1_cntrlstate, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb11_cntrlssrc, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb12_cntrlssrc, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb11_compstate, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb12_compstate, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb1_fan1actual, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb1_fan2actual, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb1_fan3actual, type:tinyint, comment:null), 
FieldSchema(name:80t_lab.gcb1_fan1spdrqst, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb1_fan2spdrqst, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb1_fan3spdrqst, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcc1_temprfrg, type:double, comment:null), FieldSchema(name:80t_lab.gcc1_tempcondset, type:double, comment:null), FieldSchema(name:80t_lab.gcc1_prescond, type:double, comment:null), FieldSchema(name:80t_lab.gcc1_prescondset, type:double, comment:null), FieldSchema(name:80t_lab.gcb1_temprefrig, type:double, comment:null), FieldSchema(name:80t_lab.gcb1_prescond, type:double, comment:null), FieldSchema(name:80t_lab.gcb1_prescondset, type:double, comment:null), FieldSchema(name:80t_lab.gcb2_tempamb, type:double, comment:null), FieldSchema(name:80t_lab.gcb2_cntrlstate, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb21_cntrlssrc, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb22_cntrlssrc, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb21_compstate, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb22_compstate, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb2_fan1actual, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb2_fan1spdrqst, type:tinyint, comment:null), FieldSchema(name:80t_lab.gcc2_temprfrg, type:double, comment:null), FieldSchema(name:80t_lab.gcc2_tempcondset, type:double, comment:null), FieldSchema(name:80t_lab.gcc2_prescond, type:double, comment:null), FieldSchema(name:80t_lab.gcc2_prescondset, type:double, comment:null), FieldSchema(name:80t_lab.pre1_out_p, type:double, comment:null), FieldSchema(name:80t_lab.pre1_inlet_p, type:double, comment:null), FieldSchema(name:80t_lab.pre1_inlet_t, type:double, comment:null), FieldSchema(name:80t_lab.pre1_out_t, type:double, comment:null), FieldSchema(name:80t_lab.gcc4_temprfrg, type:double, comment:null), FieldSchema(name:80t_lab.gcc4_prescond, type:double, comment:null), FieldSchema(name:80t_lab.eev1_reqstate, type:string, 
comment:null), FieldSchema(name:80t_lab.eev1_actstate, type:string, comment:null), FieldSchema(name:80t_lab.eev1_error_state, type:string, comment:null), FieldSchema(name:80t_lab.eev1_vcm_error_state, type:string, comment:null), FieldSchema(name:80t_lab.eev1_vcm_sensorerror, type:string, comment:null), FieldSchema(name:80t_lab.eev1_vcm_systemalarm, type:string, comment:null), FieldSchema(name:80t_lab.eev1_vcm_systemstategrp, type:string, comment:null), FieldSchema(name:80t_lab.eev1_vcm_press_psig, type:double, comment:null), FieldSchema(name:80t_lab.eev1_vcm_supht_degf, type:double, comment:null), FieldSchema(name:80t_lab.eev1_vcm_sh_setpt_degf, type:double, comment:null), FieldSchema(name:80t_lab.eev1_vcm_valvpos, type:double, comment:null), FieldSchema(name:80t_lab.eev1_vcm_valvposreq, type:double, comment:null), FieldSchema(name:80t_lab.eev1_vcm_valvcntrlen, type:tinyint, comment:null), FieldSchema(name:80t_lab.eev1_vcm_diginput, type:string, comment:null), FieldSchema(name:80t_lab.eev1_vcm_digoutput, type:string, comment:null), FieldSchema(name:80t_lab.eev2_reqstate_byte_1, type:string, comment:null), FieldSchema(name:80t_lab.eev2_actstate_byte_2, type:string, comment:null), FieldSchema(name:80t_lab.eev2_error_state_byte_3, type:string, comment:null), FieldSchema(name:80t_lab.eev2_vcm_error_state_byte_4, type:string, comment:null), FieldSchema(name:80t_lab.eev2_vcm_sensorerror, type:string, comment:null), FieldSchema(name:80t_lab.eev2_vcm_systemalarm, type:string, comment:null), FieldSchema(name:80t_lab.eev2_vcm_systemstategrp, type:string, comment:null), FieldSchema(name:80t_lab.eev2_vcm_press_psig, type:double, comment:null), FieldSchema(name:80t_lab.eev2_vcm_supht_degf, type:double, comment:null), FieldSchema(name:80t_lab.eev2_vcm_sh_setpt_degf, type:double, comment:null), FieldSchema(name:80t_lab.eev2_vcm_valvpos, type:double, comment:null), FieldSchema(name:80t_lab.eev2_vcm_valvposreq, type:double, comment:null), 
FieldSchema(name:80t_lab.eev2_vcm_valvcntrlen, type:tinyint, comment:null), FieldSchema(name:80t_lab.eev2_vcm_diginput, type:string, comment:null), FieldSchema(name:80t_lab.eev2_vcm_digoutput, type:string, comment:null), FieldSchema(name:80t_lab.pre2_out_p, type:double, comment:null), FieldSchema(name:80t_lab.pre2_inlet_p, type:double, comment:null), FieldSchema(name:80t_lab.pre2_inlet_t, type:double, comment:null), FieldSchema(name:80t_lab.pre2_out_t, type:double, comment:null), FieldSchema(name:80t_lab.eev4_supht, type:double, comment:null), FieldSchema(name:80t_lab.eev4_valvpos, type:tinyint, comment:null), FieldSchema(name:80t_lab.anaoutact1, type:int, comment:null), FieldSchema(name:80t_lab.anaoutact2, type:int, comment:null), FieldSchema(name:80t_lab.anaoutact3, type:int, comment:null), FieldSchema(name:80t_lab.anaoutact4, type:int, comment:null), FieldSchema(name:80t_lab.anain0, type:int, comment:null), FieldSchema(name:80t_lab.anain1, type:int, comment:null), FieldSchema(name:80t_lab.anain2, type:int, comment:null), FieldSchema(name:80t_lab.anain3, type:int, comment:null), FieldSchema(name:80t_lab.anain4, type:int, comment:null), FieldSchema(name:80t_lab.anain5, type:int, comment:null), FieldSchema(name:80t_lab.anain6, type:int, comment:null), FieldSchema(name:80t_lab.anain7, type:int, comment:null), FieldSchema(name:80t_lab.digou1_ss_q01, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou2_ss_q02, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou3_ss_q03, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou9i_ssq04, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou4_ss_q06, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou15_ssq08, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou14_ssq09, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou13_ssq10, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou12_ssq12, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou11_ssq13, 
type:tinyint, comment:null), FieldSchema(name:80t_lab.digou10_ssq14, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou16_ssq15, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou5_ss_q17, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou6_ss_q18, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou7a_ssk03, type:tinyint, comment:null), FieldSchema(name:80t_lab.digou0w_ssk11, type:tinyint, comment:null), FieldSchema(name:80t_lab.digin0_ss_u15, type:tinyint, comment:null), FieldSchema(name:80t_lab.digin1_ss_u16, type:tinyint, comment:null), FieldSchema(name:80t_lab.digin2_ss, type:tinyint, comment:null), FieldSchema(name:80t_lab.diginp0_ssu17, type:tinyint, comment:null), FieldSchema(name:80t_lab.diginp1_ssu18, type:tinyint, comment:null), FieldSchema(name:80t_lab.diginp2_ssu19, type:tinyint, comment:null), FieldSchema(name:80t_lab.diginp3_ssu20, type:tinyint, comment:null), FieldSchema(name:80t_lab.diginp4_ssu21, type:tinyint, comment:null), FieldSchema(name:80t_lab.diginp5_ssu22, type:tinyint, comment:null), FieldSchema(name:80t_lab.diginp6_ssu23, type:tinyint, comment:null), FieldSchema(name:80t_lab.digin13_ssu24, type:tinyint, comment:null), FieldSchema(name:80t_lab.diginau_ssu25, type:tinyint, comment:null), FieldSchema(name:80t_lab.diginhu_ssu26, type:tinyint, comment:null), FieldSchema(name:80t_lab.diginp7_ssu27, type:tinyint, comment:null), FieldSchema(name:80t_lab.digin14_ssu28, type:tinyint, comment:null), FieldSchema(name:80t_lab.digin15_ssu29, type:tinyint, comment:null), FieldSchema(name:80t_lab.eev_pre_st_glo, type:string, comment:null), FieldSchema(name:80t_lab.pre1_speedlimitrel, type:tinyint, comment:null), FieldSchema(name:80t_lab.pre2_speedlimitrel, type:tinyint, comment:null), FieldSchema(name:80t_lab.peeoopen, type:tinyint, comment:null), FieldSchema(name:80t_lab.peeost1, type:string, comment:null), FieldSchema(name:80t_lab.peeost2, type:string, comment:null), FieldSchema(name:80t_lab.gcb1_vsdspeedmodemax, 
type:tinyint, comment:null), FieldSchema(name:80t_lab.gcb2_vsdspeedmodemax, type:tinyint, comment:null), FieldSchema(name:80t_lab.peeosof1, type:tinyint, comment:null), FieldSchema(name:80t_lab.peeosof2, type:tinyint, comment:null), FieldSchema(name:80t_lab.fan_glo (q5), type:tinyint, comment:null)], properties:null)
INFO  : Completed compiling command(queryId=hive_20220131074416_e8fd6276-d028-42d0-9556-6d35bc7e1714); Time taken: 1.01 seconds&lt;/LI-CODE&gt;</description>
      <pubDate>Wed, 02 Feb 2022 08:44:23 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/335465#M231951</guid>
      <dc:creator>vladenache</dc:creator>
      <dc:date>2022-02-02T08:44:23Z</dc:date>
    </item>
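With 100+ columns it is hard to spot the bad name by eye. As a hedged sketch, the `FieldSchema(name:..., type:...)` entries in the log above can be scanned programmatically for names that Parquet's schema parser cannot handle (anything beyond letters, digits, underscores, and the table-prefix dot):

```python
import re

# Scan a Hive "Created Hive schema:" log excerpt for field names that
# Parquet's schema parser cannot handle. The FieldSchema(...) pattern
# matches the log format quoted in this thread.
FIELD = re.compile(r"FieldSchema\(name:([^,]+), type:")

def suspicious_fields(log):
    """Return field names containing characters beyond [A-Za-z0-9_.]."""
    return [n for n in FIELD.findall(log) if not re.fullmatch(r"[\w.]+", n)]

# Shortened excerpt of the schema dump above
log = ("Schema(fieldSchemas:[FieldSchema(name:80t_lab.time, type:string, comment:null), "
       "FieldSchema(name:80t_lab.fan_glo (q5), type:tinyint, comment:null)], properties:null)")
print(suspicious_fields(log))  # ['80t_lab.fan_glo (q5)']
```

Run against the full log line, this flags exactly one field, `80t_lab.fan_glo (q5)`, whose name contains a space and parentheses.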
    <item>
      <title>Re: create table as select failed, but only when stored as parquet</title>
      <link>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/336583#M232310</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/95457"&gt;@vladenache&lt;/a&gt;&amp;nbsp;The issue seems to be with this field:&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;FieldSchema(name:80t_lab.fan_glo (q5), type:tinyint, comment:null)], properties:null)&lt;/LI-CODE&gt;&lt;P&gt;Please check the table schema for the above field and correct its name. The attached stack trace shows Hive trying to interpret q5 as a data type and failing because no such type exists.&lt;/P&gt;</description>
      <pubDate>Fri, 18 Feb 2022 08:59:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/336583#M232310</guid>
      <dc:creator>tarak271</dc:creator>
      <dc:date>2022-02-18T08:59:01Z</dc:date>
    </item>
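As the answer above says, the fix is to rename the offending column to something Parquet-safe before running the CTAS. A small sketch (the helper names are hypothetical, and the generated statement assumes the column's type is tinyint, as shown in the schema dump) that checks a column name and emits the Hive rename statement:

```python
import re

# Parquet schema strings choke on spaces and parentheses in field names,
# so restrict names to a conservative lowercase [a-z0-9_] pattern.
SAFE_NAME = re.compile(r"^[a-z_][a-z0-9_]*$")

def sanitize(name):
    """Rewrite an unsafe column name into a Parquet-safe one."""
    return re.sub(r"[^a-z0-9_]+", "_", name.lower()).strip("_")

def rename_statement(table, old, col_type):
    """Return an ALTER TABLE ... CHANGE statement, or None if the name is fine."""
    if SAFE_NAME.match(old):
        return None
    return f"ALTER TABLE {table} CHANGE `{old}` {sanitize(old)} {col_type};"

print(rename_statement("original_db.table", "fan_glo (q5)", "tinyint"))
# ALTER TABLE original_db.table CHANGE `fan_glo (q5)` fan_glo_q5 tinyint;
```

`ALTER TABLE ... CHANGE` is Hive's column-rename syntax; since the table is re-imported by Sqoop, renaming the column at import time as well would keep the bad name from being reintroduced.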
    <item>
      <title>Re: create table as select failed, but only when stored as parquet</title>
      <link>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/336588#M232313</link>
      <description>&lt;P&gt;Thanks for noticing this, I missed it!&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The table is the result of a Sqoop job from MySQL, so I guess the column name had "(q5)" in it and somehow it ended up being parsed as a type?!&lt;/P&gt;</description>
      <pubDate>Fri, 18 Feb 2022 11:45:24 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/create-table-as-select-failed-but-only-when-stored-as/m-p/336588#M232313</guid>
      <dc:creator>vladenache</dc:creator>
      <dc:date>2022-02-18T11:45:24Z</dc:date>
    </item>
  </channel>
</rss>

