change datatype of external table

Expert Contributor

I have an external table that contains some string columns. Now I need to change the datatype of some of these columns, so I used:

ALTER TABLE table CHANGE col col type;

but this query gives me an error:

org.apache.spark.sql.AnalysisException: ALTER TABLE CHANGE COLUMN is not supported for changing column 'id' with type 'StringType' to 'id' with type 'LongType';

Any suggestion would be greatly appreciated, thanks.

2 REPLIES

Master Guru

@Yassine

Specify the type as BIGINT, which is equivalent to a long type; Hive doesn't have a LONG datatype.

hive> alter table table change col col bigint;

For more reference, see:

https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Types#LanguageManualTypes-IntegralTy...)
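For example, a minimal sketch of the full change, assuming a hypothetical table named my_table with a string column id (the DESCRIBE statements just verify the schema before and after):

hive> DESCRIBE my_table;                         -- id is string before the change
hive> ALTER TABLE my_table CHANGE id id BIGINT;  -- keep the column name, change the type to bigint
hive> DESCRIBE my_table;                         -- id is bigint after the change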

@Yassine

Looking at your log, it seems you are trying to change the datatype through Spark. Is this the case? If so, use a statement like:

val a = sqlContext.sql("alter table tableName change col col bigint")

As for the issue you are facing while converting the column's type, you need to understand the available datatypes and the implicit cast options between them.

So whenever you issue a command like:

alter table tableName change columnName columnName <newDataType>;

You need to understand that your Hive table's column may contain data that is stored as a string right now; if you cast the column to a datatype like INT, values that cannot be converted will not be accessible and will come back as NULL.
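For example, a quick illustration of that behavior in the Hive shell (the literals are just hypothetical sample values):

hive> SELECT CAST('42' AS BIGINT);         -- returns 42: the string is a valid number
hive> SELECT CAST('forty-two' AS BIGINT);  -- returns NULL: the string cannot be converted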

Check the Hive LanguageManual Types page linked in the reply above for Hive datatypes and the implicit cast options available.
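If you want to find the rows that would be affected before you alter the column, here is a hedged sketch (my_table and id are placeholders for your own table and column):

hive> SELECT id FROM my_table WHERE id IS NOT NULL AND CAST(id AS BIGINT) IS NULL;  -- values that would become NULL after the change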