Support Questions

Facing error while inserting data into a map column with Spark SQL in Spark 2.3

New Contributor

Hello there,
I have INSERT OVERWRITE query . The target table has one of the column as map data-type. I want to insert values in that. While running the query, I'm facing the below error. Please assist. Thanks!

Query ran:

INSERT OVERWRITE TABLE table2 PARTITION (year)
SELECT map(" ", " "),
       '#',
       -99
FROM table1;

Error faced:

org.apache.spark.sql.AnalysisException: Cannot have map type columns in DataFrame which calls set operations(intersect, except, etc.), but the type of column map( , ) is map<string,string>;;
InsertIntoHiveTable ....
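For context, this message comes from Spark's analysis check that rejects any plan comparing or deduplicating rows on a map-type column, since map values have no defined equality or ordering in Spark SQL. The query above contains no explicit set operation, so the check is apparently being tripped internally by the InsertIntoHiveTable plan shown in the traceback. A minimal sketch of the underlying restriction itself, assuming a hypothetical table `t`, would be:

```sql
-- Hypothetical table with a map column, for illustration only.
CREATE TABLE t (m MAP<STRING, STRING>);

-- Rejected by the analyzer: EXCEPT must compare rows, and map
-- columns are not comparable, producing the same AnalysisException
-- ("Cannot have map type columns in DataFrame which calls set
-- operations...").
SELECT m FROM t
EXCEPT
SELECT m FROM t;

-- Also rejected, for the same reason: DISTINCT deduplicates rows,
-- which requires comparing the map column.
SELECT DISTINCT m FROM t;

-- Plain projection of a map is fine, since no row comparison occurs:
SELECT map(" ", " ") AS m FROM t;
```

So the map construction in the SELECT itself is legal; the failure seems specific to how Spark 2.3's Hive insert path plans the partitioned overwrite.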