04-04-2022
07:07 AM
Spark SQL complex datatypes (struct, array, map) are used for complex or custom requirements where you want to impose a schema on unstructured data, and sometimes on semi-structured or structured data as well. You would also use them in custom UDFs, where you apply windowed operations and write your own advanced logic, and in Spark SQL you can explode a complex structure into DataFrame columns. The use case varies with the requirement, but the underlying concept is the same as in any programming language: handle data with the data structure designed for that specific shape.
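A minimal sketch of the explode pattern described above, assuming a Scala Spark session; the `Order` case class and column names are made up for illustration:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, explode}

object ComplexTypesExample {
  // Hypothetical nested element: each user carries an array of these structs.
  case class Order(id: String, qty: Int)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("complex-types").master("local[*]").getOrCreate()
    import spark.implicits._

    // An array-of-struct column, i.e. a complex datatype on each row.
    val df = Seq(
      ("alice", Seq(Order("o1", 2), Order("o2", 5))),
      ("bob",   Seq(Order("o3", 1)))
    ).toDF("user", "orders")

    // explode() produces one row per array element; the struct's fields
    // are then reachable with dot notation.
    df.select(col("user"), explode(col("orders")).as("order"))
      .select($"user", $"order.id", $"order.qty")
      .show()

    spark.stop()
  }
}
```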
04-04-2022
06:43 AM
Increase the spark.kryoserializer.buffer.max property value according to the required size; by default it is 64 MB. I got the same exception, re-ran the job with a larger value, and it completed successfully.
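A sketch of where that property can be set, assuming the value is raised at session build time (the 512m figure is only an example, not a recommendation):

```scala
import org.apache.spark.sql.SparkSession

// Raise the Kryo serialization buffer ceiling above the 64m default.
// Values use Spark's size notation, e.g. "128m", "512m", "1g".
val spark = SparkSession.builder()
  .appName("kryo-buffer-example")
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .config("spark.kryoserializer.buffer.max", "512m")
  .getOrCreate()
```

The same property can also be passed on the command line with `--conf spark.kryoserializer.buffer.max=512m` at submit time.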