@swathi thukkaraju Since your timestamp field's data type is string, cast it to bigint or int as your requirements dictate, and from_unixtime will work.
Possible outputs for your timestamp value 1465876799, which you can check in the hive or beeline shell:
hive> select from_unixtime(1465876799, 'yyyy-MM-dd');
2016-06-13
hive> select from_unixtime(CAST(1465876799000 as int), 'yyyy-MM-dd');
2010-12-21
hive> select from_unixtime(CAST(1465876799000 as bigint), 'yyyy-MM-dd');
48421-10-14
hive> select from_unixtime(CAST(1465876799000/1000 as BIGINT), 'yyyy-MM-dd');
2016-06-13
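Two of those results deserve a note. The 2010-12-21 date comes from integer overflow: 1465876799000 does not fit in a 32-bit int, so the cast keeps only the low 32 bits of the value (1465876799000 % 2^32 = 1292951064). The 48421-10-14 date appears because the bigint cast treats the millisecond value as seconds, landing roughly 46,000 years in the future; dividing by 1000 first, as in the last query, fixes that. You can confirm the overflow in the same shell:
hive> select 1465876799000 % 4294967296;
1292951064
hive> select from_unixtime(1292951064, 'yyyy-MM-dd');
2010-12-21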
Error case:
hive> select from_unixtime(CAST(1465876799000 as string), 'yyyy-MM-dd');
FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments ''yyyy-MM-dd'': No matching method for class org.apache.hadoop.hive.ql.udf.UDFFromUnixTime with (string, string). Possible choices: _FUNC_(bigint) _FUNC_(bigint, string) _FUNC_(int) _FUNC_(int, string)
As you can see above, I cast 1465876799000 to string, but it fails: from_unixtime accepts only bigint or int arguments.
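The fix is to cast the string to bigint (or int) before from_unixtime sees it; checking the same literal in the shell:
hive> select from_unixtime(CAST('1465876799' as bigint), 'yyyy-MM-dd');
2016-06-13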
Possible query for your case (note the lowercase yyyy in the pattern; uppercase YYYY means week-year in Java date formats and can give wrong results around year boundaries):
val df = sqlContext.sql("select from_unixtime(cast(`timestamp` as bigint), 'yyyy-MM-dd') as ts from stamp")
(or)
change the data type in the case class (Scala's Long maps to bigint):
case class flight(display_id: Int, uuid: String, document_id: Int, timestamp: Long, platformgeo_location: String)
val df = sqlContext.sql("select from_unixtime(`timestamp`, 'yyyy-MM-dd') as ts from stamp")
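For context, here is a minimal end-to-end sketch of the case-class route (Spark 1.5+ API, where from_unixtime is built in, matching the sqlContext style above; the sample row is illustrative, sc is the SparkContext from spark-shell, and timestamp is backquoted because it is also a SQL keyword):
import org.apache.spark.sql.SQLContext

case class flight(display_id: Int, uuid: String, document_id: Int,
                  timestamp: Long, platformgeo_location: String)

val sqlContext = new SQLContext(sc)  // sc: SparkContext provided by spark-shell
import sqlContext.implicits._

// Illustrative sample row; timestamp holds epoch seconds
val rdd = sc.parallelize(Seq(flight(1, "uuid-1", 10, 1465876799L, "US")))
rdd.toDF().registerTempTable("stamp")

val df = sqlContext.sql("select from_unixtime(`timestamp`, 'yyyy-MM-dd') as ts from stamp")
df.show()  // should print ts = 2016-06-13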
All of the outputs above come from testing in the hive shell with the int and bigint data types; pick whichever approach best fits your case.