<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Convert a milliseconds data frame column into a Unix timestamp using Spark Scala in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/convert-milliseconds-data-frame-column-into-Unixtimestamp/m-p/192641#M154706</link>
    <description>&lt;A rel="user" href="https://community.cloudera.com/users/12747/swathidataengineer.html" nodeid="12747"&gt;@swathi thukkaraju&lt;/A&gt;&lt;P&gt;Since your &lt;STRONG&gt;timestamp&lt;/STRONG&gt; field is typed as a string, cast it to bigint or int as appropriate; then from_unixtime will work.&lt;/P&gt;&lt;P&gt;Possible outputs for your timestamp value &lt;STRONG&gt;1465876799&lt;/STRONG&gt;, which you can verify in the Hive (or Beeline) shell:&lt;/P&gt;&lt;PRE&gt;hive&amp;gt; select from_unixtime(1465876799, 'yyyy-MM-dd');&lt;BR /&gt;2016-06-13&lt;/PRE&gt;&lt;PRE&gt;hive&amp;gt; select from_unixtime(CAST(1465876799000 as int), 'yyyy-MM-dd');&lt;BR /&gt;2010-12-21&lt;/PRE&gt;&lt;PRE&gt;hive&amp;gt; select from_unixtime(CAST(1465876799000 as bigint), 'yyyy-MM-dd');
48421-10-14&lt;/PRE&gt;&lt;PRE&gt;hive&amp;gt; select from_unixtime(CAST(1465876799000/1000 as BIGINT), 'yyyy-MM-dd');&lt;BR /&gt;2016-06-13&lt;/PRE&gt;&lt;P&gt;Note that 1465876799000 is a value in milliseconds while from_unixtime expects seconds, so divide by 1000 first: the int cast overflows (hence 2010-12-21), and interpreting the millisecond value as seconds lands in the year 48421.&lt;/P&gt;&lt;STRONG&gt;&lt;U&gt;Error:-&lt;/U&gt;&lt;/STRONG&gt;&lt;PRE&gt;hive&amp;gt; select from_unixtime(CAST(1465876799000 as string), 'yyyy-MM-dd');
FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments ''yyyy-MM-dd'': No matching method for class org.apache.hadoop.hive.ql.udf.UDFFromUnixTime with (string, string). Possible choices: _FUNC_(bigint)  _FUNC_(bigint, string)  _FUNC_(int)  _FUNC_(int, string)&lt;/PRE&gt;&lt;P&gt;As shown above, casting 1465876799000 to string fails; the only argument types from_unixtime accepts are bigint and int.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;Possible query for your case:-&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;PRE&gt;val df = sqlContext.sql("select from_unixtime(cast(timestamp as bigint), 'yyyy-MM-dd') as ts from stamp")&lt;/PRE&gt;&lt;P&gt;(or)&lt;/P&gt;&lt;P&gt;Change the data type in the case class:&lt;/P&gt;&lt;PRE&gt;case class flight(display_id: Int, uuid: String, document_id: Int, timestamp: BigInt, platformgeo_location: String)&lt;BR /&gt;val df = sqlContext.sql("select from_unixtime(timestamp, 'yyyy-MM-dd') as ts from stamp")&lt;/PRE&gt;&lt;P&gt;Note the format string: in Hive's SimpleDateFormat patterns, lowercase 'yyyy' is the calendar year, while uppercase 'YYYY' is the week-based year and gives wrong dates around year boundaries.&lt;/P&gt;&lt;P&gt;All of the outputs above were verified in the Hive shell using the int and bigint data types; pick whichever best fits your case.&lt;/P&gt;</description>
    <pubDate>Fri, 10 Nov 2017 09:36:44 GMT</pubDate>
    <dc:creator>Shu_ashu</dc:creator>
    <dc:date>2017-11-10T09:36:44Z</dc:date>
    <item>
      <title>Convert a milliseconds data frame column into a Unix timestamp using Spark Scala?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/convert-milliseconds-data-frame-column-into-Unixtimestamp/m-p/192640#M154705</link>
      <description>&lt;P&gt;The event csv file looks like this:&lt;/P&gt;&lt;PRE&gt;|display_id|          uuid|document_id|timestamp|platformgeo_location|
|         1|cb8c55702adb93|     379743|       61|                   3|
|         2|79a85fa78311b9|    1794259|       81|                   2|
|         3|822932ce3d8757|    1179111|      182|                   2|
|         4|85281d0a49f7ac|    1777797|      234|                   2|&lt;/PRE&gt;&lt;P&gt;This is my Spark Scala code:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;import org.joda.time._ &lt;/LI&gt;&lt;LI&gt;case class flight(display_id: Int, uuid: String, document_id: Int, timestamp: String, platformgeo_location: String) &lt;/LI&gt;&lt;LI&gt;val streamdf = sc.textFile("/FileStore/tables/y6ak4fzq1504260076447/events.csv").map(_.split(",")).map(x =&amp;gt; flight(x(0).toInt, x(1).toString, x(2).toInt, x(3).toString, x(4).toString)).toDF() &lt;/LI&gt;&lt;LI&gt;streamdf.show() &lt;/LI&gt;&lt;LI&gt;streamdf.registerTempTable("event1") &lt;/LI&gt;&lt;LI&gt;val result = sqlContext.sql("select * from event1 limit 10")&lt;/LI&gt;&lt;LI&gt;val addP = (p: Int) =&amp;gt; udf( (x: Int) =&amp;gt; x + p )
val stamp = streamdf.withColumn("timestamp", addP(1465876799)($"timestamp")).toDF() &lt;/LI&gt;&lt;LI&gt;stamp.show()
stamp.registerTempTable("stamp")&lt;/LI&gt;&lt;LI&gt;&lt;STRONG&gt;new org.joda.time.DateTime(1465876799*1000)&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;&lt;STRONG&gt;val df = sqlContext.sql("select from_unixtime(timestamp,'YYYY-MM-dd') as 'ts' from stamp")&lt;/STRONG&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;When I execute the last command (in bold), I get a type mismatch error for &lt;STRONG&gt;val df&lt;/STRONG&gt;.&lt;/P&gt;&lt;P&gt;How can I resolve this problem? Please help me out.&lt;/P&gt;&lt;P&gt;Thanks in advance,&lt;/P&gt;&lt;P&gt;swathi.T&lt;/P&gt;</description>
      <pubDate>Thu, 09 Nov 2017 22:48:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/convert-milliseconds-data-frame-column-into-Unixtimestamp/m-p/192640#M154705</guid>
      <dc:creator>Former Member</dc:creator>
      <dc:date>2017-11-09T22:48:22Z</dc:date>
    </item>
    <item>
      <title>Re: Convert a milliseconds data frame column into a Unix timestamp using Spark Scala?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/convert-milliseconds-data-frame-column-into-Unixtimestamp/m-p/192641#M154706</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/12747/swathidataengineer.html" nodeid="12747"&gt;@swathi thukkaraju&lt;/A&gt;&lt;P&gt;Since your &lt;STRONG&gt;timestamp&lt;/STRONG&gt; field is typed as a string, cast it to bigint or int as appropriate; then from_unixtime will work.&lt;/P&gt;&lt;P&gt;Possible outputs for your timestamp value &lt;STRONG&gt;1465876799&lt;/STRONG&gt;, which you can verify in the Hive (or Beeline) shell:&lt;/P&gt;&lt;PRE&gt;hive&amp;gt; select from_unixtime(1465876799, 'yyyy-MM-dd');&lt;BR /&gt;2016-06-13&lt;/PRE&gt;&lt;PRE&gt;hive&amp;gt; select from_unixtime(CAST(1465876799000 as int), 'yyyy-MM-dd');&lt;BR /&gt;2010-12-21&lt;/PRE&gt;&lt;PRE&gt;hive&amp;gt; select from_unixtime(CAST(1465876799000 as bigint), 'yyyy-MM-dd');
48421-10-14&lt;/PRE&gt;&lt;PRE&gt;hive&amp;gt; select from_unixtime(CAST(1465876799000/1000 as BIGINT), 'yyyy-MM-dd');&lt;BR /&gt;2016-06-13&lt;/PRE&gt;&lt;P&gt;Note that 1465876799000 is a value in milliseconds while from_unixtime expects seconds, so divide by 1000 first: the int cast overflows (hence 2010-12-21), and interpreting the millisecond value as seconds lands in the year 48421.&lt;/P&gt;&lt;STRONG&gt;&lt;U&gt;Error:-&lt;/U&gt;&lt;/STRONG&gt;&lt;PRE&gt;hive&amp;gt; select from_unixtime(CAST(1465876799000 as string), 'yyyy-MM-dd');
FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments ''yyyy-MM-dd'': No matching method for class org.apache.hadoop.hive.ql.udf.UDFFromUnixTime with (string, string). Possible choices: _FUNC_(bigint)  _FUNC_(bigint, string)  _FUNC_(int)  _FUNC_(int, string)&lt;/PRE&gt;&lt;P&gt;As shown above, casting 1465876799000 to string fails; the only argument types from_unixtime accepts are bigint and int.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;Possible query for your case:-&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;PRE&gt;val df = sqlContext.sql("select from_unixtime(cast(timestamp as bigint), 'yyyy-MM-dd') as ts from stamp")&lt;/PRE&gt;&lt;P&gt;(or)&lt;/P&gt;&lt;P&gt;Change the data type in the case class:&lt;/P&gt;&lt;PRE&gt;case class flight(display_id: Int, uuid: String, document_id: Int, timestamp: BigInt, platformgeo_location: String)&lt;BR /&gt;val df = sqlContext.sql("select from_unixtime(timestamp, 'yyyy-MM-dd') as ts from stamp")&lt;/PRE&gt;&lt;P&gt;Note the format string: in Hive's SimpleDateFormat patterns, lowercase 'yyyy' is the calendar year, while uppercase 'YYYY' is the week-based year and gives wrong dates around year boundaries.&lt;/P&gt;&lt;P&gt;All of the outputs above were verified in the Hive shell using the int and bigint data types; pick whichever best fits your case.&lt;/P&gt;</description>
      <pubDate>Fri, 10 Nov 2017 09:36:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/convert-milliseconds-data-frame-column-into-Unixtimestamp/m-p/192641#M154706</guid>
      <dc:creator>Shu_ashu</dc:creator>
      <dc:date>2017-11-10T09:36:44Z</dc:date>
    </item>
  </channel>
</rss>