How to convert a date string to a timestamp in Spark?

%scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, to_date}

Seq(("20110813"),("20090724")).toDF("Date").select(
  col("Date"),
  to_date(col("Date"),"yyyy-mm-dd").as("to_date")
).show()

+--------+-------+
|    Date|to_date|
+--------+-------+
|20110813|   null|
|20090724|   null|
+--------+-------+

Seq(("20110813"),("20090724")).toDF("Date").select(
  col("Date"),
  to_date(col("Date"),"yyyymmdd").as("to_date")
).show()

+--------+----------+
|    Date|   to_date|
+--------+----------+
|20110813|2011-01-13|
|20090724|2009-01-24|
+--------+----------+

I am trying to convert a string to a timestamp, but I always get null or incorrect date values back.

Solution:


The problem is the format pattern, not the column. Spark's datetime patterns are case-sensitive: mm means minutes, while MM means month, and the pattern must match the input string exactly. "yyyy-mm-dd" fails because your input has no dashes (hence the nulls), and "yyyymmdd" parses the month digits as minutes, so the month defaults to January. Use "yyyyMMdd", for example with withColumn to add the parsed column:

val df = Seq(("20110813"),("20090724")).toDF("Date")
val newDf = df.withColumn("to_date", to_date(col("Date"), "yyyyMMdd"))
newDf.show()
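Since the question asks for a timestamp rather than a date, note that to_date returns a DateType column; to_timestamp accepts the same pattern letters and yields a TimestampType column instead. A minimal runnable sketch, assuming a local SparkSession (in a notebook, `spark` already exists and the builder lines can be dropped):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, to_timestamp}

object ToTimestampExample {
  def main(args: Array[String]): Unit = {
    // Local session for illustration only.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("to-timestamp-example")
      .getOrCreate()
    import spark.implicits._

    val df = Seq("20110813", "20090724").toDF("Date")
    // "yyyyMMdd" matches the input exactly; with no time fields in the
    // pattern, the time-of-day defaults to 00:00:00.
    df.withColumn("ts", to_timestamp(col("Date"), "yyyyMMdd")).show()

    spark.stop()
  }
}
```
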