Convert weird date strings into date format in PySpark

I have a CSV file with a date column. The dates come in a strange format. Below are some examples:

  • May the 9th of 2022
  • September the 17th of 2022
  • June the 09th of 2022

I am creating a Glue job to load data into Redshift.

How do I convert these weird-looking strings into YYYY-MM-DD format using a PySpark DataFrame?


>Solution:

You can use to_date and pass the source format: "MMMM 'the' d'th of' yyyy".

As pointed out by blackbishop, use "MMMM 'the' d['st']['nd']['rd']['th'] 'of' yyyy" to handle all ordinal suffixes (1st, 2nd, 3rd, 4th, …); the square brackets mark each suffix as an optional section, and the single-letter d accepts both one- and two-digit days (9th, 09th).

from pyspark.sql import SparkSession, functions as func

spark = SparkSession.builder.getOrCreate()  # needed outside a notebook/Glue shell

spark.sparkContext.parallelize([('May the 9th of 2022',), ('September the 17th of 2022',)]).toDF(['dt_str']). \
    withColumn('dt', func.to_date('dt_str', "MMMM 'the' d'th of' yyyy")). \
    show(truncate=False)

# +--------------------------+----------+
# |dt_str                    |dt        |
# +--------------------------+----------+
# |May the 9th of 2022       |2022-05-09|
# |September the 17th of 2022|2022-09-17|
# +--------------------------+----------+
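If you want to sanity-check the conversion outside Spark (say, while inspecting the raw CSV), the same idea can be sketched in plain Python: strip the "the … of" wrapper and the ordinal suffix with a regex, then parse what's left. parse_weird_date is a hypothetical helper, not part of the original answer:

```python
import re
from datetime import datetime

def parse_weird_date(s):
    # Drop "the" and the ordinal suffix (st/nd/rd/th), e.g.
    # "May the 9th of 2022" -> "May 9 2022"
    cleaned = re.sub(r'\bthe\s+(\d{1,2})(?:st|nd|rd|th)\s+of\b', r'\1', s)
    # %B = full month name, %d = day of month, %Y = 4-digit year
    return datetime.strptime(cleaned, '%B %d %Y').date().isoformat()

print(parse_weird_date('May the 9th of 2022'))         # 2022-05-09
print(parse_weird_date('September the 17th of 2022'))  # 2022-09-17
print(parse_weird_date('June the 09th of 2022'))       # 2022-06-09
```

This is only a local sketch; in the Glue job itself, the to_date pattern above keeps the work inside Spark and avoids a Python UDF.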