PySpark date intervals and between dates?

In Snowflake/SQL we can do:

SELECT * FROM myTbl 
WHERE date_col 
BETWEEN 
  CONVERT_TIMEZONE('UTC','America/Los_Angeles', some_date_string_col)::DATE - INTERVAL '7 DAY'
AND 
  CONVERT_TIMEZONE('UTC','America/Los_Angeles', some_date_string_col)::DATE - INTERVAL '1 DAY'

Is there a pyspark translation for this for dataframes?

I imagine it's something like this:


myDf.filter(
  col(date_col) >= to_utc_timestamp(...)
)

But how can we do BETWEEN and also the interval?

Solution:

You simply need to use the date_add (or date_sub) function here. Here's an example:

Input DF:

df.show()
#+----------+
#|  date_col|
#+----------+
#|2021-11-18|
#|2021-11-19|
#|2021-11-26|
#|2021-11-28|
#|2021-11-29|
#+----------+

Filtering dates using between with the date_add and current_date functions:

from pyspark.sql import functions as F

df1 = df.filter(
    F.col("date_col").between(
        F.date_add(F.current_date(), -7),
        F.date_add(F.current_date(), -1)
    )
)

df1.show()
#+----------+
#|  date_col|
#+----------+
#|2021-11-26|
#|2021-11-28|
#+----------+

You can, however, use INTERVAL within a SQL expression:

df1 = df.filter(
    F.col("date_col").between(
        F.expr("current_timestamp - interval 7 days"),
        F.expr("current_timestamp - interval 1 days"),
    )
)