Calculating time intervals considering business days and 24×7

I have the following PostgreSQL structure and data:

CREATE TABLE requests (
    entity_id INTEGER,
    type_de_demande TEXT,
    phase TEXT,
    time BIGINT,
    next_time BIGINT
);

INSERT INTO requests (entity_id, type_de_demande, phase, time, next_time)
VALUES
    (3402670, 'SUPPORTREQUEST', 'Log', 1684836635813, 1684836637569),
    (3402670, 'SUPPORTREQUEST', 'Classify', 1684836637569, 1684836638257),
    (3402670, 'SUPPORTREQUEST', 'FirstLineSupport', 1684836638257, 1685090241035),
    (3402670, 'SUPPORTREQUEST', 'Escalate', 1685090241035, 1685090241833),
    (3402670, 'SUPPORTREQUEST', 'Close', 1685090241833, NULL);

The objective is to calculate the interval between "time" and "next_time" both in business time (excluding Saturdays and Sundays) and in 24×7 time. I am able to obtain the correct interval for 24×7 time, but the business interval comes out wrong.
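To make the expected figures concrete, here is a quick sketch (in Python, outside the database) that reproduces the 24×7 interval for the FirstLineSupport row from its epoch-millisecond bounds. Note that `to_timestamp(x / 1000)` on a BIGINT performs integer division, so the sub-second part is truncated, which is mirrored with `//` below:

```python
from datetime import datetime, timedelta, timezone

time_ms, next_time_ms = 1684836638257, 1685090241035  # FirstLineSupport row

# Mirror the SQL: to_timestamp(x / 1000) with BIGINT truncates the milliseconds
start = datetime.fromtimestamp(time_ms // 1000, tz=timezone.utc)
end = datetime.fromtimestamp(next_time_ms // 1000, tz=timezone.utc)

interval_24x7 = end - start
print(interval_24x7)  # 2 days, 22:26:43
```

This matches the `interval_24x7` value in the result below; only `bus_interval` is off by a day.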

Here is my SQL SELECT query:

SELECT
    t.entity_id,
    t.type_de_demande,
    t.phase,
    bd.bus_interval,
    bd.interval_24x7,
    ts.start_ts,
    ts.end_ts,
    t.time,
    t.next_time
FROM requests AS t
INNER JOIN LATERAL (
    SELECT 
        to_timestamp(t.time / 1000)::timestamp      AS start_ts,
        to_timestamp(t.next_time / 1000)::timestamp AS end_ts,
        (SELECT COUNT(*)::int 
         FROM generate_series(to_timestamp(t.time / 1000)::DATE, to_timestamp(t.next_time / 1000)::DATE, '1 day')
         WHERE EXTRACT(ISODOW FROM generate_series) < 6
        ) AS bus_days
) AS ts ON true
INNER JOIN LATERAL (
    SELECT 
        ts.end_ts - ts.start_ts as interval_24x7,
        make_interval(0,0,0,ts.bus_days - 1) + ((ts.end_ts - ts.start_ts)::time)::interval  as bus_interval
) AS bd ON true;

Result:

| entity_id | type_de_demande | phase | bus_interval | interval_24x7 | start_ts | end_ts | time | next_time |
|---|---|---|---|---|---|---|---|---|
| 3402670 | SUPPORTREQUEST | Log | {"seconds":2} | {"seconds":2} | 2023-05-23T10:10:35.000Z | 2023-05-23T10:10:37.000Z | 1684836635813 | 1684836637569 |
| 3402670 | SUPPORTREQUEST | Classify | {"seconds":1} | {"seconds":1} | 2023-05-23T10:10:37.000Z | 2023-05-23T10:10:38.000Z | 1684836637569 | 1684836638257 |
| 3402670 | SUPPORTREQUEST | FirstLineSupport | {"days":3,"hours":22,"minutes":26,"seconds":43} | {"days":2,"hours":22,"minutes":26,"seconds":43} | 2023-05-23T10:10:38.000Z | 2023-05-26T08:37:21.000Z | 1684836638257 | 1685090241035 |
| 3402670 | SUPPORTREQUEST | Escalate | {} | {} | 2023-05-26T08:37:21.000Z | 2023-05-26T08:37:21.000Z | 1685090241035 | 1685090241833 |
| 3402670 | SUPPORTREQUEST | Close | null | null | 2023-05-26T08:37:21.000Z | null | 1685090241833 | null |

Why does the bus_interval column have an additional day {"days":3,"hours":22,"minutes":26,"seconds":43} instead of the expected value of {"days":2,"hours":22,"minutes":26,"seconds":43}? Is there another method to calculate the interval between two Unix timestamps considering both business days and 24×7 time?

Fiddle: https://www.db-fiddle.com/f/axnWR8TUupXywi1ddHDHtX/0

>Solution:

I think your bus_days subquery should look like the following. I have removed the ::DATE cast and count a generate_series timestamp only when it is less than or equal to t.next_time. With the ::DATE cast, the series runs over calendar dates and includes both endpoints (May 23 through May 26 is four weekday dates), so bus_days comes out one too high. Stepping from the start timestamp in full 24-hour increments stops before the final partial day.

(SELECT COUNT(*)::int
 FROM generate_series(to_timestamp(t.time / 1000), to_timestamp(t.next_time / 1000), '1 day')
 WHERE generate_series <= to_timestamp(t.next_time / 1000)
   AND EXTRACT(ISODOW FROM generate_series) < 6
) AS bus_days
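The off-by-one can be checked outside the database. This Python sketch replicates both counting strategies for the FirstLineSupport row: iterating over calendar dates (the original ::DATE version, inclusive of both endpoints) versus stepping the start timestamp in 24-hour increments (the corrected version):

```python
from datetime import datetime, timedelta, timezone

start = datetime(2023, 5, 23, 10, 10, 38, tzinfo=timezone.utc)  # FirstLineSupport start_ts
end = datetime(2023, 5, 26, 8, 37, 21, tzinfo=timezone.utc)     # FirstLineSupport end_ts

# Original query: generate_series over DATEs includes both endpoint dates,
# so May 23..26 yields four weekday dates -> bus_days = 4 (one too many).
d, date_count = start.date(), 0
while d <= end.date():
    if d.isoweekday() < 6:  # Mon-Fri, like EXTRACT(ISODOW ...) < 6
        date_count += 1
    d += timedelta(days=1)

# Fix: step in full 24h increments from the start *timestamp*;
# May 26 10:10:38 would overshoot end (08:37:21), so it is not generated.
ts, ts_count = start, 0
while ts <= end:
    if ts.isoweekday() < 6:
        ts_count += 1
    ts += timedelta(days=1)

print(date_count, ts_count)  # 4 3
```

With bus_days = 3, `bus_days - 1` gives the expected 2 days in bus_interval.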