How to add column to duplicated query?

I have a table called logs that contains these columns:

id - message - source - created_at - version - ...(another columns here)

I want to count the messages that are duplicated in logs, and also return three columns in the query:

source - created_at - version


Note:

  • source has a fixed value
  • created_at is not a fixed value (I want the latest date, taken from the most recent record)
  • version is not a fixed value (I want the version from the most recent record)

My query, which returns incorrect counts:

SELECT
    MESSAGE,
    SOURCE,
--  VERSION,
    COUNT(id)
FROM
    logs
WHERE
    LEVEL = 'debug'
GROUP BY
    MESSAGE,
    SOURCE
--  VERSION
HAVING
    COUNT(id) > 1
ORDER BY
    COUNT(id) DESC;

The result is that I get the wrong counts, because the value of version is not the same in all rows: including VERSION in the GROUP BY splits what should be one group of duplicates into several smaller groups.

So how can I get the number of duplicate messages together with source, created_at, and version?

>Solution:

To get the last version, aggregate the versions into an array ordered by the timestamp, then pick the first element. Keep VERSION out of the GROUP BY so it no longer splits the groups:

SELECT
    MESSAGE,
    SOURCE,
    MAX(created_at),
    (ARRAY_AGG(VERSION ORDER BY created_at DESC))[1] AS last_version, -- newest row's version
    COUNT(id)
FROM
    logs
WHERE
    LEVEL = 'debug'
GROUP BY
    MESSAGE,
    SOURCE
HAVING
    COUNT(id) > 1
ORDER BY
    COUNT(id) DESC;
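
As a minimal sketch of how this behaves (PostgreSQL; the table definition and sample values here are invented for illustration, only the columns named in the question are included):

```sql
-- Hypothetical minimal version of the logs table
CREATE TABLE logs (
    id         serial PRIMARY KEY,
    message    text,
    source     text,
    level      text,
    version    text,
    created_at timestamptz
);

-- Two rows with the same message but different versions
INSERT INTO logs (message, source, level, version, created_at) VALUES
    ('disk full', 'app1', 'debug', '1.0', '2023-01-01'),
    ('disk full', 'app1', 'debug', '1.1', '2023-01-02');
```

Running the solution query against this data should yield a single group for 'disk full' with a count of 2, MAX(created_at) of 2023-01-02, and last_version '1.1', because ARRAY_AGG orders the versions newest-first and [1] takes the head of that array. The earlier query, by contrast, would produce two groups of one row each (one per version), which both fail the HAVING COUNT(id) > 1 filter.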