MySQL: SELECT latest date per group per group

I have a brainteaser in this one.
I have already searched around, but everything I find only works for finding the latest row for one group, not for a group within a group.

I have a table with the following columns:

ID | vehicle id | type of cleaning | date


Every ID is unique. There are around 25 different vehicle ids, each of which can appear more than once. There are 3 types of cleaning, and there is of course the date.
The idea is that every time a vehicle gets a cleaning, this is registered in a new record. For example: vehicle 110 gets an interior cleaning on the 1st of October.
If vehicle 110 then gets another interior cleaning on the 25th of October, a new record is added, so we have a database that records all cleanings.

For a quick view of which vehicle has been cleaned with every type of cleaning, I need one query that gives me an array containing the vehicle number, the type, and the LATEST date.

I have tried different options, and the closest I came is:

SELECT * 
FROM cleanings
GROUP BY vehicle_id, type
ORDER BY vehicle_id ASC, date DESC;

This gives the first cleaning date per type per vehicle, which is usually the oldest date.

SAMPLE DATA:

INSERT INTO `cleanings` (`id`, `vehicle_id`, `employees_id`, `type`, `date`, `created`, `modified`) VALUES
(1, 15103, 1, 'Buitenkant', '2023-10-06', '2023-10-08 06:44:22', '2023-10-09 18:50:08'),
(2, 15103, 2, 'Buitenkant', '2023-10-08', '2023-10-08 06:44:34', '2023-10-09 18:54:25'),
(3, 15102, 2, 'Buitenkant', '2023-10-03', '2023-10-09 14:15:34', '2023-10-09 18:43:24'),
(4, 15101, 1, 'Buitenkant', '1989-11-10', '2023-10-09 14:22:53', '2023-10-09 17:35:17'),
(5, 15110, 2, 'Buitenkant', '2022-10-19', '2023-10-09 18:49:59', '2023-10-13 14:12:24'),
(6, 15110, 1, 'Buitenkant', '2022-10-18', '2023-10-08 04:44:22', '2023-10-13 14:12:17'),
(7, 15109, 2, 'Binnenkant', '2023-10-08', '2023-10-08 04:44:34', '2023-10-09 18:52:16'),
(8, 15116, 2, 'Binnenkant', '2023-10-03', '2023-10-09 12:15:34', '2023-10-09 18:52:16'),
(9, 15118, 1, 'Binnenkant', '1989-11-10', '2023-10-09 12:22:53', '2023-10-09 18:52:16'),
(10, 15119, 2, 'Binnenkant', '2023-10-19', '2023-10-09 16:49:59', '2023-10-09 18:52:16'),
(11, 15101, 1, 'Saniswiss', '2023-10-06', '2023-10-08 04:44:22', '2023-10-13 08:16:48'),
(12, 15103, 2, 'Saniswiss', '2023-10-08', '2023-10-08 04:44:34', '2023-10-09 18:52:59'),
(13, 15116, 2, 'Saniswiss', '2023-10-03', '2023-10-09 12:15:34', '2023-10-09 18:52:59'),
(14, 15111, 1, 'Saniswiss', '1989-11-10', '2023-10-09 12:22:53', '2023-10-09 18:52:59'),
(15, 15112, 2, 'Saniswiss', '2023-10-19', '2023-10-09 16:49:59', '2023-10-09 18:52:59'),
(16, 15105, 1, 'Buitenkant', '2023-10-09', '2023-10-09 18:58:33', '2023-10-09 18:58:33'),
(17, 15105, 2, 'Binnenkant', '2023-10-06', '2023-10-09 20:38:00', '2023-10-16 22:51:17'),
(18, 15112, 2, 'Binnenkant', '2023-12-12', '2023-10-16 22:51:32', '2023-10-16 22:51:32'),
(19, 15590, 15, 'Saniswiss', '2024-01-01', '2023-10-17 12:35:48', '2023-10-30 21:02:02'),
(21, 15101, 1, 'Buitenkant', '2023-10-20', '2023-10-17 14:46:29', '2023-10-17 14:46:29'),
(29, 15107, 15, 'Buitenkant', '2023-11-24', '2023-11-01 11:34:28', '2023-11-01 11:34:28'),
(30, 15215, 14, 'Saniswiss', '2023-11-20', '2023-11-01 11:35:32', '2023-11-01 11:35:32'),
(31, 15213, 1, 'Buitenkant', '2023-11-01', '2023-11-01 11:35:41', '2023-11-01 11:35:41'),
(32, 15118, 1, 'Binnenkant', '2023-11-02', '2023-11-02 10:02:51', '2023-11-02 10:02:51'),
(33, 15118, 1, 'Binnenkant', '2023-11-02', '2023-11-02 10:03:21', '2023-11-02 10:03:21'),
(34, 15118, 1, 'Buitenkant', '2023-11-02', '2023-11-02 10:05:08', '2023-11-02 10:05:08'),
(35, 15101, 1, 'Buitenkant', '2023-11-02', '2023-11-02 10:55:38', '2023-11-02 10:55:38');

SAMPLE RESULT FROM QUERY:

array (size=21)
  0 => 
    array (size=7)
      'id' => int 4
      'vehicle_id' => int 15101
      'employees_id' => int 1
      'type' => string 'Buitenkant' (length=10)
      'date' => string '1989-11-10' (length=10)
      'created' => string '2023-10-09 16:22:53' (length=19)
      'modified' => string '2023-10-09 19:35:17' (length=19)
  1 => 
    array (size=7)
      'id' => int 11
      'vehicle_id' => int 15101
      'employees_id' => int 1
      'type' => string 'Saniswiss' (length=9)
      'date' => string '2023-10-06' (length=10)
      'created' => string '2023-10-08 06:44:22' (length=19)
      'modified' => string '2023-10-13 10:16:48' (length=19)
  2 => 
    array (size=7)
      'id' => int 3
      'vehicle_id' => int 15102
      'employees_id' => int 2
      'type' => string 'Buitenkant' (length=10)
      'date' => string '2023-10-03' (length=10)
      'created' => string '2023-10-09 16:15:34' (length=19)
      'modified' => string '2023-10-09 20:43:24' (length=19)

SAMPLE END RESULT:
(screenshot of the desired result table)

Solution:

Taking the LEFT JOIN approach from https://dev.mysql.com/doc/refman/8.0/en/example-maximum-column-group-row.html, something along these lines should give you the result you're looking for. The join conditions mean that, for each row, we look for another row with the same vehicle and type but a later date. The WHERE clause then enforces that no such matching row exists, which can only be true if the row is already the latest for that vehicle/type combination.

SELECT c1.*
FROM cleanings c1
LEFT JOIN cleanings c2 ON
  c1.vehicle_id = c2.vehicle_id
  AND c1.type = c2.type
  AND c2.date > c1.date
WHERE c2.id IS NULL
ORDER BY c1.vehicle_id ASC, c1.date DESC;
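As a sanity check, the anti-join pattern above can be exercised against an in-memory SQLite database from Python (a sketch with a hypothetical subset of the sample data; the query syntax happens to be identical in MySQL and SQLite). The script also shows a ROW_NUMBER() alternative, which works on MySQL 8.0+:

```python
import sqlite3

# Hypothetical in-memory copy of the cleanings table, loaded with a
# few rows from the sample data above (extra columns omitted).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE cleanings (
    id INTEGER PRIMARY KEY,
    vehicle_id INTEGER,
    type TEXT,
    date TEXT
);
INSERT INTO cleanings (id, vehicle_id, type, date) VALUES
    (1, 15103, 'Buitenkant', '2023-10-06'),
    (2, 15103, 'Buitenkant', '2023-10-08'),
    (5, 15110, 'Buitenkant', '2022-10-19'),
    (6, 15110, 'Buitenkant', '2022-10-18'),
    (12, 15103, 'Saniswiss', '2023-10-08');
""")

# The anti-join from the answer: keep a row only when no later
# cleaning exists for the same vehicle and type.
anti_join = conn.execute("""
    SELECT c1.id, c1.vehicle_id, c1.type, c1.date
    FROM cleanings c1
    LEFT JOIN cleanings c2
      ON c1.vehicle_id = c2.vehicle_id
     AND c1.type = c2.type
     AND c2.date > c1.date
    WHERE c2.id IS NULL
    ORDER BY c1.vehicle_id ASC, c1.date DESC
""").fetchall()

# Alternative for MySQL 8.0+ (and SQLite 3.25+): number the rows in
# each (vehicle_id, type) group by date descending, keep the first.
windowed = conn.execute("""
    SELECT id, vehicle_id, type, date
    FROM (
        SELECT id, vehicle_id, type, date,
               ROW_NUMBER() OVER (
                   PARTITION BY vehicle_id, type
                   ORDER BY date DESC
               ) AS rn
        FROM cleanings
    ) ranked
    WHERE rn = 1
""").fetchall()

print(anti_join)
print(windowed)
```

One difference worth knowing: if a vehicle has two records of the same type on the same (latest) date, as with ids 32 and 33 in the sample data, the anti-join returns both tied rows, while ROW_NUMBER() keeps exactly one per group.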
