Laravel Eloquent query slow with 100,000 records

I have this simple Eloquent query. It works fine with a few records, but once the table grows to around 100,000 records it becomes very slow.

I read that I should use chunk instead of get. How can I implement it for this query?

$collection = Contact::with('shop');
$collection = $collection->orderBy('created_at', 'DESC');
$collection = $collection->get();
$json = $collection->map(function ($contact) {
    return [ 
        'id' => $contact->id,
        'name' => $contact->name,   
        ...about 50 columns more.
        'shop' => [
                'id' => optional($contact->shop)->id,
                'name' => optional($contact->shop)->name
        ],  
        ...about 6 relations more.
    ];
});
$json = $json->paginate(50);
return response()->json(['contacts' => $json], 200);


Solution:

You are fetching every record in the table, however many there are, mapping all of them, and only then paginating down to 50. That is a huge performance problem: the database load and the mapping work grow with the whole table even though you only return one page.

Instead, paginate at the query level so the database returns only 50 rows:

return response()->json(['contacts' => Contact::with('shop')->orderBy('created_at', 'DESC')->paginate(50)], 200);
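If you still need the custom array shape from the question, the paginator's `through()` method applies the transform to the current page only, so the mapping runs on 50 rows instead of the whole table. A sketch, reusing the `Contact` model and mapping from the question:

```php
// Sketch: paginate first, then transform only the rows on the current page.
// Contact, shop, and the column names come from the question above.
$contacts = Contact::with('shop')
    ->orderBy('created_at', 'DESC')
    ->paginate(50)
    ->through(function ($contact) {
        return [
            'id'   => $contact->id,
            'name' => $contact->name,
            // ...remaining columns and relations, as in the question
            'shop' => [
                'id'   => optional($contact->shop)->id,
                'name' => optional($contact->shop)->name,
            ],
        ];
    });

return response()->json(['contacts' => $contacts], 200);
```

You can also shrink the payload by limiting the eager-loaded relation's columns with `with('shop:id,name')`; the `id` column must stay selected so Eloquent can match the relation.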

If you only need id and name for contacts:

return response()->json(['contacts' => Contact::select('id', 'name', 'created_at')->orderBy('created_at', 'DESC')->paginate(50)], 200);
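As for the original chunk question: `chunk()` helps when you genuinely have to process every row, such as an export or a background job, not when serving one page of an API response. A minimal sketch, assuming the same `Contact` model:

```php
// Sketch: process all contacts in batches of 500 without loading
// the whole table into memory at once. Useful for exports or
// background jobs, not for a paginated API response.
Contact::with('shop')
    ->orderBy('id') // chunk() needs a stable ordering; chunkById() enforces one
    ->chunk(500, function ($contacts) {
        foreach ($contacts as $contact) {
            // handle each row here
        }
    });
```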