I’m having an issue: I have an API that times out when I fire 1000 requests at it with Promise.all.
This is the current code that fails:
const retry = async (
requests: API.CarFailedRequest[]
) => {
setIsLoading(true);
const res = await Promise.all(
requests.map(async request => {
try {
await service.retryFailedRequest(request);
return { status: true, request };
} catch (e) {
return { status: false, request };
}
})
);
setIsLoading(false);
};
The code above uses Promise.all with no concurrency limit, so it seems logical that it fails with 1000 requests and starts returning 403s at random.
So I have tried multiple solutions; one of them was using Bluebird's concurrency option:
const retry = async (
requests: API.CarFailedRequest[]
) => {
setIsLoading(true);
const promises = requests.map(async request => {
try {
await service.retryFailedRequest(request);
return true;
} catch (e) {
return false;
}
});
await BlueBirdPromise.map(
promises,
async promise => {
try {
await promise;
} catch (err) {
console.log(err);
}
},
{ concurrency: 10 }
);
setIsLoading(false);
};
return {
failedRequestData: { originTypes, errorTypes, statuses },
retryFailedRequests
};
};
But this doesn't seem to change anything. I still see the browser's network tab spammed with a massive number of pending requests.
I have also tried this function to natively delay chunks of network requests:
const processPromisesWithDelay = async (promises: any[], delay: number, split: number) => {
const chunks = [];
// Split the array of promises into chunks
for (let i = 0; i < promises.length; i += split) {
chunks.push(promises.slice(i, i + split));
}
// Process each chunk of promises with delay
for (const chunk of chunks) {
await Promise.all(chunk.map((promise: () => any) => promise()));
// Delay for the specified amount of time
await new Promise((resolve) => setTimeout(resolve, delay * 1000));
}
};
const retry = async (
requests: API.CarFailedRequest[]
) => {
setIsLoading(true);
const promises = requests.map(async request => {
await service.retryFailedRequest(request);
});
await processPromisesWithDelay(promises, 5, 5);
setIsLoading(false);
};
return {
failedRequestData: { originTypes, errorTypes, statuses },
retryFailedRequests
};
};
But I still see a massive number of requests in the network tab (1000) that resolve one after another, with no delay between them, and the chunks don't seem to run in parallel at all.
>Solution:
That's because you are awaiting the promises in chunks, but you are starting them all at the same time: `requests.map(async request => ...)` invokes every callback immediately, so all 1000 requests are already in flight before the chunking or Bluebird.map ever sees them.
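The eager start is easy to demonstrate with a small sketch (`fetchOne` is a hypothetical stand-in for `service.retryFailedRequest`, not from the original code):

```typescript
// Counts how many calls have started before anything is awaited.
let started = 0;
const fetchOne = async (id: number): Promise<number> => {
  started++; // runs synchronously, the moment the function is called
  await new Promise(resolve => setTimeout(resolve, 10));
  return id;
};

const ids = [1, 2, 3, 4, 5];
// .map invokes fetchOne for every element right here, so all five
// "requests" are already in flight before Promise.all is even reached.
const promises = ids.map(id => fetchOne(id));
console.log(started); // 5
await Promise.all(promises);
```

The fix is therefore to hand the concurrency-limiting helper the raw inputs and let it invoke the request function itself: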
const retry = async (
requests: API.CarFailedRequest[]
) => {
setIsLoading(true);
await BlueBirdPromise.map(
requests,
async request => {
try {
await service.retryFailedRequest(request);
} catch (err) {
console.log(err);
}
},
{ concurrency: 10 }
);
setIsLoading(false);
};
return {
failedRequestData: { originTypes, errorTypes, statuses },
retryFailedRequests
};
};
This way, Bluebird's map is the one actually starting each request, so the `concurrency: 10` limit is respected.
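If you'd rather not depend on Bluebird, the same limit can be sketched natively with a small worker pool. This is only an illustration; the name `mapWithConcurrency` is mine, not from the original post:

```typescript
// Runs fn over items with at most `limit` calls in flight at once.
const mapWithConcurrency = async <T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>
): Promise<R[]> => {
  const results: R[] = new Array(items.length);
  let next = 0;
  // Start `limit` workers; each one pulls the next index until the
  // list is drained. next++ is safe because JS runs the synchronous
  // part of each worker step without interleaving.
  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    async () => {
      while (next < items.length) {
        const i = next++;
        results[i] = await fn(items[i]);
      }
    }
  );
  await Promise.all(workers);
  return results;
};
```

Calling it for the retry case would look like `await mapWithConcurrency(requests, 10, request => service.retryFailedRequest(request));`, with any per-request try/catch inside the callback as in the Bluebird version.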