How does next.js 13 / react server components solve the Network latency issue?

Let’s say I have a mostly static, content-focused site built with Next.js.

Currently I would SSG or SSR all the pages, then put a CDN in front and configure headers so that my static assets get cached on the CDN.
Network latency is now very low, as users globally will always get a quick response from the CDN. But I’m still shipping a lot of "useless" code, like the libraries I use to fetch and format the content I want to display.

React server components, on the other hand, make it possible to reduce my bundle size by streaming the rendered output of my components straight to the client, eliminating all of that useless code.
This is great, but it requires a server to handle each request, meaning I can’t just throw it on a CDN, so network latency for users on the other side of the world will be really high (or it becomes extremely complicated to set up, with multiple servers across the world).

In the Next.js 13 release talk at 1:43 they address that limitation specifically, but I don’t see them mention any solution to it.

Am I missing something from their talk or is there a hole somewhere in my understanding?

> Solution:

Next.js 13 server components do not have to run on the server for every incoming request (unless you use request-specific functions like cookies() or headers()). Static output produced by server components at build time can still be served from a CDN, and you can re-validate it at pre-defined intervals, or even on demand.
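As a minimal sketch of the interval-based approach: a server component page can be statically generated at build time and re-validated on a timer via the `revalidate` route segment config. The URL, data shape, and field names below are illustrative assumptions, not part of the original question.

```typescript
// app/posts/page.tsx — hypothetical content page; the CMS URL and data
// shape are illustrative assumptions.
// This is a server component: any fetching/formatting libraries imported
// here stay on the server and never reach the client bundle.

// Statically generate this page and re-validate it at most once per hour.
// Between revalidations, the cached result can be served from a CDN edge.
export const revalidate = 3600;

export default async function PostsPage() {
  // fetch() results are cached at build time; `next.revalidate` opts this
  // request into time-based re-validation as well.
  const res = await fetch("https://cms.example.com/posts", {
    next: { revalidate: 3600 },
  });
  const posts: { id: string; title: string }[] = await res.json();

  return (
    <ul>
      {posts.map((post) => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  );
}
```

Because this page uses no request-specific functions, Next.js treats it as static, which is what makes CDN caching possible.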

The following documentation describes caching and re-validation strategies in great detail:

And at the same time you get the benefit of a reduced bundle size, because server components’ code and dependencies are never shipped to the client.
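For the on-demand case mentioned above, one common pattern (available with the app router since Next.js 13.4, via `revalidatePath` from `next/cache`) is a webhook route that a CMS calls whenever content changes. The endpoint path, header name, and secret handling here are illustrative assumptions:

```typescript
// app/api/revalidate/route.ts — hypothetical webhook endpoint; the path,
// secret header, and CMS integration are illustrative assumptions.
import { revalidatePath } from "next/cache";
import { NextResponse } from "next/server";

// A CMS calls this endpoint when content changes, purging the cached
// page so the next request regenerates fresh static output.
export async function POST(request: Request) {
  const secret = request.headers.get("x-revalidate-secret");
  if (secret !== process.env.REVALIDATE_SECRET) {
    return NextResponse.json({ ok: false }, { status: 401 });
  }

  // Invalidate the statically generated /posts page on demand.
  revalidatePath("/posts");
  return NextResponse.json({ ok: true, revalidated: true });
}
```

This way the content stays cacheable at the edge, but updates propagate as soon as the source changes rather than waiting for the next revalidation interval.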
