Caching in next.js: Gift or Curse?
In version 13, the next.js team introduced a new approach to application design: the so-called App Router. In version 14, it became stable and the primary option for new applications.
The App Router significantly expands the functionality of next.js: partial prerendering, templates, parallel and intercepting routes, server components, and much more. Yet despite all these improvements, not everyone has decided to switch to it, and there are reasons for that.
I briefly covered the advantages and problems of the new router in the article “Next.js App Router. Experience of use. The path to the future or the wrong turn”. This time the conversation will not be about new abstractions or their features. The key and most controversial change is, in fact, caching. This article explains what the most popular frontend framework, Next.js, caches, why, and how.
What does next.js cache?
The next.js website has excellent documentation on the caching process. First, a brief overview of its main points.
Any request triggered through fetch in next.js will be memoized and cached. The same happens with pages and with the cache function. How this works under the hood is discussed in the following sections. The general page-building process looks like this:
That is: the user navigates to a page, a request for the route is sent to the server, the server starts rendering the route and sends the necessary data requests along the way. All of this is then executed and cached.
In addition to the cache shown in the scheme, there is also memoization. It handles recurring requests: instead of being sent several times, duplicate requests subscribe to the first one.
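For example, if two server components request the same URL during one render pass, only one network call is actually made. A minimal sketch; the API URL and the components are made up:

// a hypothetical page; both components call the same fetch, but only one request is sent
async function getUser() {
  // identical fetch calls are memoized within a single render pass,
  // so only one request actually reaches the API
  const res = await fetch('https://api.example.com/user')
  return res.json()
}

async function Header() {
  const user = await getUser() // deduplicated with the call made in Page
  return <header>{user.name}</header>
}

export default async function Page() {
  const user = await getUser()
  return (
    <main>
      <Header />
      <p>Hello, {user.name}</p>
    </main>
  )
}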
Caching on the server is done using the so-called Data Cache. You can remove data from it by calling the revalidatePath and revalidateTag functions. The first one updates the cache for a page, the second one for the tag specified in requests.
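For example, revalidatePath can be called from a route handler; a minimal sketch, assuming the application has a /posts page:

// app/api/revalidate/route.js (a sketch; the /posts path is an assumption)
import { revalidatePath } from 'next/cache'
import { NextResponse } from 'next/server'

export async function POST() {
  // clears the Data Cache entries used by /posts; the page is re-rendered on the next request
  revalidatePath('/posts')
  return NextResponse.json({ revalidated: true })
}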
Data is also cached on the client side — inside the client router.
Not mentioned in that article: next.js also caches rewrites and redirects. That is, if the user was once redirected on the server from the / page to /login, they will keep being redirected there. This is stored in the client router until the client cache is cleared.
You can clear the cache on the client using router.refresh or by calling revalidatePath and revalidateTag in server actions:
'use server'

import { revalidateTag } from 'next/cache'
import { addPost } from './data' // addPost is a hypothetical mutation from the application code

export default async function submit() {
  await addPost()
  // drop every Data Cache entry tagged 'posts' and refresh the client router cache
  revalidateTag('posts')
}
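For the revalidateTag('posts') call above to have something to invalidate, the corresponding requests need to be tagged when they are made. A minimal sketch; the API URL is made up:

// a data-access helper used in server components (the URL is hypothetical)
export async function getPosts() {
  const res = await fetch('https://api.example.com/posts', {
    // entries with this tag are removed from the Data Cache by revalidateTag('posts')
    next: { tags: ['posts'] },
  })
  return res.json()
}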
Why is caching needed in next.js?
The fetch in next.js is a wrapper over the native node.js fetch, configured to hook into the Data Cache so that each request can be processed as described in the schemes above. This replacement of a native API is what the community criticizes the next.js team for most often.
Later, next.js added the ability to disable caching of a request with the cache: "no-store" option. But even with this option, the request continues to be memoized. As a result, one of the key APIs for development has ceased to be fully controlled by the developer.
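The per-request controls look like this; a minimal sketch inside an async server component, with hypothetical URLs:

// a hypothetical server component demonstrating per-request cache options
export default async function Dashboard() {
  // opted out of the Data Cache with cache: 'no-store', but still memoized within one render pass
  const live = await fetch('https://api.example.com/live-stats', { cache: 'no-store' }).then((r) => r.json())

  // kept in the Data Cache, with the entry revalidated at most once every 60 seconds
  const posts = await fetch('https://api.example.com/posts', { next: { revalidate: 60 } }).then((r) => r.json())

  return <pre>{JSON.stringify({ live, posts }, null, 2)}</pre>
}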
Nevertheless, there were reasons for this step, and it is unlikely that the initial reason was optimization. For optimization, it would have been enough to create a new function for requests: a separate API, of which there are already hundreds in next.js.
I followed a similar path when developing the next-translation package (as I wrote in a previous article). An interesting problem came up: too many requests were going to the server (requests not triggered through fetch). Digging into the reasons and reading the next.js source code, it became clear that the application is now built in several independent threads. Strangely, this was never mentioned in the latest releases. Each thread lives as an independent process, so it was impossible to implement proper caching for the entire application inside the package.
The next.js team faced the same problem: every integration, every package, every user now sent several times more requests, and previously configured caching systems stopped working correctly. The solution was to rework fetch and hide this behavior under the hood.
How does Data Cache work?
Loaded or generated data is saved in the so-called cacheHandler. Out of the box, next.js has two cacheHandlers: FileSystem and Fetch. The cacheHandler is used both for caching requests and for caching pages.
FileSystem is used by default; it saves data to the file system and additionally memoizes it in memory. FileSystem copes well with its task, but it has one drawback: it works as part of the application. It follows that if the application runs in several replicas, each of them will have its own independent cacheHandler.
This problem is felt especially strongly when the application works in ISR mode: you need to reach every replica and revalidate the cache in each of them, and then make sure they all load the same data. Also, if two replicas work with the same folder, write conflicts can arise in the file system.
Probably for this reason, the framework code also contains the Fetch variant, which saves the cache to a remote server. However, this cacheHandler is only used when the application is deployed on Vercel, since it stores data on Vercel's servers.
As a result, the out-of-the-box options do not cover all needs: FileSystem is not suitable when there are several replicas, and Fetch is not suitable when the application is not deployed on Vercel. An important feature is that next.js lets you write your own cacheHandler. To do this, pass the application configuration the path to a file with a class (CacheHandler) that implements the get, set and revalidateTag methods:
// cache-handler.js
module.exports = class CacheHandler {
  constructor(options) {
    this.options = options
  }

  // return the cached entry for the key, or null/undefined if there is none
  async get(key) {
    // ...
  }

  // save the data; ctx carries metadata such as the tags attached to the request
  async set(key, data, ctx) {
    // ...
  }

  // remove every entry that was saved with this tag
  async revalidateTag(tag) {
    // ...
  }
}
And connect it in the application configuration:
module.exports = {
  cacheHandler: require.resolve('./cache-handler.js'),
  cacheMaxMemorySize: 0, // disable default in-memory caching
}
One such cacheHandler is cache-handler-redis, which the next.js team mentioned in its latest release.
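A Redis-backed handler boils down to the same three methods. Below is an illustrative sketch, not the cache-handler-redis code itself, assuming ioredis and a REDIS_URL environment variable:

// redis-cache-handler.js (a sketch; not the actual cache-handler-redis package)
const Redis = require('ioredis')

const redis = new Redis(process.env.REDIS_URL)

module.exports = class RedisCacheHandler {
  async get(key) {
    const value = await redis.get(key)
    return value ? JSON.parse(value) : null
  }

  async set(key, data, ctx) {
    await redis.set(key, JSON.stringify({ value: data, lastModified: Date.now(), tags: ctx.tags }))
    // index the key under each of its tags so revalidateTag can find it later
    for (const tag of ctx.tags || []) {
      await redis.sadd(`tag:${tag}`, key)
    }
  }

  async revalidateTag(tag) {
    // drop every entry that was stored with this tag
    const keys = await redis.smembers(`tag:${tag}`)
    if (keys.length) await redis.del(...keys)
    await redis.del(`tag:${tag}`)
  }
}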
Key points
Next.js caches a large part of the processes.
Caching occurs at several levels: transitions and pages are cached in the client router, requests are memoized, and requests and pages are cached on the server.
Applications are quite often launched in several replicas. Replicas need a common cache, which becomes especially acute when the application works in ISR mode.
The application itself is built in several threads that do not have access to each other.
The cacheHandler is responsible for caching. Next.js has two out-of-the-box options — working with the file system and working with a remote server, but the latter is only available within Vercel.
You can write your own cacheHandler.
Caching refinement
Let’s go back to the next-translation package. To solve the problem of unnecessary requests, I found an interesting way out: run an additional server and route requests through it. As a result, all requests come from one place, which means caching can be configured there. The principle is similar to the FetchCacheHandler and Vercel's approach in general, where data is cached on a Vercel server during the build, and since that server is nearby, it works quickly.
However, caching is too much responsibility for a translation library. So the next task was to overhaul the caching logic: combine the next.js API with libraries and solve the common problems. As a result, another library appeared: next-impl-cache-adapter.
Cache management
As already mentioned, for a cache shared between instances (replicas, copies), the cache must live separately from each instance of the application. next-impl-cache-adapter solves this by introducing a separate service.
This service is a server that runs the desired cacheHandler. Each application instance processes its cache requests through this server. At the same time, the server does not need to be restarted with each build: outdated data is deleted automatically when a new version of the application is launched.
Server code:
// @ts-check
const createServer = require('next-impl-cache-adapter/src/create-server');
const CacheHandler = require('next-impl-cache-in-memory');

const server = createServer(new CacheHandler({}));

server.listen('4000', () => {
  console.log('Server is running at http://localhost:4000');
});
In this example, the server is given next-impl-cache-in-memory, a basic cacheHandler that stores data in memory.
A special adapter for working with the cache is configured in the application itself:
// cache-handler.js
// @ts-check
const AppAdapter = require('next-impl-cache-adapter');
const CacheHandler = require('next-impl-cache-in-memory');

class CustomCacheHandler extends AppAdapter {
  /** @param {any} options */
  constructor(options) {
    super({
      CacheHandler,
      buildId: process.env.BUILD_ID || 'base_id',
      cacheUrl: 'http://localhost:4000',
      cacheMode: 'remote',
      options,
    })
  }
}

module.exports = CustomCacheHandler;
The created adapter is connected in the next.js configuration:
// next.config.js
module.exports = {
  cacheHandler: require.resolve('./cache-handler.js'),
  cacheMaxMemorySize: 0, // disable default in-memory caching
}
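The adapter above reads BUILD_ID from the environment, presumably to distinguish cache entries belonging to different application versions. One possible way to keep next.js' own build id consistent with it (an assumption, not something the package documents):

// next.config.js (a sketch; assumes BUILD_ID is set in the environment, e.g. to a git commit hash)
module.exports = {
  cacheHandler: require.resolve('./cache-handler.js'),
  cacheMaxMemorySize: 0, // disable default in-memory caching
  generateBuildId: async () => process.env.BUILD_ID || 'base_id',
}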
The package supports three caching modes: local, remote and isomorphic.
local
The standard option. The cache is handled next to the application. Convenient in development mode and in environments where the application runs as a single instance.
remote
The entire cache is written and read on the remote server created above. Convenient for applications launched in several replicas.
isomorphic
The cache operates next to the application but also saves data to the remote server. Convenient during the build: it prepares the cache for the moment application instances start, without spending resources on loading the cache from the remote server.
Any cacheHandler supported by next.js can be used as the underlying CacheHandler, and vice versa: the cacheHandlers from the package can be connected directly in next.js.
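Since the mode is just an adapter option, it can be switched per environment. A minimal sketch, where CACHE_MODE and CACHE_URL are hypothetical environment variables:

// cache-handler.js (a sketch; CACHE_MODE and CACHE_URL are hypothetical environment variables)
const AppAdapter = require('next-impl-cache-adapter');
const CacheHandler = require('next-impl-cache-in-memory');

class CustomCacheHandler extends AppAdapter {
  /** @param {any} options */
  constructor(options) {
    super({
      CacheHandler,
      buildId: process.env.BUILD_ID || 'base_id',
      cacheUrl: process.env.CACHE_URL || 'http://localhost:4000',
      // 'local' for development, 'isomorphic' for the build, 'remote' for replicated production
      cacheMode: process.env.CACHE_MODE || 'local',
      options,
    })
  }
}

module.exports = CustomCacheHandler;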
Conclusions
The App Router brought a lot of very useful updates, but lost in convenience, predictability, and flexibility, first of all because of caching. After all, this is a task for which there is no universal solution and cannot be one. The ability to disable caching for a request and to write your own cacheHandler solves most of the problems. However, memoization and caching in the client router remain out of the developer's control.
The next.js team itself is in no hurry to develop solutions for specific tasks. For this reason, since the release of the stable App Router, I have continued to work on packages that solve next.js problems, writing about them in articles along the way.
Let’s make the web not only faster, but also clearer.
Links
next-impl-cache — solutions for setting up caching in next.js.
next-impl-getters — implementation of server getters and contexts in React Server Components without switching to SSR.
next-impl-config — adding support for configuration for each possible next.js environment (build, server, client, and edge).
next-classnames-minifier — compression of classes to characters (.a, .b, …, .a1).
next-translation — i18n library, developed with consideration of server components and maximum optimization.