Using client-side caches in loaders is a powerful feature that many overlook, and today I am going to break down why you want it. I will also show you what the three main React frameworks do out of the box, and lastly how to achieve similar outcomes with a few extra steps.
Why utilize client-side caches
So why do we want this in the first place? Going back and forth to the server is the most expensive part of your app, so if you can avoid making a roundtrip, you should. This dramatically affects how your web application feels to end users.
The problem with server-first loaders is that, by default, every navigation goes back to the server to get the latest data. In theory this is what you want, right? Fetch the latest data for the page on every client-side navigation. Well, not quite. What if we could show the previous data for the page immediately and refetch in the background?
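That pattern, serving the cached value instantly while refreshing it behind the scenes, is usually called stale-while-revalidate (SWR). Stripped of any framework, the core idea can be sketched like this (the `swr` helper and its names are illustrative, not a real API):

```typescript
// A minimal stale-while-revalidate cache, independent of any framework.
const cache = new Map<string, unknown>();

export async function swr<T>(key: string, fetcher: () => Promise<T>): Promise<T> {
  const cached = cache.get(key);
  if (cached !== undefined) {
    // Serve the stale value now; refresh the cache in the background.
    void fetcher().then(fresh => cache.set(key, fresh));
    return cached as T;
  }
  // Nothing cached yet: this is the only time the caller has to wait.
  const fresh = await fetcher();
  cache.set(key, fresh);
  return fresh;
}
```

The caller only ever blocks on the very first request for a key; every later call returns instantly with possibly slightly stale data.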
Let’s create the most basic implementation of a call to the server for each framework.
TanStack Start
import { createFileRoute, Link } from "@tanstack/react-router";
import { createServerFn } from "@tanstack/react-start";

const getCount = createServerFn({
  method: "GET",
}).handler(async () => {
  await new Promise(resolve => setTimeout(resolve, 3000));
  return 5;
});

export const Route = createFileRoute("/demo/start/server-funcs")({
  component: Home,
  loader: async () => await getCount(),
});

function Home() {
  const count = Route.useLoaderData();
  return (
    <div>
      {count}
      <p>
        <Link to="/">Home</Link>
      </p>
    </div>
  );
}
Remix
import { href, Link, useLoaderData } from "react-router";

export const loader = async () => {
  await new Promise(resolve => setTimeout(resolve, 3000));
  return 5;
};

export default function Home() {
  const count = useLoaderData<typeof loader>();
  return (
    <div>
      {count}
      <p>
        <Link to={href("/")}>Home</Link>
      </p>
    </div>
  );
}
Next
import Link from "next/link";

const getCount = async () => {
  await new Promise(resolve => setTimeout(resolve, 3000));
  return 5;
};

export default async function Home() {
  const count = await getCount();
  return (
    <div>
      {count}
      <p>
        <Link href="/">Home</Link>
      </p>
    </div>
  );
}
What are the defaults?
After you’ve coded out the above (or simply cloned down this repository), take note of what happens.
First Page Load
On first page load, all three frameworks appear to do the same thing. They wait for 3 seconds before showing the returned count, which in this case is 5.
Subsequent Navigations
Subsequent navigations are a different story. If you load the demo page, navigate away, and then come back, what happens? Next and Remix do the same thing here: they block for 3 seconds before the page loads. But TanStack Start shows up instantly. It is utilizing a client-side cache.
TanStack does still fetch from the server, you’ll notice, but it shows the old count while refetching in the background. This is what you want most of the time; our count doesn’t even change. We can go a step further and specify a staleTime of whatever we want, and it won’t refetch for that duration. A configurable client-side cache!
export const Route = createFileRoute("/demo/start/server-funcs")({
  component: Home,
  loader: async () => await getCount(),
  staleTime: 300000, // 5 minutes
});
Solving this in the other frameworks
Remix
Something similar can be done in Remix using a clientLoader, but it requires more manual plumbing.
import {
  href,
  Link,
  useLoaderData,
  type ClientLoaderFunction,
} from "react-router";

export const loader = async () => {
  await new Promise(resolve => setTimeout(resolve, 3000));
  return 5;
};

let isInitialRequest = true;
const cache = new Map<string, any>();

export const clientLoader: ClientLoaderFunction = async ({ serverLoader }) => {
  const cacheKey = "demo";

  // On hydration, always take the server data and seed the cache with it.
  if (isInitialRequest) {
    isInitialRequest = false;
    const serverData = await serverLoader();
    cache.set(cacheKey, serverData);
    return serverData;
  }

  // Map.get is synchronous; no await needed here.
  const cachedData = cache.get(cacheKey);
  if (cachedData) {
    return cachedData;
  }

  const serverData = await serverLoader();
  cache.set(cacheKey, serverData);
  return serverData;
};
clientLoader.hydrate = true;

export default function Home() {
  const count = useLoaderData<typeof loader>();
  return (
    <div>
      {count}
      <p>
        <Link to={href("/")}>Home</Link>
      </p>
    </div>
  );
}
This was actually easier than I thought, and it’s mostly taken straight out of the Remix docs. Still, as you can see, it’s more complicated.
This doesn’t quite replicate what we had with TanStack because it doesn’t have stale-while-revalidate (SWR) caching. You could get something similar with the remix-client-cache package.
import {
  href,
  Link,
  type ClientLoaderFunctionArgs,
} from "react-router";
import { cacheClientLoader, useCachedLoaderData } from "remix-client-cache";

export const loader = async () => {
  await new Promise(resolve => setTimeout(resolve, 3000));
  return 5;
};

// Caches the loader data on the client
export const clientLoader = (args: ClientLoaderFunctionArgs) =>
  cacheClientLoader(args);
clientLoader.hydrate = true;

export default function Home() {
  const count = useCachedLoaderData<typeof loader>();
  return (
    <div>
      {count}
      <p>
        <Link to={href("/")}>Home</Link>
      </p>
    </div>
  );
}
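One caveat with any hand-rolled client cache is staleness after mutations (the package handles this for you). With the manual Map approach, a clientAction can clear the entry before calling the server, so the next clientLoader call falls through. A sketch, with serverAction's signature simplified (React Router passes more arguments) and the cache exported only so it can be shared across the route module:

```typescript
// Sketch: invalidate the client cache whenever a mutation runs, so the next
// clientLoader call falls through to the server. The cache Map mirrors the
// one from the manual clientLoader example.
export const cache = new Map<string, unknown>();

export async function clientAction({
  serverAction,
}: {
  serverAction: () => Promise<unknown>;
}) {
  cache.delete("demo"); // the same key the clientLoader writes
  return serverAction();
}
```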
Next
Next also has the ability to cache RSC payloads on the client using the staleTimes option:
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  experimental: {
    // useCache enables the "use cache" directive used further below
    useCache: true,
    staleTimes: {
      dynamic: 30,
      static: 30,
    },
  },
};

export default nextConfig;
This will cache the RSC payload for 30 seconds when navigating to and from a page.
Note: this does not do SWR-style caching.
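Under the hood, a stale-time window like this is just time-to-live bookkeeping: record when an entry was written and treat it as a miss once it is older than the configured window. A generic sketch of that bookkeeping (not Next's actual implementation; the helper names are made up):

```typescript
// Sketch of staleTime-style expiry: entries older than maxAgeMs are ignored.
type Entry<T> = { value: T; storedAt: number };

export const cache = new Map<string, Entry<unknown>>();

export function write<T>(key: string, value: T): void {
  cache.set(key, { value, storedAt: Date.now() });
}

export function readFresh<T>(key: string, maxAgeMs: number): T | undefined {
  const entry = cache.get(key);
  if (!entry) return undefined;
  if (Date.now() - entry.storedAt > maxAgeMs) {
    cache.delete(key); // expired: treat as a cache miss
    return undefined;
  }
  return entry.value as T;
}
```

A loader would call readFresh first and only hit the server on undefined, which is effectively what TanStack's staleTime and Next's staleTimes configure for you.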
Another solution on the Next side is to use server-side caches instead of client-side ones. That could look something like the following:
import { unstable_cacheLife as cacheLife } from "next/cache";

const getCount = async () => {
  "use cache";
  cacheLife("default");
  await new Promise(resolve => setTimeout(resolve, 3000));
  return 5;
};
The problem with this approach is that it requires specific infrastructure. It also still makes a roundtrip to the server, although because the response is cached, it returns quicker.
Conclusion
I would love to hear what you think. Do you see the power of using client-side caches in loaders, or will you stick with the other approaches?