Understanding and Implementing Next.js "use cache" Directive: A Deep Dive
Have you ever wondered how Next.js, particularly with React Server Components, magically caches your data using the use cache directive? This powerful feature can significantly boost your application's performance by avoiding redundant computations and data fetching. In this guide, we'll dissect this concept, explore its underlying mechanisms, and then implement a simplified version from scratch to solidify your understanding.
Who is this guide for?
While we touch on beginner concepts, this guide dives deep into implementation details. It's best suited for developers with some Next.js and JavaScript experience who want to understand the "magic" behind use cache.
What is the "use cache" directive?
The use cache directive is a signal within React Server Components that tells Next.js to memoize (cache) the result of an async function. Once a function marked with use cache is executed with a specific set of arguments, its return value is stored. Subsequent calls to the same function with the same arguments during the same rendering pass will return the cached result instead of re-executing the function.
This is particularly beneficial for:
- Expensive computations: Operations that consume significant CPU time.
- Data fetching: Preventing multiple identical API calls within a single request-response lifecycle.
- Improving application performance: Reducing latency and server load.
- Ensuring data consistency: Multiple components accessing the same data within a render will get the exact same object.
Here's a canonical example:
// app/utils/data.js
export async function getPokemonDetails(pokemonName) {
'use cache'; // Instructs Next.js to cache the result of this function
console.log(`Fetching details for ${pokemonName} from API...`);
// This API call will only happen once per pokemonName per request,
// even if getPokemonDetails(pokemonName) is called multiple times.
const res = await fetch(`https://pokeapi.co/api/v2/pokemon/${pokemonName}`);
if (!res.ok) {
throw new Error(`Failed to fetch ${pokemonName}`);
}
return res.json();
}
If multiple components in your React tree call getPokemonDetails('pikachu') during a single server render, the actual fetch will only occur once.
Core Caching Concepts in Next.js
While 'use cache' provides basic memoization, Next.js (and the React caching APIs it builds upon) offers more advanced caching capabilities, often used in conjunction with fetch or custom caching solutions:
- Basic Caching with 'use cache': As discussed, memoizes function results within a single server rendering lifecycle.
- Data Cache: Next.js extends the native fetch API to automatically cache requests. This is a more persistent cache that can be shared across requests and deployments (depending on configuration and data store).
- Tagged Caching & Revalidation: Allows you to associate cache entries with tags (e.g., using next: { tags: ['myTag'] } in a fetch call) and then revalidate (invalidate) those entries on demand (revalidateTag, revalidatePath) or after a time interval (e.g., next: { revalidate: 60 }).
Our implementation will focus on mimicking the directive's behavior and some of these advanced features like tagging and time-to-live (TTL) for a custom cache solution.
How does 'use cache' work conceptually?
Behind the scenes, the 'use cache' directive isn't native JavaScript syntax. It's a convention that Next.js's build system processes. This directive often signals the use of a higher-order function (HOF) that wraps your original function, imbuing it with caching capabilities.
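To make the idea concrete, here is a minimal, self-contained sketch of such a memoizing higher-order function. This is purely illustrative; Next.js's real wrapper is far more involved, and the names (memoizeAsync, slowDouble) are ours, not Next.js's:

```javascript
// Minimal memoizing higher-order function, in the spirit of what a
// 'use cache' transform might produce. Illustrative only.
function memoizeAsync(fn) {
  const cache = new Map();
  return async (...args) => {
    const key = JSON.stringify(args); // naive key; caveats discussed later
    if (cache.has(key)) return cache.get(key);
    const promise = fn(...args); // caching the promise also dedupes concurrent calls
    cache.set(key, promise);
    return promise;
  };
}

let calls = 0;
const slowDouble = memoizeAsync(async (n) => {
  calls += 1;
  return n * 2;
});

async function demo() {
  const a = await slowDouble(21);
  const b = await slowDouble(21); // same arguments: served from cache
  return { a, b, calls };
}

demo().then((r) => console.log(r)); // the wrapped function body ran only once
```

Note that caching the promise (rather than the resolved value) means two concurrent calls with the same arguments share a single in-flight request — a property the real request memoization also exhibits.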
We can simulate this transformation using a package like directive-to-hof.
First, install the package:
npm install directive-to-hof
# or
yarn add directive-to-hof
# or
pnpm add directive-to-hof
Then, you'd create a transformer. This build-time script would find functions with 'use cache' and wrap them:
// build-transformer.js
import { createDirectiveTransformer } from 'directive-to-hof';
// This transformer will convert our 'use cache' directive into actual caching code
const transformer = createDirectiveTransformer({
directive: 'use cache', // The directive string we're looking for
importPath: './cache.js', // Path to our custom caching implementation
importName: 'cacheWrapper', // The HOF that will provide caching
asyncOnly: true, // Ensure it only applies to async functions
});
// In a real build step, you'd apply this transformer to your source files.
// For example:
// const originalCode = "async function getData() { 'use cache'; ... }";
// const { contents } = await transformer(originalCode, { path: './app/utils/data.js' });
// console.log(contents);
When you write code like this in your-file.js:
export async function getPokemon() {
'use cache';
const res = await fetch('https://pokeapi.co/api/v2/pokemon');
return res.json();
}
The transformer would output something like this (conceptually):
import { cacheWrapper } from './cache.js'; // Our custom caching logic
const getPokemonOriginal = async () => {
const res = await fetch('https://pokeapi.co/api/v2/pokemon');
return res.json();
};
export const getPokemon = cacheWrapper(getPokemonOriginal);
Understanding AsyncLocalStorage
Before diving into our cache implementation, we must grasp AsyncLocalStorage. This Node.js API (from node:async_hooks) is crucial for maintaining context across asynchronous operations, especially in server environments like Next.js.
Think of AsyncLocalStorage as a way to carry data implicitly through a sequence of asynchronous calls without explicitly passing it down as arguments. It's like having a request-specific "bag" of data.
- You run a function within a new context, initializing its store.
- Any function (synchronous or asynchronous) called directly or indirectly within that run callback can access this store.
- The store is isolated to that specific asynchronous execution flow.
Here's a simplified example:
import { AsyncLocalStorage } from 'node:async_hooks';
const storage = new AsyncLocalStorage();
async function logWithValue(message) {
const store = storage.getStore(); // Retrieve data from the current context
console.log(`${store.requestId}: ${message} - Value: ${store.value}`);
}
async function firstAsyncOperation() {
await logWithValue('Inside firstAsyncOperation');
// Simulate more async work
await new Promise((resolve) => setTimeout(resolve, 50));
}
async function secondAsyncOperation() {
await logWithValue('Inside secondAsyncOperation');
}
async function mainOperation(requestId) {
// Start a new context with initial data for this specific operation
await storage.run({ requestId, value: Math.random() * 100 }, async () => {
await firstAsyncOperation();
await secondAsyncOperation();
await logWithValue('Finished mainOperation');
});
}
// Simulate two concurrent operations
mainOperation('Request-A');
mainOperation('Request-B');
// Example Output (order might vary for Request-A/B blocks, but internal order is maintained):
// Request-A: Inside firstAsyncOperation - Value: <random_value_A>
// Request-B: Inside firstAsyncOperation - Value: <random_value_B>
// Request-A: Inside secondAsyncOperation - Value: <random_value_A>
// Request-B: Inside secondAsyncOperation - Value: <random_value_B>
// Request-A: Finished mainOperation - Value: <random_value_A>
// Request-B: Finished mainOperation - Value: <random_value_B>
We'll use AsyncLocalStorage to track cache metadata (like tags and expiration times) associated with a specific call chain of our cached functions.
Implementing the Custom Cache System
Let's build our caching system step by step. Our system will reside in cache.js.
Step 1: Creating a Cache Storage with Expiration (MapWithTTL)
We need a storage mechanism that can automatically expire entries. A JavaScript Map is a good start, but we'll extend it to handle Time-To-Live (TTL).
// cache.js (Partial 1)
class MapWithTTL extends Map {
set(key, valueWithOptions) {
// Expect valueWithOptions to be { data: any, ttl?: number }
const { data, ttl = Infinity } = valueWithOptions; // Default TTL is forever
let expirationTime = Infinity;
if (ttl !== Infinity && typeof ttl === 'number' && ttl > 0) {
expirationTime = Date.now() + ttl;
}
// Store the actual data along with its expiration timestamp
super.set(key, { data, expirationTime });
return this;
}
get(key) {
const entry = super.get(key);
if (entry) {
// Check if the entry has expired
if (entry.expirationTime <= Date.now()) {
this.delete(key); // Expired, remove it
return undefined;
}
return entry.data; // Still valid, return the data
}
return undefined; // Not found
}
has(key) {
// `get` handles expiration logic, so we can reuse it
return this.get(key) !== undefined;
}
}
Example Usage of MapWithTTL:
const myCache = new MapWithTTL();
// Cache 'userData' for 5 seconds (5000 ms)
myCache.set('userData', { data: { name: 'Alice' }, ttl: 5000 });
// Cache 'config' indefinitely
myCache.set('config', { data: { theme: 'dark' } });
console.log(myCache.get('userData')); // { name: 'Alice' }
setTimeout(() => {
console.log(myCache.get('userData')); // undefined (if 5 seconds have passed)
console.log(myCache.has('userData')); // false
console.log(myCache.get('config')); // { theme: 'dark' } (still there)
}, 6000);
Step 2: Setting Up the Cache Context and Storage Instances
We'll use AsyncLocalStorage for request-scoped metadata and our MapWithTTL for the actual cache. We also need a way to map tags to cache keys.
// cache.js (Partial 2 - add this below MapWithTTL)
import { AsyncLocalStorage } from 'node:async_hooks';
import crypto from 'node:crypto'; // For generating unique IDs
// 1. Context Storage: Manages metadata (tags, TTL) for the current async flow
const cacheContext = new AsyncLocalStorage();
// 2. Main Cache: Stores the actual cached data using our TTL-enabled Map
const globalCache = new MapWithTTL();
// 3. Tag Mapping: Links tags to cache keys for invalidation
// A tag might map to multiple cache keys if different function calls use the same tag.
// So, a tag maps to a Set of cache keys.
const tagToCacheKeysMap = new Map(); // Map<string, Set<string>>
How they interact:
- cacheContext: When a wrapped function is called, we'll run it within a new cacheContext store. Helper functions like cacheTag will modify this store.
- globalCache: Stores cacheKey -> { data, expirationTime }.
- tagToCacheKeysMap: Stores tag -> Set of cacheKeys. When data is cached with a tag, we update this map. When invalidate(tag) is called, we use this map to find and delete entries from globalCache.
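Before wiring this into the wrapper, the tag bookkeeping can be sketched in isolation. This is a simplified stand-in for the logic we're about to build, with illustrative helper names (addTag, invalidateTag):

```javascript
// Standalone sketch of the tag -> Set-of-cache-keys bookkeeping.
const cache = new Map(); // cacheKey -> data
const tagToKeys = new Map(); // tag -> Set<cacheKey>

function addTag(tag, cacheKey) {
  if (!tagToKeys.has(tag)) tagToKeys.set(tag, new Set());
  tagToKeys.get(tag).add(cacheKey);
}

function invalidateTag(tag) {
  // Delete every cache entry associated with the tag, then drop the tag itself.
  for (const key of tagToKeys.get(tag) ?? []) cache.delete(key);
  tagToKeys.delete(tag);
}

cache.set('fn1:["pikachu"]', { name: 'pikachu' });
addTag('pokemon', 'fn1:["pikachu"]');
cache.set('fn1:["charmander"]', { name: 'charmander' });
addTag('pokemon', 'fn1:["charmander"]');

invalidateTag('pokemon');
console.log(cache.size); // 0 — both keys shared the tag, so both were evicted
```

The key design point: a tag maps to a Set of keys, so one invalidation call can evict many entries produced by different arguments (or different functions) that share the tag.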
Step 3: Implementing the Core Cache Wrapper (cacheWrapper)
This higher-order function will contain the main caching logic.
// cache.js (Partial 3 - add this below storage instances)
export function cacheWrapper(fn) {
// Generate a unique ID for this specific function being wrapped.
// This helps differentiate caches if multiple functions have the same arguments.
const functionId = crypto.randomUUID();
const cachedFunction = async (...args) => {
// Create a unique cache key based on the function ID and its arguments.
// JSON.stringify is a common way, but has limitations (e.g., order of keys in objects, undefined).
// For robust solutions, consider more stable serialization.
const argumentsKey = JSON.stringify(args);
const cacheKey = `${functionId}:${argumentsKey}`;
// Initialize a context for this specific call.
// This store will be accessible by cacheTag and cacheLife within fn's execution.
const currentCallContext = {
tags: new Set(), // A call can have multiple tags
ttl: undefined, // Default TTL will be Infinity unless overridden by cacheLife
};
return cacheContext.run(currentCallContext, async () => {
// 1. Check if data is already in cache and not expired
if (globalCache.has(cacheKey)) {
console.log(`💾 Cache hit for key: ${cacheKey}`);
return globalCache.get(cacheKey);
}
// 2. If not in cache (cache miss), execute the original function
console.log(`🔍 Cache miss! Executing function for key: ${cacheKey}`);
const result = await fn(...args);
// 3. Store the result in the cache
// The `ttl` and `tags` would have been set by `cacheLife` and `cacheTag`
// called within `fn`'s execution, populating `currentCallContext`.
if (result != null) {
// Avoid caching null or undefined unless intended
globalCache.set(cacheKey, {
data: result,
ttl: currentCallContext.ttl, // Uses ttl from context, or MapWithTTL's default
});
console.log(
`📝 Cached result for key: ${cacheKey} with TTL: ${
currentCallContext.ttl || 'Infinity'
}`
);
// Link tags to this cache key
if (currentCallContext.tags.size > 0) {
currentCallContext.tags.forEach((tag) => {
if (!tagToCacheKeysMap.has(tag)) {
tagToCacheKeysMap.set(tag, new Set());
}
tagToCacheKeysMap.get(tag).add(cacheKey);
console.log(`🏷️ Tagged key ${cacheKey} with: ${tag}`);
});
}
}
return result;
});
};
return cachedFunction;
}
Step 4: Adding Cache Control Functions (cacheTag, cacheLife, invalidate)
These functions will interact with the cacheContext and our storage maps.
// cache.js (Partial 4 - add this below cacheWrapper)
// Call this INSIDE a 'use cache' function to associate tags with its result.
export function cacheTag(...tags) {
const store = cacheContext.getStore();
if (!store) {
// This can happen if cacheTag is called outside a cacheWrapper's execution scope.
// In a real Next.js/React environment, this might throw or be a no-op.
throw new Error('cacheTag called outside of a cached function context.');
}
tags.forEach((tag) => store.tags.add(tag));
}
// Call this INSIDE a 'use cache' function to set a specific TTL for its result.
export function cacheLife(ttlInMilliseconds) {
const store = cacheContext.getStore();
if (!store) {
throw new Error('cacheLife called outside of a cached function context.');
}
if (typeof ttlInMilliseconds !== 'number' || ttlInMilliseconds <= 0) {
throw new RangeError(
'Invalid TTL value for cacheLife. Must be a positive number.'
);
}
store.ttl = ttlInMilliseconds;
}
// Call this to invalidate cache entries associated with a specific tag.
export function invalidate(tagToInvalidate) {
const cacheKeysToInvalidate = tagToCacheKeysMap.get(tagToInvalidate);
if (cacheKeysToInvalidate && cacheKeysToInvalidate.size > 0) {
console.log(`🗑️ Invalidating cache for tag: ${tagToInvalidate}`);
cacheKeysToInvalidate.forEach((cacheKey) => {
globalCache.delete(cacheKey);
console.log(` - Deleted key: ${cacheKey}`);
});
// Remove the tag itself from the mapping as all its keys are gone
tagToCacheKeysMap.delete(tagToInvalidate);
} else {
console.log(`🤷 No cache entries found for tag: ${tagToInvalidate}`);
}
}
The Complete cache.js Implementation
Let's assemble all the pieces into the final cache.js file.
// cache.js (Complete)
import { AsyncLocalStorage } from 'node:async_hooks';
import crypto from 'node:crypto';
class MapWithTTL extends Map {
set(key, valueWithOptions) {
const { data, ttl = Infinity } = valueWithOptions;
let expirationTime = Infinity;
if (ttl !== Infinity && typeof ttl === 'number' && ttl > 0) {
expirationTime = Date.now() + ttl;
}
super.set(key, { data, expirationTime });
return this;
}
get(key) {
const entry = super.get(key);
if (entry) {
if (entry.expirationTime <= Date.now()) {
this.delete(key);
return undefined;
}
return entry.data;
}
return undefined;
}
has(key) {
return this.get(key) !== undefined;
}
}
const cacheContext = new AsyncLocalStorage();
const globalCache = new MapWithTTL();
const tagToCacheKeysMap = new Map(); // Map<string, Set<string>>
export function cacheWrapper(fn) {
const functionId = crypto.randomUUID();
const cachedFunction = async (...args) => {
const argumentsKey = JSON.stringify(args);
const cacheKey = `${functionId}:${argumentsKey}`;
const currentCallContext = {
tags: new Set(),
ttl: undefined,
};
return cacheContext.run(currentCallContext, async () => {
if (globalCache.has(cacheKey)) {
console.log(`💾 Cache hit for key: ${cacheKey}`);
return globalCache.get(cacheKey);
}
console.log(`🔍 Cache miss! Executing function for key: ${cacheKey}`);
const result = await fn(...args);
if (result != null) {
globalCache.set(cacheKey, {
data: result,
ttl: currentCallContext.ttl,
});
console.log(
`📝 Cached result for key: ${cacheKey} with TTL: ${
currentCallContext.ttl || 'Infinity'
}`
);
if (currentCallContext.tags.size > 0) {
currentCallContext.tags.forEach((tag) => {
if (!tagToCacheKeysMap.has(tag)) {
tagToCacheKeysMap.set(tag, new Set());
}
tagToCacheKeysMap.get(tag).add(cacheKey);
console.log(`🏷️ Tagged key ${cacheKey} with: ${tag}`);
});
}
}
return result;
});
};
return cachedFunction;
}
export function cacheTag(...tags) {
const store = cacheContext.getStore();
if (!store) {
throw new Error('cacheTag called outside of a cached function context.');
}
tags.forEach((tag) => store.tags.add(tag));
}
export function cacheLife(ttlInMilliseconds) {
const store = cacheContext.getStore();
if (!store) {
throw new Error('cacheLife called outside of a cached function context.');
}
if (typeof ttlInMilliseconds !== 'number' || ttlInMilliseconds <= 0) {
throw new RangeError(
'Invalid TTL value for cacheLife. Must be a positive number.'
);
}
store.ttl = ttlInMilliseconds;
}
export function invalidate(tagToInvalidate) {
const cacheKeysToInvalidate = tagToCacheKeysMap.get(tagToInvalidate);
if (cacheKeysToInvalidate && cacheKeysToInvalidate.size > 0) {
console.log(`🗑️ Invalidating cache for tag: ${tagToInvalidate}`);
cacheKeysToInvalidate.forEach((cacheKey) => {
globalCache.delete(cacheKey);
console.log(` - Deleted key: ${cacheKey}`);
});
tagToCacheKeysMap.delete(tagToInvalidate);
} else {
console.log(`🤷 No cache entries found for tag: ${tagToInvalidate}`);
}
}
Putting It All Together: A Complete Example
Let's create an example file (app.js) that uses our caching system. This file would be the one processed by our hypothetical build transformer.
// app.js (This is the file you'd "compile" or run after transformation)
// For this example, we'll assume it's already transformed,
// so we'll import cacheWrapper directly for testing.
// In a real scenario with 'use cache', the transformer does this.
// For testing without the transformer, we manually import and wrap.
// If using the transformer, these imports are handled by it.
import { cacheWrapper, cacheTag, cacheLife, invalidate } from './cache.js';
// --- Original code that would have 'use cache' ---
async function _getPokemonData(pokemonName) {
// 'use cache'; // Directive would be here
// Use our cache control functions
cacheTag('pokemon', `pokemon-${pokemonName}`);
cacheLife(60 * 1000); // Cache for 1 minute
console.log(`Fetching ${pokemonName} data from API...`);
// Simulate API call
await new Promise((resolve) =>
setTimeout(resolve, 100 + Math.random() * 200)
);
return {
name: pokemonName,
id: Math.floor(Math.random() * 1000),
fetchedAt: new Date().toISOString(),
};
}
async function _getTrainerData(trainerId) {
// 'use cache';
cacheTag('trainer', `trainer-${trainerId}`);
cacheLife(5 * 60 * 1000); // Cache for 5 minutes
console.log(`Fetching trainer ${trainerId} data from API...`);
await new Promise((resolve) =>
setTimeout(resolve, 150 + Math.random() * 100)
);
return {
id: trainerId,
name: `Trainer ${trainerId}`,
team: [Math.random() > 0.5 ? 'pikachu' : 'charmander'],
fetchedAt: new Date().toISOString(),
};
}
// --- End of original code ---
// Manually wrap for this test since we're not running a full build transform
const getPokemonData = cacheWrapper(_getPokemonData);
const getTrainerData = cacheWrapper(_getTrainerData);
async function main() {
console.log('--- Scenario 1: Fetching Pikachu ---');
let pikachu = await getPokemonData('pikachu');
console.log('Fetched:', pikachu);
console.log(
'\n--- Scenario 2: Fetching Pikachu again (should be cached) ---'
);
pikachu = await getPokemonData('pikachu');
console.log('Fetched (cached):', pikachu);
console.log('\n--- Scenario 3: Fetching Charmander ---');
let charmander = await getPokemonData('charmander');
console.log('Fetched:', charmander);
console.log('\n--- Scenario 4: Fetching Trainer Ash ---');
let ash = await getTrainerData('Ash');
console.log('Fetched:', ash);
console.log('\n--- Scenario 5: Invalidating "pokemon-pikachu" tag ---');
invalidate('pokemon-pikachu');
console.log(
'\n--- Scenario 6: Fetching Pikachu again (should be a new fetch) ---'
);
pikachu = await getPokemonData('pikachu');
console.log('Fetched (after targeted invalidation):', pikachu);
console.log(
'\n--- Scenario 7: Fetching Charmander again (should be cached) ---'
);
charmander = await getPokemonData('charmander');
console.log('Fetched (cached):', charmander);
console.log('\n--- Scenario 8: Invalidating general "pokemon" tag ---');
invalidate('pokemon'); // This should invalidate Charmander too (and Pikachu if it was re-cached)
console.log(
'\n--- Scenario 9: Fetching Charmander again (should be a new fetch) ---'
);
charmander = await getPokemonData('charmander');
console.log('Fetched (after general invalidation):', charmander);
console.log(
'\n--- Scenario 10: Waiting for Pikachu to expire (1 minute) ---'
);
// Re-fetch Pikachu to get it into cache with its 1-min TTL
await getPokemonData('pikachu');
console.log('Pikachu re-cached. Waiting 65 seconds for TTL expiration...');
await new Promise((resolve) => setTimeout(resolve, 65 * 1000));
console.log('\n--- Scenario 11: Fetching Pikachu after TTL expiration ---');
pikachu = await getPokemonData('pikachu');
console.log('Fetched (after TTL):', pikachu);
}
main().catch(console.error);
To run this example:
- Save the complete cache.js code into a file named cache.js.
- Save the app.js code above into a file named app.js in the same directory.
- Because both files use ES module syntax, add "type": "module" to a package.json in that directory (or rename the files with the .mjs extension and adjust the import).
- Run node app.js from your terminal in that directory.
You will observe console logs demonstrating cache hits, misses, tagging, invalidation, and TTL expiration.
How to Use This with a Build Step (Conceptual)
In a real Next.js project, you wouldn't manually wrap functions. The 'use cache' directive would be processed by the build system. To simulate this:
- Create a build script (build-app.js):
// build-app.js
import { createDirectiveTransformer } from 'directive-to-hof';
import { readFile, writeFile, mkdir } from 'node:fs/promises';
import path from 'node:path';

const inputFile = process.argv[2]; // e.g., './src/app-source.js'
if (!inputFile) {
  console.error('Please provide an input file.');
  process.exit(1);
}
const outputDir = './dist';
const outputFilename = path.basename(inputFile);
const outputFile = path.join(outputDir, outputFilename);

const transformer = createDirectiveTransformer({
  directive: 'use cache',
  importPath: '../cache.js', // Relative path from output file to cache.js
  importName: 'cacheWrapper',
  asyncOnly: true,
});

async function build() {
  try {
    await mkdir(outputDir, { recursive: true });
    const code = await readFile(inputFile, 'utf-8');
    // Provide the path so imports resolve correctly
    const { contents } = await transformer(code, { path: inputFile });
    await writeFile(outputFile, contents, 'utf-8');
    console.log(`Successfully transformed ${inputFile} to ${outputFile}`);
    console.log(`Run with: node ${outputFile}`);
  } catch (error) {
    console.error('Build failed:', error);
  }
}
build();
- Create your source file with 'use cache' (e.g., src/app-source.js): This would be similar to app.js, but with _getPokemonData actually containing 'use cache'; and not being manually wrapped. The cacheWrapper import would be handled by the transformer.
// src/app-source.js
// Note: For this to work with the build script, cache.js should be in the root,
// or importPath in build-app.js needs to be adjusted.
// Let's assume cache.js is in the project root, and src/ is where app-source.js is.
// Then importPath should be '../cache.js' as set in build-app.js.
import { cacheTag, cacheLife, invalidate } from '../cache.js'; // These are still needed.

async function getPokemonData(pokemonName) {
  'use cache'; // The magic directive!
  cacheTag('pokemon', `pokemon-${pokemonName}`);
  cacheLife(60 * 1000);
  console.log(`Fetching ${pokemonName} data from API...`);
  await new Promise((resolve) =>
    setTimeout(resolve, 100 + Math.random() * 200)
  );
  return {
    name: pokemonName,
    id: Math.floor(Math.random() * 1000),
    fetchedAt: new Date().toISOString(),
  };
}

// ... (rest of the data functions and main() using 'use cache')
// For a runnable example, copy _getTrainerData and main() from the previous app.js,
// renaming _getTrainerData to getTrainerData and adding 'use cache'.
// Example main (simplified for this snippet)
async function main() {
  let p = await getPokemonData('bulbasaur');
  console.log(p);
  p = await getPokemonData('bulbasaur'); // should hit cache
  console.log(p);
}
main();
- Add a script to package.json ("type": "module" is important for using import/export; it lives in prose here because JSON does not allow comments):
{
  "type": "module",
  "scripts": {
    "build": "node build-app.js ./src/app-source.js"
  },
  "dependencies": {
    "directive-to-hof": "^1.0.0"
  }
}
- Run the build and then the compiled file:
npm run build
node ./dist/app-source.js
Key Takeaways & Considerations
- 'use cache' is Syntactic Sugar: It relies on a build transformation to wrap functions with caching logic, often using a higher-order function.
- AsyncLocalStorage is Crucial: For server-side rendering in Node.js environments, it enables context propagation through asynchronous operations, allowing functions like cacheTag and cacheLife to affect the correct cache entry.
- Cache Key Generation: JSON.stringify(args) is simple but has limitations (e.g., object key order, undefined values, functions, Symbols). More robust serialization might be needed for complex arguments.
- Scope of Cache: Our globalCache is global to the Node.js process. In Next.js, 'use cache' is typically request-scoped memoization. The more persistent Data Cache (for fetch) is different. For a true request-scoped cache like 'use cache', you'd often clear or use a new Map instance per request, or integrate AsyncLocalStorage even more deeply into the cacheWrapper to hold the cache store itself.
- Advanced Features: We've implemented tagging, TTL, and invalidation, which are essential for managing a cache effectively.
- Real-World Next.js/React Caching: The actual implementation in React (the cache function) and Next.js is more deeply integrated with React's rendering lifecycle and server infrastructure. It handles request memoization by default. Our example builds a more generic caching utility inspired by it.
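The cache-key caveat above is easy to demonstrate: JSON.stringify is sensitive to object key order, so logically identical arguments can miss the cache. Below is a sketch of a key-order-stable serializer — illustrative only, not production-grade (it ignores Dates, Maps, Symbols, and cyclic references):

```javascript
// Same logical object, different JSON.stringify output:
console.log(JSON.stringify({ a: 1, b: 2 })); // {"a":1,"b":2}
console.log(JSON.stringify({ b: 2, a: 1 })); // {"b":2,"a":1}

// Stable variant: serialize object keys in sorted order.
function stableStringify(value) {
  if (Array.isArray(value)) {
    return `[${value.map(stableStringify).join(',')}]`;
  }
  if (value !== null && typeof value === 'object') {
    const entries = Object.keys(value)
      .sort()
      .map((k) => `${JSON.stringify(k)}:${stableStringify(value[k])}`);
    return `{${entries.join(',')}}`;
  }
  return JSON.stringify(value); // primitives
}

console.log(
  stableStringify({ b: 2, a: 1 }) === stableStringify({ a: 1, b: 2 })
); // true — both orderings now produce the same cache key
```

Swapping this in for JSON.stringify inside cacheWrapper would make cache keys insensitive to object key order, at the cost of a slower serialization step.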
This deep dive provides a solid foundation for understanding how such caching directives can be implemented. While our version is simplified, it captures the core mechanics involved in directive-based caching, context management with AsyncLocalStorage, and common caching features.