
feat: implement cacheForRequest() per-request factory cache#646

Open
JamesbbBriz wants to merge 1 commit into cloudflare:main from JamesbbBriz:feat/cache-for-request

Conversation

@JamesbbBriz

Summary

Implements cacheForRequest() from RFC #623 — a per-request factory cache that lazily initializes values on first call and returns the cached result for subsequent calls within the same request.

API

import { cacheForRequest } from "vinext/cache-for-request";

const getPrisma = cacheForRequest(() => {
  const pool = new Pool({ connectionString: env.HYPERDRIVE.connectionString });
  return new PrismaClient({ adapter: new PrismaPg(pool) });
});

// In any server code:
const prisma = getPrisma(); // first call creates, subsequent calls reuse

Key design decisions:

  • Factory function reference = cache key (no string keys, no cross-module collision)
  • Async factories cache the Promise itself (concurrent-safe: two await getDb() share one invocation)
  • Outside request scope: factory runs every time, no caching (safe for tests/build-time)

Changes

File — Change
shims/cache-for-request.ts — New: cacheForRequest() implementation (17 lines of logic)
shims/unified-request-context.ts — Add requestCache: WeakMap<Function, unknown> field + shallow-clone comment sync
package.json — Add "./cache-for-request" export

Blast radius

Zero. New file + one new field with new WeakMap() default (~0.1μs per request). No existing behavior changes.

Test plan

Tested with Prisma v7 + Hyperdrive on Workers. Happy to add unit tests — let me know the preferred test file location.

Addresses #623 · Relates to #537


@chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: f2f1d15111


Comment on lines +55 to +57
if (!isInsideUnifiedScope()) {
  // Outside request scope: no caching, run factory every time.
  return factory();


P2 Badge Cache in execution-context request scopes

cacheForRequest() disables caching whenever isInsideUnifiedScope() is false, but Pages Router middleware runs in a request scope without unified ALS (runMiddleware() only wraps _runWithExecutionContext, and prod server currently calls runMiddleware(webRequest, undefined)). In that path, repeated calls within a single request always re-run the factory, so this API does not provide the advertised per-request behavior for middleware/server code on Pages Router production flows.


@JamesbbBriz
Author

Re: middleware scope — this is intentional. Caching without a proper request scope would risk leaking state across requests, which is worse than re-running the factory. Pages Router middleware is the only path without unified scope; wrapping it would be a separate middleware pipeline change. The fallback (factory runs every time) is safe and predictable.

Comment on lines +24 to +27
"./cache-for-request": {
  "types": "./dist/shims/cache-for-request.d.ts",
  "import": "./dist/shims/cache-for-request.js"
},
Collaborator


can this be made available through vinext/cache?

@james-elicx
Collaborator

/bigbonk review

@pkg-pr-new

pkg-pr-new bot commented Mar 23, 2026


npm i https://pkg.pr.new/vinext@646

commit: 6a92c4d

Contributor

@ask-bonk bot left a comment


Review: cacheForRequest() per-request factory cache

Clean implementation — the core logic is simple and correct. A few things to address before merge:

Must-fix

  1. Missing resolve.alias entry in the Vite plugin. The documented import path (vinext/cache-for-request) won't resolve during dev because it's not in the alias map at index.ts:1653-1663. The package.json exports only take effect for production builds / external consumers. During dev, Vite uses resolve.alias, and all other vinext/* internal shims are registered there. Without this, importing from vinext/cache-for-request in user code during vite dev will fail to resolve.

  2. Missing tests. The PR description says "happy to add unit tests" — they should be included before merge. The existing tests/unified-request-context.test.ts is the natural home. At minimum:

    • Factory runs every time outside unified scope (no caching)
    • Factory runs once per request inside unified scope
    • Different factories get different cached values within the same request
    • Different requests get independent cached values
    • Async factory: the Promise is cached (concurrent await shares one invocation)
    • Nested scopes (runWithUnifiedStateMutation) share the same cache
  3. Maintainer feedback on export path. @james-elicx asked if this can be vinext/cache instead of vinext/cache-for-request. That conflicts with the existing next/cache shim at shims/cache.ts (mapped via ./shims/* wildcard), so it would need the file renamed or a re-export. This needs resolution before merge.
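The async-factory case in the test list above (the Promise is cached, so concurrent awaits share one invocation) can be sketched in isolation. This is a hedged illustration only: a local WeakMap and a `cached` helper stand in for the real request context and `cacheForRequest`:

```typescript
// Sketch: caching the Promise itself makes concurrent awaits share one
// invocation. A bare WeakMap stands in for the per-request context.
const cache = new WeakMap<() => Promise<number>, Promise<number>>();
let invocations = 0;

const getDb = async (): Promise<number> => {
  invocations++;
  return 7;
};

function cached(factory: () => Promise<number>): Promise<number> {
  const hit = cache.get(factory);
  if (hit) return hit;
  const p = factory();
  // Stored before it settles, so a second caller gets the same Promise.
  cache.set(factory, p);
  return p;
}

// Both calls are issued before either resolves, yet the factory runs once.
const [a, b] = await Promise.all([cached(getDb), cached(getDb)]);
```

The ordering matters: `cache.set` runs synchronously before any await, which is what makes two overlapping `await getDb()` calls concurrent-safe.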

Nits

  1. The requestCache field is not asserted in the existing getRequestContext defaults test (unified-request-context.test.ts:28-44). Even without the new test file, the existing defaults test should be updated to verify ctx.requestCache is a WeakMap.

  2. WeakMap<Function, unknown> — the Function type is the global Function interface which is very broad. Consider WeakMap<WeakKey, unknown> (since that's what WeakMap actually requires) or WeakMap<(...args: any[]) => any, unknown> for slightly tighter typing. Minor — the runtime behavior is fine either way.

* @module
*/

import { getRequestContext, isInsideUnifiedScope } from "./unified-request-context.js";
Contributor


This import uses a relative path (./unified-request-context.js), which is correct for the shims directory. However, the Vite plugin's resolve.alias map (index.ts:1653-1663) does not include a "vinext/cache-for-request" entry.

All other vinext/* shims are registered in that alias map so they resolve correctly during dev. Without this entry, import { cacheForRequest } from "vinext/cache-for-request" in user code will fail during vite dev.

Needs a corresponding entry:

"vinext/cache-for-request": path.join(shimsDir, "cache-for-request"),

in the alias map alongside the other vinext/* entries.


// ── cache-for-request.ts ──────────────────────────────────────────
/** Per-request cache for cacheForRequest(). Keyed by factory function reference. */
requestCache: WeakMap<Function, unknown>;
Contributor


Nit: Function is the broad global interface. Since WeakMap keys just need to be objects, and the actual usage is always () => T factory functions, a narrower type would be slightly more precise:

Suggested change
requestCache: WeakMap<Function, unknown>;
requestCache: WeakMap<(...args: any[]) => any, unknown>;

Minor — the runtime behavior is identical.

* @param factory - Function that creates the value. Called at most once per request.
* @returns A function with the same return type that caches the result per request.
*/
export function cacheForRequest<T>(factory: () => T): () => T {
Contributor


The implementation is clean and correct. One thing to consider: if a factory throws, the error is not cached — subsequent calls will re-invoke the factory. This is actually good behavior (retry-on-error), but it's worth documenting explicitly since the async case differs subtly: a rejected Promise will be cached (because cache.set(factory, value) stores the Promise before it rejects).

For async factories that can fail, users would get a permanently cached rejected Promise within that request. Consider whether you want to handle this:

const value = factory();
if (value instanceof Promise) {
  // Self-healing: delete from cache if the promise rejects
  value.catch(() => cache.delete(factory));
}
cache.set(factory, value);

This is a design decision rather than a bug — either behavior is defensible. But the current JSDoc says "the returned Promise is cached, so concurrent await calls within the same request share one invocation" without mentioning the rejection case.

"types": "./dist/index.d.ts",
"import": "./dist/index.js"
},
"./cache-for-request": {
Contributor


@james-elicx asked if this should be vinext/cache instead. Note that vinext/shims/cache already resolves (via the ./shims/* wildcard) to the next/cache shim (shims/cache.ts). So "./cache" as an export key would be ambiguous or would shadow it.

Options:

  1. Keep ./cache-for-request (current) — clear, no ambiguity
  2. Re-export cacheForRequest from shims/cache.ts so it's available as import { cacheForRequest } from "next/cache" or vinext/shims/cache
  3. Use ./cache but rename the file to avoid confusion

This needs a decision from the maintainer before merge.

@ask-bonk
Contributor

ask-bonk bot commented Mar 23, 2026

Review submitted. Here's a summary of what I flagged:

Must-fix (3 items)

  1. Missing resolve.alias entry — vinext/cache-for-request is not in the Vite plugin's alias map (index.ts:1653-1663), so the documented import path will fail during vite dev. All other vinext/* shims are registered there.

  2. Missing tests — The PR description acknowledges this. I outlined the minimum test cases: outside-scope behavior, per-request caching, factory isolation, cross-request isolation, async Promise caching, and nested scope sharing.

  3. Export path decision — The maintainer (@james-elicx) asked if this should be vinext/cache instead of vinext/cache-for-request. This conflicts with the existing next/cache shim, so it needs a design decision.

Design consideration

  1. Rejected Promise caching — Async factories that throw will have the rejected Promise cached for the rest of the request, with no self-healing. Sync factory errors are fine (not cached). I suggested an optional catch-based cleanup pattern and noted this should at minimum be documented.

Nits

  1. The existing getRequestContext defaults test should assert the new requestCache field.
  2. WeakMap<Function, unknown> could use a narrower type than the broad Function interface.


Implements the cacheForRequest API discussed in cloudflare#623.

API: import { cacheForRequest } from 'vinext/cache'

- Factory function reference is the cache key (no string collision)
- Async factories cache the Promise; rejected Promises clear their entry
  so subsequent calls within the same request can retry
- Outside request scope: factory runs every time (no caching)
- Public barrel at src/cache.ts, separate from the next/cache shim
- Vite dev alias registered in index.ts
@JamesbbBriz force-pushed the feat/cache-for-request branch from f2f1d15 to 6a92c4d on March 24, 2026
@JamesbbBriz
Author

Addressed all feedback:

  • vinext/cache — public entry is now vinext/cache via a separate barrel (src/cache.ts), next/cache shim untouched.
  • Vite alias — added vinext/cache to the dev alias map in index.ts.
  • Rejected Promises — async factories now clear their cached entry on rejection (with identity check to avoid clearing a newer retry).
  • Type narrowing — WeakMap<Function, unknown> is now WeakMap<(...args: any[]) => any, unknown>.
  • Tests — 6 tests covering: no-cache outside scope, same-request caching, separate factories, request isolation, nested scope sharing, async reject + retry.
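The rejection self-healing with an identity check, as described in the bullets above, could look roughly like this. It is a hedged sketch, not the PR's actual code: `cacheAsync` and the bare WeakMap are hypothetical stand-ins for the shim's internals:

```typescript
// Sketch: cache an async factory's Promise, but clear the entry on
// rejection so the next call within the same request can retry.
type Factory = (...args: any[]) => any;

export function cacheAsync<T>(
  cache: WeakMap<Factory, unknown>,
  factory: () => Promise<T>,
): Promise<T> {
  const hit = cache.get(factory);
  if (hit !== undefined) return hit as Promise<T>;
  const value = factory();
  cache.set(factory, value);
  value.catch(() => {
    // Cleanup only: the caller still observes the rejection via `value`.
    // Identity check: don't delete a newer Promise installed by a retry.
    if (cache.get(factory) === value) cache.delete(factory);
  });
  return value;
}
```

Without the identity check, a slow first Promise rejecting late could evict a fresh retry's entry, causing an extra factory invocation.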

@james-elicx
Collaborator

/bigbonk review

Contributor

@ask-bonk bot left a comment


Review: cacheForRequest() per-request factory cache

Clean implementation. The core design is sound — using function identity as cache key avoids string collision, the WeakMap ensures no memory leaks, and the rejected-Promise self-healing is a nice touch. The Vite alias entry was addressed. Tests cover the important scenarios.

Two things to fix before merge, plus a few nits.

Not in diff but should be addressed

The existing createRequestContext defaults test in tests/unified-request-context.test.ts:513-529 asserts every field — but doesn't assert the new requestCache field. Add expect(ctx.requestCache).toBeInstanceOf(WeakMap); after line 528 to keep the test exhaustive.

@@ -0,0 +1,105 @@
import { describe, it, expect, vi } from "vitest";
Contributor


This imports from vitest, but all other test files in the repo import from vite-plus/test (the project uses Vite+ which bundles Vitest). See tests/unified-request-context.test.ts:1, tests/shims.test.ts:1, tests/routing.test.ts:1.

Suggested change
import { describe, it, expect, vi } from "vitest";
import { describe, it, expect, vi } from "vite-plus/test";

Comment on lines +68 to +75
const ctx = createRequestContext();
await runWithRequestContext(ctx, async () => {
const outer = get();
// Simulate a nested scope (e.g. runWithCacheState)
const inner = await runWithRequestContext(
{ ...ctx, dynamicUsageDetected: true },
() => get(),
);
Contributor


This test manually constructs { ...ctx, dynamicUsageDetected: true } and calls runWithRequestContext directly, but the real production code path for nested scopes goes through runWithUnifiedStateMutation. The shallow-clone sharing of requestCache is documented as intentional in unified-request-context.ts:140 — but the test doesn't exercise that code path.

Consider using runWithUnifiedStateMutation here instead, which is what the actual shims use:

Suggested change
const ctx = createRequestContext();
await runWithRequestContext(ctx, async () => {
const outer = get();
// Simulate a nested scope (e.g. runWithCacheState)
const inner = await runWithRequestContext(
{ ...ctx, dynamicUsageDetected: true },
() => get(),
);
const inner = await runWithUnifiedStateMutation(
(childCtx) => { childCtx.dynamicUsageDetected = true; },
() => get(),
);

This also means adding runWithUnifiedStateMutation to the import from unified-request-context.

// so subsequent calls within the same request can retry.
if (value instanceof Promise) {
cache.set(factory, value);
(value as Promise<unknown>).catch(() => {
Contributor


Nit: the .catch() handler swallows the rejection silently. This is correct — the caller observes the rejection via the returned Promise (line 86). But someone reading this later might think the error is eaten. A brief comment would help:

Suggested change
(value as Promise<unknown>).catch(() => {
(value as Promise<unknown>).catch(() => {
// Clean up only — the caller still observes the rejection via the
// returned Promise. This just ensures a retry is possible.

/**
* Create a request-scoped cached version of a factory function.
*
* @param factory - Function that creates the value. Called at most once per request.
Contributor


Nit: JSDoc says "Called at most once per request", which is accurate for sync factories but not for async ones that reject (the cache clears, so the factory can be called again). Consider: "Called at most once per request for sync factories; async factories that reject are retried on next call."

@ask-bonk
Contributor

ask-bonk bot commented Mar 24, 2026

Review submitted successfully. Here's a summary of what I flagged:

Must-fix (1 item)

  1. Wrong import source in test file — tests/cache-for-request.test.ts imports from vitest but the repo standard is vite-plus/test. Every other test file uses the Vite+ bundled import. Suggested the fix.

Should-fix (2 items)

  1. Nested scope test doesn't exercise the real code path — The "shares cache across nested unified scopes" test manually constructs a child context with { ...ctx } + runWithRequestContext, but production nested scopes go through runWithUnifiedStateMutation (which does its own shallow clone). Suggested switching to runWithUnifiedStateMutation to test the actual path.

  2. Missing requestCache assertion in existing defaults test — tests/unified-request-context.test.ts:513-529 asserts every field of createRequestContext() defaults except the new requestCache. Should add expect(ctx.requestCache).toBeInstanceOf(WeakMap).

Nits (2 items)

  1. Clarify .catch() intent — The rejection handler at line 75 could look like it swallows errors. Suggested a comment clarifying it's only for cache cleanup; the caller still observes the rejection.

  2. JSDoc accuracy for async retry — The @param says "Called at most once per request" which isn't strictly true for async factories that reject, since they retry after cache cleanup.

