Next.js Performance: RSC, Streaming, and Caching Patterns
How Next.js (App Router) affects performance: server components, client boundaries, streaming, caching, and avoiding accidental JS bloat.
Frontend Interview Team
March 01, 2026
What you’ll learn
- Why Server Components can reduce client JS
- How client boundaries increase hydration cost
- Practical caching strategies in Next.js-style apps
30‑second interview answer
In Next.js App Router, Server Components let you keep rendering and data fetching on the server, which can reduce client JavaScript and improve INP. Performance issues often come from accidentally making large parts of the tree client-side ("use client"), heavy hydration, and poor caching. The goal is to keep client boundaries small, stream where possible, and cache data and pages appropriately.
The big lever: client boundaries
Marking a top-level layout with "use client" pulls every module it imports into the client bundle, so large parts of the tree get shipped and hydrated even if only one widget is interactive.
Rule: keep "use client" as low in the tree as possible.
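A minimal sketch of that rule (file paths, component names, and the `getProduct` helper are hypothetical): the page stays a Server Component, and only the small interactive leaf opts into the client bundle.

```tsx
// app/product/page.tsx — stays a Server Component (no "use client"):
// it renders to HTML on the server and ships no component JS itself.
import { LikeButton } from "./LikeButton";

declare function getProduct(): Promise<{
  id: string;
  name: string;
  description: string;
}>;

export default async function ProductPage() {
  const product = await getProduct(); // hypothetical server-side fetch
  return (
    <article>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
      {/* Only this leaf crosses the client boundary. */}
      <LikeButton productId={product.id} />
    </article>
  );
}
```

```tsx
// app/product/LikeButton.tsx — the only client module in this tree.
// "use client" belongs here, not on the page or layout above it.
"use client";
import { useState } from "react";

export function LikeButton({ productId }: { productId: string }) {
  const [liked, setLiked] = useState(false);
  return (
    <button onClick={() => setLiked(!liked)}>
      {liked ? "Liked" : "Like"}
    </button>
  );
}
```

If the `"use client"` directive were moved up into the page or layout, both files (and everything they import) would become client code.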
Streaming (why it helps)
Streaming sends HTML progressively so users see content sooner.
It helps LCP if:
- above-the-fold content can render early
- slow data is deferred
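Both conditions map directly onto React's `Suspense` in a Server Component tree. A sketch, assuming a hypothetical slow `fetchReviews` data source: the shell (heading plus fallback) goes out in the first HTML chunk, and the reviews subtree streams in when its data resolves.

```tsx
import { Suspense } from "react";

declare function fetchReviews(): Promise<{ id: string; text: string }[]>;

// Renders immediately: the heading and fallback are in the first chunk,
// so above-the-fold content does not wait on the slow data.
export default function ProductPage() {
  return (
    <main>
      <h1>Product</h1>
      {/* Slow data is deferred: React streams this subtree later. */}
      <Suspense fallback={<p>Loading reviews…</p>}>
        <SlowReviews />
      </Suspense>
    </main>
  );
}

// An async Server Component; its await does not block the shell.
async function SlowReviews() {
  const reviews = await fetchReviews();
  return (
    <ul>
      {reviews.map((r) => (
        <li key={r.id}>{r.text}</li>
      ))}
    </ul>
  );
}
```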
Caching (principles)
- Cache at the edge when content is shared
- Revalidate rather than rebuilding everything
- Avoid caching personalized pages publicly
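The principles above can be sketched as a small helper that picks a `Cache-Control` header by content type (the function and its categories are illustrative, not a Next.js API): shared content gets a public, edge-cacheable header that revalidates in the background instead of forcing a rebuild, while personalized content never enters a shared cache.

```typescript
type ContentKind = "shared" | "personalized";

// Illustrative only: maps a content kind to a Cache-Control header.
// s-maxage lets shared caches (CDN/edge) keep the response;
// stale-while-revalidate serves the cached copy while refreshing it;
// private, no-store keeps personalized pages out of shared caches.
function cacheControlFor(kind: ContentKind): string {
  if (kind === "personalized") {
    return "private, no-store";
  }
  return "public, s-maxage=60, stale-while-revalidate=300";
}
```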
Common mistakes
- Too many client components
- Shipping large libraries for small UI features
- Fetching data on the client when it could be fetched on the server
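Moving a fetch into a Server Component often fixes two of these mistakes at once: the fetching code never ships to the client, and Next.js can cache and revalidate the request on the server. A sketch, assuming a hypothetical `/api/products` endpoint:

```tsx
// A Server Component: the fetch runs on the server, its result is
// rendered to HTML, and no data-fetching JS ships to the browser.
export default async function ProductList() {
  // Next.js extends fetch: this response is cached on the server and
  // revalidated at most once every 60 seconds.
  const res = await fetch("https://example.com/api/products", {
    next: { revalidate: 60 },
  });
  const products: { id: string; name: string }[] = await res.json();
  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```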
Production rule of thumb
- Server-render by default.
- Minimize client boundaries.
- Measure hydration + long tasks.
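For "measure hydration + long tasks", one browser-side option is a `PerformanceObserver` on the Long Tasks API (browser-only; not available in Node, and support varies by browser). Hydration-heavy pages tend to produce long tasks right after load, which this surfaces:

```typescript
// Browser-only sketch: logs main-thread tasks longer than 50 ms.
if (typeof PerformanceObserver !== "undefined") {
  const observer = new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      // entry.duration is the task's blocking time in milliseconds.
      console.log(`long task: ${entry.duration.toFixed(0)} ms`);
    }
  });
  // buffered: true also reports long tasks that ran before observe().
  observer.observe({ type: "longtask", buffered: true });
}
```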
Quick recap
- Next.js performance is mostly about JS shipped + hydration work.
- Server Components and streaming can improve real metrics.
- Caching strategy must match content type.
Performance checklist (copy/paste)
- Keep "use client" boundaries as small as possible
- Avoid shipping heavy libraries for small UI needs
- Stream/defer slow data when possible
- Add caching/revalidation where content is shared