Mar 2026
Next.js
Deep
7 min read

Server Components Data Fetching Patterns

Parallel fetching with Promise.all, React.cache for deduplication, and Suspense boundaries to avoid request waterfalls in Server Components.

nextjs
server-components
data-fetching
parallel
react-cache
suspense
performance

Server Components promise zero client-side data fetching, but naive implementations create request waterfalls that are worse than the old Client Component approach. Parallel fetching, React.cache, and Suspense boundaries are the tools that make Server Components actually faster.

The Waterfall Problem

Write sequential awaits in a Server Component and you've recreated the classic request waterfall, just moved to the server:

// ❌ Sequential: 3 requests, 3× latency
export default async function DashboardPage() {
  const user = await getUser(); // 100ms
  const posts = await getPosts(user.id); // blocked, +100ms
  const comments = await getComments(); // blocked, +100ms

  // Total: 300ms
  return <Dashboard user={user} posts={posts} comments={comments} />
}

Each await blocks the next request from starting. This is the waterfall: requests that could run in parallel are forced to run in series because of how you've structured your code.
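The cost difference is easy to observe with plain promises. A minimal timing sketch, using hypothetical stand-in fetchers (the delay lengths are illustrative, not real latencies):

```typescript
function delay<T>(ms: number, value: T): Promise<T> {
  return new Promise(resolve => setTimeout(() => resolve(value), ms));
}

// Hypothetical stand-ins for getUser/getPosts, each ~50ms.
const getUser = () => delay(50, { id: 'u1', name: 'Ada' });
const getPosts = () => delay(50, [{ id: 'p1', title: 'Hello' }]);

// Sequential awaits: latencies add up.
async function sequential(): Promise<number> {
  const start = Date.now();
  await getUser();
  await getPosts();
  return Date.now() - start; // roughly the sum of both delays
}

// Start both first, then await: latencies overlap.
async function parallel(): Promise<number> {
  const start = Date.now();
  await Promise.all([getUser(), getPosts()]);
  return Date.now() - start; // roughly the longest single delay
}
```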

Solution 1: Initiate All, Await Together

Call your async functions first, then await them with Promise.all:

// ✅ Parallel: 3 requests, 1× latency
export default async function DashboardPage() {
  // Initiate all requests (these are independent here, unlike getPosts(user.id) above)
  const userPromise = getUser();
  const postsPromise = getPosts();
  const commentsPromise = getComments();

  // Await all at once
  const [user, posts, comments] = await Promise.all([
    userPromise,
    postsPromise,
    commentsPromise,
  ]);

  // Total: ~100ms (longest request)
  return <Dashboard user={user} posts={posts} comments={comments} />
}

The key insight: fetch calls and database queries start when you call the function, not when you await. By storing the promises and awaiting them together, all requests run in parallel.
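One caveat: Promise.all rejects as soon as any promise rejects, which fails the whole page. When some of the data is optional, Promise.allSettled lets the rest render. A sketch with hypothetical stand-in fetchers:

```typescript
// Hypothetical fetchers: posts succeed, the ads service is down.
const getPosts = async () => [{ id: 'p1', title: 'Hello' }];
const getAds = async (): Promise<string[]> => {
  throw new Error('ads service down');
};

async function loadDashboardData() {
  // Both requests start immediately and settle together.
  const [postsResult, adsResult] = await Promise.allSettled([
    getPosts(),
    getAds(),
  ]);

  // Required data still fails the page if it can't load.
  if (postsResult.status === 'rejected') throw postsResult.reason;

  return {
    posts: postsResult.value,
    // Optional data degrades to an empty list instead of failing the page.
    ads: adsResult.status === 'fulfilled' ? adsResult.value : [],
  };
}
```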

Solution 2: React.cache for Deduplication

Next.js 16 automatically deduplicates identical fetch requests within a render tree, but for database queries or external API calls, use React.cache:

// lib/user.ts
import { cache } from 'react';

export const getUser = cache(async (id: string) => {
  return db.users.findUnique({ where: { id } });
});

// Multiple components call this, only one query runs
export async function UserProfile({ userId }: { userId: string }) {
  const user = await getUser(userId); // Query runs once
  return <div>{user.name}</div>;
}

export async function UserSettings({ userId }: { userId: string }) {
  const user = await getUser(userId); // Cached, no duplicate query
  return <div>{user.settings}</div>;
}

React.cache memoizes the function result for the duration of a single request. Call getUser(id) ten times in the same render tree and the query executes once. This removes the need to fetch once at the top and drill the data down through props: each component just fetches what it needs.
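Conceptually, within one request React.cache behaves like a memo map keyed by arguments. A simplified model (not the real implementation, which also scopes the map to a single server request):

```typescript
// Simplified model of React.cache: memoize the returned promise per argument.
function requestCache<A, R>(fn: (arg: A) => Promise<R>) {
  const results = new Map<A, Promise<R>>();
  return (arg: A): Promise<R> => {
    if (!results.has(arg)) {
      results.set(arg, fn(arg)); // first call with this argument runs fn
    }
    return results.get(arg)!; // later calls reuse the same promise
  };
}

let queries = 0;
const getUser = requestCache(async (id: string) => {
  queries += 1; // counts underlying "database" hits
  return { id, name: `user-${id}` };
});
```

Note the arguments are compared by Map key identity, so passing a fresh object literal on every call would defeat the memoization; primitive keys like an id string dedupe reliably.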

When Waterfalls Are Actually Fine

Sometimes data depends on data—serial fetching is unavoidable:

// Sequential by design
import { Suspense } from 'react';

export default async function ArtistPage({ params }: { params: Promise<{ id: string }> }) {
  const { id } = await params; // params is async in recent Next.js
  const artist = await getArtist(id); // Need the ID first
  return (
    <>
      <h1>{artist.name}</h1>
      {/* Playlists need artist.id */}
      <Suspense fallback={<div>Loading...</div>}>
        <Playlists artistId={artist.id} />
      </Suspense>
    </>
  );
}

async function Playlists({ artistId }: { artistId: string }) {
  const playlists = await getArtistPlaylists(artistId);
  return <ul>{playlists.map(p => <li key={p.id}>{p.name}</li>)}</ul>;
}

Here's the pattern: fetch the parent data (artist), show it immediately, then stream in the dependent data (playlists). The <Suspense> boundary means the user sees content right away instead of staring at a loading spinner.

Suspense Boundaries for Streaming

Wrap slow data in <Suspense> to stream it in after the shell renders:

// Fast shell, slow components stream in
import { Suspense } from 'react';

export default function BlogPage() {
  return (
    <div>
      {/* Immediate */}
      <header>
        <h1>Blog</h1>
      </header>

      {/* Streams in ~500ms */}
      <Suspense fallback={<BlogListSkeleton />}>
        <BlogList />
      </Suspense>

      {/* Streams in ~800ms */}
      <Suspense fallback={<CommentsSkeleton />}>
        <LatestComments />
      </Suspense>
    </div>
  );
}

The browser receives the shell instantly. Then <BlogList /> arrives, then <LatestComments />. Each chunk streams in as soon as it's ready—no single slow request blocks the entire page.
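The independence of the boundaries can be modeled with plain promises: the shell flushes first, then each boundary flushes as its data resolves, in whatever order that happens. A hypothetical sketch of the flush order (short delays stand in for the real fetches):

```typescript
const delay = (ms: number) => new Promise<void>(resolve => setTimeout(resolve, ms));

async function renderBlogList() {
  await delay(50); // stands in for the ~500ms posts fetch
  return '<BlogList />';
}

async function renderLatestComments() {
  await delay(80); // stands in for the ~800ms comments fetch
  return '<LatestComments />';
}

async function streamPage(): Promise<string[]> {
  const flushed: string[] = ['<header>']; // the shell goes out immediately
  // Both boundaries start at the same time; each flushes when ready.
  await Promise.all([
    renderBlogList().then(html => flushed.push(html)),
    renderLatestComments().then(html => flushed.push(html)),
  ]);
  return flushed;
}
```

If the comments fetch were faster than the posts fetch, it would simply flush first; neither boundary waits for the other.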

The loading.js Pattern

loading.tsx is a file-based <Suspense> boundary:

// app/blog/loading.tsx
export default function Loading() {
  return <BlogSkeleton />;
}

// app/blog/page.tsx
export default async function BlogPage() {
  const posts = await getPosts(); // Slow: 500ms

  return (
    <ul>
      {posts.map(post => <li key={post.id}>{post.title}</li>)}
    </ul>
  );
}

Next.js automatically wraps page.tsx in a <Suspense> boundary with your loading.tsx fallback. The layout renders immediately, then the page streams in. This is route-level streaming—great for full-page data loads.

Gotcha: loading.js doesn't cover the layout. If your layout accesses cookies(), headers(), or uncached fetches, it blocks navigation. Wrap those accesses in their own <Suspense> boundaries.

Sharing Data with Context and React.cache

Pass data from Server to Client Components by combining React.cache with context:

// lib/user.ts
import { cache } from 'react';

export const getUser = cache(async () => {
  const res = await fetch('https://api.example.com/user');
  return res.json();
});

// components/user-provider.tsx
'use client';

import { createContext } from 'react';

// User is the shape returned by getUser (type defined elsewhere)
export const UserContext = createContext<Promise<User> | null>(null);

export function UserProvider({ children, userPromise }: { children: React.ReactNode; userPromise: Promise<User> }) {
  return <UserContext value={userPromise}>{children}</UserContext>;
}

// app/layout.tsx
import { getUser } from '@/lib/user';
import { UserProvider } from '@/components/user-provider';

export default function RootLayout({ children }: { children: React.ReactNode }) {
  const userPromise = getUser(); // Don't await

  return (
    <html>
      <body>
        <UserProvider userPromise={userPromise}>{children}</UserProvider>
      </body>
    </html>
  );
}

// components/profile.tsx
'use client';

import { use, useContext } from 'react';
import { UserContext } from './user-provider';

export function Profile() {
  const userPromise = useContext(UserContext);
  if (!userPromise) throw new Error('Missing UserProvider');

  const user = use(userPromise); // Resolve the promise
  return <p>Welcome, {user.name}</p>;
}

The key: pass the promise, not the resolved data. Client components use React.use() to resolve it, wrapped in <Suspense> for the loading state. Multiple components access the same promise, and React.cache ensures the fetch happens once.

Cache Invalidation and Revalidation

Server Components cache at two levels: Next.js fetch caching and React.cache. Understand the difference:

// Next.js fetch caching (HTTP cache)
const data = await fetch('https://api.example.com/posts', {
  next: { revalidate: 3600 }, // Revalidate every hour
});

// React.cache (per-request memoization)
export const getPosts = cache(async () => {
  return db.posts.findMany();
});

// Revalidate with Server Actions or tags
'use server';

import { revalidatePath, revalidateTag } from 'next/cache';

export async function createPost(formData: FormData) {
  await db.posts.create({ data: { title: formData.get('title') } });

  // Invalidate Next.js fetch cache
  revalidateTag('posts');
  revalidatePath('/blog');

  // React.cache is automatically cleared per request
  // No manual invalidation needed
}

The next: { revalidate } option controls the framework's fetch cache: requests with the same URL and options return the cached response until time-based revalidation or manual invalidation. React.cache is per-request memoization, cleared automatically once each request completes.
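One detail the createPost action relies on: revalidateTag('posts') only purges fetches that were tagged 'posts' in the first place. A minimal sketch of the tagged fetch (the URL is a placeholder):

```typescript
// The tag connects this cached request to revalidateTag('posts').
export async function getPosts() {
  const res = await fetch('https://api.example.com/posts', {
    next: { tags: ['posts'], revalidate: 3600 },
  });
  return res.json();
}
```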

Error Handling with Error Boundaries

Server Component errors propagate to the nearest error boundary, not to <Suspense> boundaries. In the App Router, error.tsx is a file-based error boundary you can combine with streaming for graceful degradation:

// app/blog/error.tsx
'use client';

export default function Error({ error, reset }: { error: Error; reset: () => void }) {
  return (
    <div>
      <h2>Failed to load blog</h2>
      <p>{error.message}</p>
      {/* reset() re-renders the segment and retries the request */}
      <button onClick={() => reset()}>Retry</button>
    </div>
  );
}
    </div>
  );
}

// app/blog/page.tsx
export default async function BlogPage() {
  const posts = await getPosts(); // Might throw

  return <BlogList posts={posts} />;
}

Place error.tsx next to page.tsx. If getPosts() throws, the error boundary catches it and renders your fallback UI instead of crashing the page.

Performance Checklist

// Server Component data fetching audit:

[ ] Are independent requests parallel with Promise.all?
[ ] Are dependent requests wrapped in Suspense boundaries?
[ ] Is duplicate fetching eliminated with React.cache?
[ ] Is fetch caching configured (revalidate, tags)?
[ ] Does slow data stream in after the shell?
[ ] Are errors handled with error boundaries?
[ ] Is layout data fetching optimized (no blocking uncached fetches)?

The Golden Rule

Server Components eliminate client-side data fetching, but you must still think about request patterns. Parallel independent requests, stream dependent requests, and memoize duplicates. Do that, and Server Components are genuinely faster than the old approach.
