
Offline Support in Web Apps: Data Prefetching Strategies


Deciding how to approach offline support can be challenging, especially once you move beyond the basics. So far, I've focused on persisting data that was already fetched as a result of user actions. This works well for many cases. But in my experience, it quickly breaks down when users expect the app to “just work” offline, without thinking ahead.

This is the ninth post in my series about offline support in web applications. Today, I want to cover prefetching strategies: how to proactively prepare data for offline use, and more importantly, how to do it without hurting the user experience.

Why Prefetching Matters

If you only persist data that the user explicitly fetched, your offline support is technically correct — but practically limited.

The core issue is: users don’t know what will be available offline.

Imagine a todo app where a user opens a board while online, so the app fetches and stores its data. Later, they go offline and try to open an item from that board. From their perspective, this should just work. But since that specific item was never fetched before, it isn't available: the board is there, but the item isn't, and the app feels inconsistent.

Users don’t typically plan their navigation around connectivity, so in our example the app becomes unpredictable when offline. This leads to an important UX problem: a lack of trust. The user doesn’t know what will work. This inconsistency is probably worse than a full failure. At least with a full failure, the user understands the limitation.

Prefetching is a way to fix that. Instead of relying purely on user-driven fetching, we proactively prepare data ahead of time. Offline support becomes much more useful when the system, not the user, does the planning.

Step 1 — Decide When Prefetching Happens

First, we need to choose how we want to handle data prefetching for users.

Automatic Prefetching

The app automatically gets data in the background without the user doing anything. At first, this sounds ideal — and in many cases, it is. But there’s a catch. If you’re not careful, prefetching competes for resources in a critical moment: the first interaction.

Imagine this:

  1. The user opens the app.

  2. The UI renders.

  3. The user starts interacting with the application.

  4. At the same time, 20 background requests start firing.

The user's network requests are effectively competing for resources with the prefetch requests. Even if everything is working, the app will feel sluggish and interactions become less responsive. This doesn't feel like a great first experience.

Automatic prefetching optimises for readiness, but often at the cost of responsiveness. In my experience, naive automatic prefetching often does more harm than good.

User-controlled Prefetching

The alternative is to give control to the user. This could mean a “Prepare for offline” button, a settings toggle or a specific flow (e.g. downloading a project). This avoids the performance problem entirely. But it introduces another one: discoverability.

Users need to:

  1. Know the feature exists.

  2. Understand when to use it.

  3. Remember to trigger it.

User-controlled prefetching optimises for performance, but often at the cost of adoption. In practice, this often means the feature is underused — or not used at all.

Hybrid Approach

In most cases, I find that a hybrid approach works best: some level of automatic prefetching combined with user-triggered actions for larger datasets.

This gives you a decent default behavior, with an escape hatch for power users, leading to more predictable performance characteristics. The key is to keep automatic prefetching conservative.
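As a rough illustration of what "conservative" could mean in practice, here's a hypothetical sketch that splits prefetch candidates by estimated payload size: small items are prefetched automatically, everything else waits for a user-triggered download. The `PrefetchCandidate` shape and the 50 KB threshold are assumptions, not from any specific library.

```typescript
// Hypothetical sketch: decide per item whether it is prefetched
// automatically or left to a user-triggered "download" action.
// The shape and the 50 KB threshold are illustrative assumptions.
interface PrefetchCandidate {
  id: string;
  estimatedBytes: number;
}

// Keep automatic prefetching conservative: only small payloads.
const AUTO_PREFETCH_LIMIT_BYTES = 50_000;

export const splitCandidates = (candidates: PrefetchCandidate[]) => {
  const automatic: PrefetchCandidate[] = [];
  const userTriggered: PrefetchCandidate[] = [];

  for (const candidate of candidates) {
    if (candidate.estimatedBytes <= AUTO_PREFETCH_LIMIT_BYTES) {
      automatic.push(candidate);
    } else {
      userTriggered.push(candidate);
    }
  }

  return { automatic, userTriggered };
};
```

In a real app, the estimate could come from the server or from past response sizes; the important part is that the automatic bucket stays small.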

Step 2 — Introduce Prefetching in Your Data Layer

Once you’ve decided when to prefetch, the next step is making it possible in your data layer.

Here's an example using React Query, which makes this quite straightforward. Typically, this is how you'd fetch a single todo.

import { useQuery } from '@tanstack/react-query';

const fetchTodo = async (id: string) => {
  return request.get(`/todos/${id}`);
};

export const useTodo = (id: string) => {
  return useQuery({
    queryKey: queryKeys.todo(id),
    queryFn: () => fetchTodo(id),
  });
};

Here's what will allow us to prefetch the same data.

import {
  QueryClient,
  queryOptions,
  useQuery,
} from '@tanstack/react-query';

const fetchTodo = async (id: string) => {
  return request.get(`/todos/${id}`);
};

const todoQueryOptions = (id: string) => {
  return queryOptions({
    queryKey: queryKeys.todo(id),
    queryFn: () => fetchTodo(id),
  });
};

export const prefetchTodo = (
  queryClient: QueryClient,
  id: string,
) => queryClient.prefetchQuery(todoQueryOptions(id));

export const useTodo = (id: string) => {
  return useQuery(todoQueryOptions(id));
};

You now have a reusable query definition, a function to fetch data without rendering a component, and a clear separation between reading and preparing data.
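With that in place, one hypothetical way to use it: once a board has loaded, walk its items and prefetch each one. `Board`, `todoIds`, and `prefetchItem` are assumed names standing in for your real board type and the `prefetchTodo` helper above.

```typescript
// Hypothetical sketch: prefetch every item on a board once the board
// itself has loaded. The Board shape is an illustrative assumption.
interface Board {
  id: string;
  todoIds: string[];
}

type PrefetchFn = (id: string) => Promise<void>;

export const prefetchBoardItems = async (
  board: Board,
  prefetchItem: PrefetchFn,
): Promise<void> => {
  // Prefetch sequentially to stay conservative; a concurrency
  // limit (covered in the next section) would also work here.
  for (const id of board.todoIds) {
    await prefetchItem(id);
  }
};
```

You'd call this from wherever the board data becomes available, passing `(id) => prefetchTodo(queryClient, id)` as the prefetch function.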

Step 3 — Control the Impact of Prefetching

This is where the most nuanced work happens. Prefetching is relatively easy to add but hard to do well. If you take one thing from this post, it’s this:

Prefetching should never compete with the user.

Here are a few practical rules I try to follow.

Limit Concurrency

Firing dozens of requests at once is rarely a good idea. Instead, I recommend introducing a simple concurrency limit. Even a small cap (one or two concurrent requests, possibly with a delay in between) can significantly improve perceived performance.

Here’s a minimal example:

const MAX_CONCURRENT_PREFETCH = 3;

let active = 0;
const queue: (() => Promise<unknown>)[] = [];

const runNext = () => {
  if (active >= MAX_CONCURRENT_PREFETCH || queue.length === 0)
    return;

  const task = queue.shift();
  if (!task) return;

  active++;
  
  task().finally(() => {
    active--;
    runNext();
  });
};

export const schedulePrefetch = (task: () => Promise<unknown>) => {
  queue.push(task);
  runNext();
};

You can then wrap your prefetch calls:

schedulePrefetch(() =>
  queryClient.prefetchQuery(todoQueryOptions(id))
);

This is intentionally simple, but it already gives you much better control.

Prefetch When the App Is Idle

One of the simplest and most effective strategies is to only prefetch when the app isn’t busy. The browser gives you a useful primitive for this: requestIdleCallback.

const scheduleIdlePrefetch = (task: () => void) => {
  if ('requestIdleCallback' in window) {
    requestIdleCallback(() => task());
  } else {
    // Fallback
    setTimeout(() => task(), 1000);
  }
};

Usage:

scheduleIdlePrefetch(() =>
  schedulePrefetch(() =>
    queryClient.prefetchQuery(todoQueryOptions(id))
  )
);

This ensures that prefetching happens when the main thread is less busy, which helps preserve responsiveness.

Respect Network Conditions

Not all connections are equal, and prefetching blindly can be wasteful — or even harmful. You can use the navigator.connection API to make basic decisions:

const shouldPrefetch = () => {
  const connection =
    (navigator as any).connection ||
    (navigator as any).mozConnection ||
    (navigator as any).webkitConnection;

  if (!connection) return true;

  const slowTypes = ['slow-2g', '2g'];
  if (slowTypes.includes(connection.effectiveType)) {
    return false;
  }

  if (connection.saveData) {
    return false;
  }

  return true;
};

Usage:

if (shouldPrefetch()) {
  scheduleIdlePrefetch(() =>
    schedulePrefetch(() =>
      queryClient.prefetchQuery(todoQueryOptions(id))
    )
  );
}

This doesn’t need to be perfect — even basic checks can make a noticeable difference.

Be Selective About What You Prefetch

Not all data is worth prefetching. First, focus on high-probability user paths, small to medium payloads and data required for core flows. You can skip rarely accessed data and large datasets.

A good rule of thumb:

Prefetch what the user is likely to need next — not everything they might need.
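One way to apply that rule of thumb is to score candidates by how likely the user is to open them and cap the payload size. This is a deliberately simplified sketch; the `Candidate` shape, the 0.2 probability floor, and the 200 KB cap are all illustrative assumptions.

```typescript
// Hypothetical sketch: keep only the most likely, reasonably sized
// prefetch candidates. Thresholds are illustrative assumptions.
interface Candidate {
  key: string;
  accessProbability: number; // 0..1, e.g. derived from recent usage
  estimatedBytes: number;
}

export const selectPrefetchTargets = (
  candidates: Candidate[],
  maxTargets: number,
): Candidate[] =>
  candidates
    .filter(
      (c) => c.accessProbability >= 0.2 && c.estimatedBytes <= 200_000,
    )
    .sort((a, b) => b.accessProbability - a.accessProbability)
    .slice(0, maxTargets);
```

Where the probability comes from is up to you: recently opened items, pinned boards, or simply "everything on the current screen" are all reasonable starting points.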

Prioritise User-Initiated Requests

This is the last point I'll bring up. At the beginning of this section, I mentioned that user actions should always win. That means not saturating the network, avoiding blocking important requests, and being ready to pause or deprioritise prefetching.

In practice, this is harder than it sounds. Doing this properly usually requires a centralised request handling mechanism, awareness of in-flight requests and the ability to cancel or deprioritise prefetch requests.

I won’t go deep into this here, but it’s worth calling out — this is where simple setups often hit their limits. If you reach this point, it’s usually a sign that prefetching should be treated as a first-class concern in your data layer — not just an add-on.
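To give a flavour of what "pausing prefetching" could look like, here's a deliberately small sketch of a gate that holds background tasks back while user-initiated requests are in flight. Every name here is an assumption for illustration, not a real library API, and a production version would need cancellation and timeouts on top.

```typescript
// Hypothetical sketch: hold prefetch tasks back while user-initiated
// requests are in flight. Names are assumptions for illustration.
let inFlightUserRequests = 0;
const waitingPrefetches: (() => void)[] = [];

// Wrap every user-initiated request with this so the gate knows
// when the network is busy with work the user is waiting on.
export const trackUserRequest = async <T>(request: Promise<T>): Promise<T> => {
  inFlightUserRequests++;
  try {
    return await request;
  } finally {
    inFlightUserRequests--;
    if (inFlightUserRequests === 0) {
      // Release any prefetch tasks waiting for a quiet network.
      waitingPrefetches.splice(0).forEach((resume) => resume());
    }
  }
};

// Prefetch tasks await this before starting their network request.
export const waitForQuietNetwork = (): Promise<void> =>
  inFlightUserRequests === 0
    ? Promise.resolve()
    : new Promise((resolve) => {
        waitingPrefetches.push(resolve);
      });
```

The point of the sketch is the shape of the solution: some central place knows about in-flight user requests, and prefetching defers to it.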

Takeaways

Prefetching is one of those things that sounds simple but has a lot of nuance.

Done well, it makes your offline experience feel seamless and reliable. Done poorly, it slows down your app for everyone and wastes resources.

To summarise:

  • Persisting fetched data is not enough for great offline UX

  • Prefetching improves predictability and trust

  • A hybrid approach usually works best

  • Control when and how prefetching happens — carefully

In the next part, I’ll focus on a more concrete (and often overlooked) piece of the puzzle — how to reliably serve images when there’s no network at all.

If you enjoyed the article or have a question, feel free to reach out on Bluesky! 👋
