19 August 2024

Reducing Vercel usage: mastering Data Cache and Function execution

You're staring at your Vercel invoice in disbelief, aren't you? Rest assured, you're not alone in the face of soaring metrics such as “Data Cache”, “Function Duration” and “ISR”.

We've put together a Next.js optimization guide to help you reduce your Vercel usage without sacrificing performance. We'll let you in on all our secrets for optimizing your costs.

Disabling page preloading

By default, Next.js prefetches any page whose link is visible in the viewport of the page a visitor is viewing. When the Full Route Cache of those pages is disabled, this can trigger a massive number of lambda function calls to regenerate them via ISR.

To overcome this, add prefetch={false} on your next/link <Link> components in strategic places on your site. You could, for example, keep prefetching on your site's main pages, such as the links in your navigation menu, by ensuring that their cache is properly configured.
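For example, here is a minimal sketch of what this could look like (the Footer component and its routes are hypothetical, purely for illustration):

import Link from 'next/link';

export const Footer = () => (
  <footer>
    {/* Rarely visited page: disabling prefetch avoids triggering
        an ISR lambda for every visitor who merely sees the link. */}
    <Link href="/legal-notice" prefetch={false}>
      Legal notice
    </Link>

    {/* Main page whose Full Route Cache is properly configured:
        keeping the default prefetch behavior is fine here. */}
    <Link href="/">Home</Link>
  </footer>
);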

This simple adjustment can significantly reduce the consumption of many Vercel metrics at the same time.

Data Cache Optimization

The Data Cache at Vercel and Next.js: what a fascinating subject! A real optimization treasure trove, often misunderstood and misused, sometimes even hated.

The Data Cache temporarily stores the results of the requests used to generate the pages of a Next.js application, avoiding repeated calls, on every visit, to data providers that are generally slow. But an over-indulgent cache can quickly become a burden on your Vercel budget. That's where our expertise comes in.

Strategic deactivation of the fetch cache

Sometimes we need to fetch large, unoptimized payloads from APIs when only a small part of that data is needed to generate our Next.js pages.

In such cases, we recommend disabling the fetch cache via the cache: 'no-store' option. Fetches made from the App Router are cached by default in Next.js 13 and 14, but this is no longer the case since Next.js 15. This may seem counter-intuitive, but trust us. Then manually select the essential data to cache using the unstable_cache function.

import { unstable_cache } from 'next/cache';

const fetchHeavyData = async () => {
  const result = await fetch('https://example.com/api/heavy-data', {
    cache: 'no-store',
  });

  // The body variable contains a huge amount of data.
  const body = await result.json();

  return {
    items: body.items.map((item) => {
      // We manually select the properties we need to
      // reduce the size of the response.
      return {
        id: item.id,
        title: item.title,
        description: item.description,
      };
    }),
  };
};

export const fetchData = unstable_cache(fetchHeavyData, ['fetchHeavyData'], {
  // Here goes your caching strategy
  revalidate: 3600,
});

This method, although laborious, will enable you to considerably reduce your Data Cache consumption, in both Read and Write mode. Your wallet will thank you, and your users will be delighted. A win-win situation.

The disadvantages of the unstable_cache function

However, let's not get ahead of ourselves. The unstable_cache function, while promising, is not without its shortcomings. Its very name should arouse your suspicion: “unstable”, really? That's hardly reassuring for a function that's supposed to manage our precious data.

That's why we strongly recommend the use of the nextjs-better-unstable-cache library. This alternative, a veritable panacea for discerning developers, makes up for the shortcomings of its predecessor. In fact, its author Alfonsus Ardani has written an article on the subject, which we invite you to consult to deepen your understanding.

Optimizing your Data Cache is a balancing act. You have to juggle performance and economy, speed and frugality. But with the right tools and a little practice, you'll soon become a true caching virtuoso with Next.js on Vercel.

The power of “force-static”: when and how to use it properly

The introduction of the App Router with Next.js 13 seems to have turned the whole concept of caching on its head. But at its core, it follows the same principles as the previous Pages Router system and its well-known getStaticProps and getServerSideProps functions, concepts that exist in many other web frameworks.

Make sure you use the export const dynamic = 'force-static' parameter on your App Router pages as early as possible, to get the maximum benefit from the Full Route Cache for your web pages. To do this properly, be sure to configure your cache revalidation rules as well, otherwise this setting could prove completely ineffective.
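As a reminder, such a page-level configuration could look like the following sketch (the file path and the one-hour delay are examples to adapt to your own pages):

// app/about/page.js

// Serve this page from the Full Route Cache...
export const dynamic = 'force-static';

// ...and revalidate it at most once an hour, otherwise the
// cached HTML would never be refreshed until the next deploy.
export const revalidate = 3600;

const AboutPage = async () => {
  // ...fetch and render your content here
  return <h1>About us</h1>;
};

export default AboutPage;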

Revalidation time: finding the right balance

Fine-tune the revalidation time for each of your API calls using the revalidate parameter. Determine the right value to optimize your Vercel infrastructure consumption and your visitors' experience.
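For a single API call, this is done via fetch's next.revalidate option, as in this sketch (the URL and the one-minute delay are purely illustrative):

// Cache this response for 60 seconds, then regenerate it
// in the background on the next request.
const response = await fetch('https://example.com/api/articles', {
  next: {
    revalidate: 60,
  },
});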

However, don't expect miraculous results from increasing the cache revalidation time of your routes and fetches. Our experience has shown that this optimization has little impact on Vercel usage. It's true that extending this delay can reduce the load on your servers. But it can also make your site less responsive to updates.

Revalidation using a webhook: synchronization with your Headless CMS

Now let's talk about a technique that's far better optimized, yet no more complicated to set up: revalidation (in other words, purging your cache) via a webhook.

The idea is simple: as soon as a modification is published in your Headless CMS, for example, a webhook is triggered to purge all the fetches linked to the CMS API. It's efficient, and it keeps obsolete content off your site without you having to pick a revalidation interval yourself.

Here's an example of an endpoint you could set up:

// app/api/revalidate/storyblok/route.js

import { revalidateTag } from 'next/cache';
import { NextResponse } from 'next/server';

export const GET = async (request) => {
  const { searchParams } = new URL(request.url);

  if (searchParams.get('token') !== process.env.REVALIDATE_SECRET) {
    return NextResponse.json({ message: 'Invalid token' }, { status: 401 });
  }

  revalidateTag('storyblok');

  return NextResponse.json({ message: 'Success' });
};

Make sure to include the 'storyblok' tag in all your fetch API calls via the next.tags parameter:

await fetch('https://api.storyblok.com/v2/cdn', {
  next: {
    tags: ['storyblok'],
  },
});

Finally, from the control panel of the external service (Storyblok in this example), set up a webhook pointing to your newly created endpoint, passing the secret token as a URL search parameter.
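Assuming your site is deployed at example.com, the webhook URL registered in the CMS would look something like this:

https://example.com/api/revalidate/storyblok?token=YOUR_REVALIDATE_SECRET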

Function Duration: avoiding the pitfalls

As Vercel is a serverless platform, we need to be vigilant about our use of their infrastructure, because every second counts. The Function Duration metric measures the execution time of our serverless functions, which can sometimes quickly spiral out of control. Also worth bearing in mind: Next.js' ISR (Incremental Static Regeneration) functionality also consumes these precious seconds.

Defeating infinite loops: a lambda's nightmare

Just imagine: your functions get carried away and run endlessly. It's the ultimate disaster scenario. Although we would never knowingly introduce an infinite loop into our own project, one can sometimes be created by an external library, for example.

There's only one thing we recommend to avoid any problems: setting Next.js' maxDuration parameter. By setting a maximum duration, we create protective barriers for our functions. By default, the maximum execution time for Vercel functions is 10 or 15 seconds, depending on your plan. That's probably far more time than your pages need to be generated (or so we hope).
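Here's a sketch of that safeguard as a route segment config (the five-second limit is an arbitrary value to adjust to your own pages):

// app/page.js

// Stop the lambda after 5 seconds instead of letting it run
// for the plan's default 10 to 15 seconds.
export const maxDuration = 5;

const HomePage = async () => {
  // ...your page generation logic
  return <h1>Home</h1>;
};

export default HomePage;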

Avoid panic among lambdas with @storyblok/js

Some SDKs for external services useful for your Next.js project include retry functionality. This is particularly true of the @storyblok/js library.

Our advice? Configure your SDKs with optimal retry strategies for your project.

For example, add this small setting when initializing the Storyblok client via the storyblokInit function: apiOptions.maxRetries = 0. This prevents frantic retry attempts when the rate limit is reached, so runaway consumption is no longer possible. Your site will regain its composure, and you, your serenity.
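Concretely, the initialization could look like this sketch (we assume your access token lives in a STORYBLOK_ACCESS_TOKEN environment variable):

import { storyblokInit, apiPlugin } from '@storyblok/js';

storyblokInit({
  accessToken: process.env.STORYBLOK_ACCESS_TOKEN,
  use: [apiPlugin],
  apiOptions: {
    // Never retry automatically when the Storyblok API
    // rate limit is reached.
    maxRetries: 0,
  },
});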

Our Next.js expertise at your service

We sincerely hope these tips will help you tame your Vercel consumption!

However, keep in mind that every Next.js or React project is unique, with its own challenges and peculiarities. If you're feeling overwhelmed, or simply want to benefit from an expert eye, don't hesitate to call on our Next.js experts for an in-depth audit. We'll be delighted to put our expertise to work on your web project.
