Making my blog faster with Upstash Serverless Redis

How I use Upstash to cache my Notion-based blog data for faster load times.

05 February 2022


I mentioned in my previous post that I was utilising Upstash to cache my blog-related data to reduce latency on my site. My blog posts are structured and edited in a Notion database, which I fetch via the Notion API. It was apparent to me from the early stages of development that the Notion API was just a little too slow for my needs. I wanted to speed things up a bit.
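
For context, the Notion side of the fetch looks roughly like the sketch below. It uses the official @notionhq/client SDK; the NOTION_TOKEN and NOTION_DATABASE_ID environment variables and the function name are placeholders rather than my exact implementation:

import { Client } from "@notionhq/client"

// Placeholder environment variables for the Notion integration token and database id
const notion = new Client({ auth: process.env.NOTION_TOKEN })

export async function queryBlogDatabase() {
  // Query the Notion database that backs the blog
  const { results } = await notion.databases.query({
    database_id: process.env.NOTION_DATABASE_ID!,
  })

  // Each result still needs to be transformed into the shape the blog expects
  return results
}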

Upstash

Upstash provides serverless Redis capabilities that can be utilised by edge functions, as stated here:

Edge functions: Edge computing (Cloudflare workers, Fastly Compute) is becoming a popular way of building globally fast applications. But there are limited data solutions accessible from edge functions. Upstash Global Database is accessible from Edge functions with the REST API. Low latency from all edge locations makes it a perfect solution for Edge functions

Since I am using Remix deployed as an edge function on Vercel, Upstash seemed like the perfect fit.

Remix

As much as I want to dive into the Remix side of things, I’d rather focus on the core pieces that utilise the Upstash capabilities. This is a simple flow diagram illustrating how the blog data is fetched:

Upstash provides a JavaScript SDK called upstash-redis, which I am planning on integrating in the future. In the meantime, I just rolled with their REST API, which does the trick.

For reusability, you’d typically want to wrap the REST API calls in utility functions. When another use case for Redis comes along, you can simply call these functions with different arguments. In Remix, we’d plop these utilities in a file called upstash.server.ts. The .server extension explicitly tells Remix that nothing imported or exported from this file is needed on the client, so it will be removed from the client bundle. This is how that file would look:

const upstashUrl = process.env.UPSTASH_URL

const headers = {
  Authorization: `Bearer ${process.env.UPSTASH_TOKEN}`,
  Accept: "application/json",
  "Content-Type": "application/json",
}

// Read a cached value; the Upstash REST API wraps responses in { result }
export async function get(key: string) {
  const response = await fetch(`${upstashUrl}/get/${key}`, { headers })
  try {
    const { result } = await response.json()
    return JSON.parse(result).data
  } catch (error) {
    return null
  }
}

// Cache a value under the given key with an expiry (EX, in seconds)
export async function set(key: string, data: any, expiry = 100000) {
  const response = await fetch(`${upstashUrl}/set/${key}?EX=${expiry}`, {
    method: "post",
    body: JSON.stringify({ data }),
    headers,
  })
  try {
    const result = await response.json()
    return result
  } catch (error) {
    return null
  }
}

// Delete a cached value
export async function deleteKey(key: string) {
  const response = await fetch(`${upstashUrl}/del/${key}`, {
    method: "post",
    headers,
  })
  try {
    const result = await response.json()
    return result
  } catch (error) {
    return null
  }
}

Just a couple of things to note here: the Upstash URL and token come from environment variables, and any network or parsing error simply resolves to null, so a failed read is treated the same as a cache miss.

To utilise these functions for blogs, I create another set of light wrappers (specific to blogs) around the utilities in blogs.server.ts:

import { deleteKey, get, set } from "~/utils/upstash.server"

export async function getAllBlogs() {
  return get("blogs:all")
}

export function setAllBlogs(data: any) {
  return set("blogs:all", data)
}

export function deleteAllBlogs() {
  return deleteKey("blogs:all")
}

export function getBlog(slug: string) {
  return get(`blog:${slug}`)
}

export function setBlog(slug: string, data: any) {
  return set(`blog:${slug}`, data)
}

export function deleteBlog(slug: string) {
  return deleteKey(`blog:${slug}`)
}

Then, on a single blog route (app/routes/blog/$slug.tsx), I perform the cache lookup in the loader as follows:

export const loader: LoaderFunction = async ({ request, params }) => {
  const { slug } = params

  if (!slug) throw new Response("Slug not provided", { status: 404 })

  let post = await getBlog(slug)

  if (!post) {
    // Cache miss: fetch the Notion data and transform it into a post here
    // post = ...

    if (!post) throw new Response("Post not found", { status: 404 })

    // Cache the freshly fetched post so subsequent requests are served from Redis
    await setBlog(slug, post)
  }

  return json({ post }, { status: 200 })
}

This way my loader will always check for the blog post in Redis first. Since I do not expect the posts to change very often, I set the expiry of the cached entries to 24 hours.
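
The 24-hour TTL simply rides on the expiry argument of the set utility, so the setBlog wrapper above would pass it through along these lines (a sketch, with the value spelled out in seconds):

// 24 hours, in seconds, passed through to the EX option on the Upstash SET call
const ONE_DAY = 60 * 60 * 24

export function setBlog(slug: string, data: any) {
  return set(`blog:${slug}`, data, ONE_DAY)
}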

Cache Invalidation

In the cases where I do edit my blog posts, whether it be edited content, new posts, or archived posts, I want that to reflect on my site immediately. Otherwise the stale cached data will still be returned on the site, so I needed a way to bust this cache on demand.

Ideally, I would like Notion to be my source of truth, where changes to specific fields, or to the content itself, trigger an event that can invalidate all or some of the blog posts. I don’t have this functionality just yet, so in the meantime I have set up resource routes in Remix that allow me to invalidate specific cache entries, as sketched below. Now, if I want a new blog post to appear, or changes to reflect, I simply hit these endpoints, navigate to the post, and the cache entry is rebuilt for that article. It’s not perfect, but it works for now.
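
A resource route in Remix is just a route module with a loader and no default export. Here is a minimal sketch of what such a route could look like, assuming the blog wrappers live in ~/utils/blogs.server and using an illustrative route path (this is not my exact file, and it omits any auth check):

// app/routes/api/invalidate.$slug.tsx (illustrative path)
import type { LoaderFunction } from "remix"
import { json } from "remix"
import { deleteAllBlogs, deleteBlog } from "~/utils/blogs.server"

export const loader: LoaderFunction = async ({ params }) => {
  const { slug } = params

  if (!slug) throw new Response("Slug not provided", { status: 400 })

  // "all" busts the blog listing cache; anything else busts a single post
  if (slug === "all") {
    await deleteAllBlogs()
  } else {
    await deleteBlog(slug)
  }

  return json({ invalidated: slug })
}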

In Closing

Adding Upstash Redis has noticeably reduced the latency of my blog posts (I will start collecting metrics to quantify this), and I highly recommend it as an easy-to-use solution for any data that needs to be delivered to the user faster.

Edit

I forgot to mention that we also have access to the Cache-Control header, which instructs browsers and/or CDNs to cache responses to client requests. In Remix, we’d just add this to the headers option passed to the json function call returned from the loader:

const headers = {
  "Cache-Control": "private, max-age=3600",
}
...
return json({ post }, { status: 200, headers })

This instructs the browser (private) to cache the response for an hour (max-age=3600). While this further boosts performance by decreasing load time, we do not have the same control over this cache as we do with Redis.

P.S. Remix is awesome 💿