Caching Strategy for RESTFUL API

by Solomon Eseme


Updated Thu Sep 05 2024


The caching strategy behind a RESTful API is a significant factor in the performance of any web page. It can degrade the user’s experience and hurt the business if not considered and optimized correctly.

Research has shown that website load time and performance can heavily impact factors such as SEO, engagement, and conversion rates.

According to Amazon: “1 second of load lag time would cost Amazon $1.6 billion in sales per year”.

Google also noted that “A load time of 400ms results in a decrease of 0.44% traffic – In real terms this amounts to 440 million abandoned sessions/month and a massive loss in advertising revenue for Google.”

Google said that “An extra 0.5 seconds in each search page generation would cause traffic to drop by 20%.”

Now, that is massive.

Walmart also admitted that “When load times jump from 1 second to 4 seconds, conversions decline sharply. For every 1 second of improvement, we experience a 2% conversion increase.”

The data comes from this article on Medium written by Viki Green.

A wide range of factors can cause slow web performance, such as page layout, interface design, calls to action, and page content.

But here we are most concerned with how fast our backend AdonisJS RESTful API responds to the requests sent by our users, and with the best caching strategies to improve it.

Prerequisites

Before continuing with this article, you should have previous experience with the following:

  1. Basic understanding of TypeScript and Node.js

  2. Basic understanding of Adonis.js

  3. Building APIs with AdonisJS

Goals of the Tutorial

I will show you best practices for handling caching in AdonisJS and discuss the best strategies for scalable caching in your RESTful API.

  1. We will build a real-world app to demonstrate how to boost performance with a good Caching Strategy.

  2. You will learn how to build a high-performance API with Adonisjs.

  3. You will learn how to implement a Caching System in an AdonisJS RESTful API.

  4. We will show you how to choose the right Caching Strategy depending on the project.

  5. You will learn about the concept of Caching and how to scale a large App with it.

  6. Best practices in building performance-oriented Web apps with AdonisJS.

Before we dive in, if you’re a backend developer or looking at delving into this career path, join other developers to receive daily articles on backend development that will boost your productivity.

Create a simple AdonisJS REST API

In this article, we will continue with and improve the performance of the Ticketing System API developed in Build a Ticketing App with Adonis.js and Vue.js with Section.io.

You can read through that article to get up to date with the project we will be working on, and take a peek at what we built previously, before we improve the API’s response time with caching.

Install AdonisJS

If you’re just getting started with AdonisJS 5, I recommend reading AdonisJS 5 tutorial: The Ultimate Guide and building a restful API with Adonis 5 to get up to speed.

Or, you can run this command to have AdonisJS 5 installed.

npm init adonis-ts-app@latest ticketing-api

But we will continue with the Ticketing API we have already developed; you can clone the repository to get started.

Setting up Adonis Cache

After installing and setting up your AdonisJS 5 RESTful API project, it’s time to install and configure a caching package that will simplify the process of implementing our caching strategy.

Run the following command to install the Adonis Cache package.

npm i @kaperskyguru/adonis-cache

Next, run the below command to configure and set up the package.

node ace invoke @kaperskyguru/adonis-cache
//Or
node ace configure @kaperskyguru/adonis-cache

Next, open the .env file and add the following code.

CACHE_DRIVER=file

The default driver is set to file, but you can read the documentation to understand the different drivers and how to configure each of them.

Implementing Caching Strategy for Restful API

There are many different caching strategies available. Depending on your application use case and data structure, you can develop different Custom Caching Strategies to fit perfectly with your project.

In this article, we will explore a few of the top available strategies and understand how and where to use each of them.

  1. Cache Aside (Lazy Loading)

  2. Read Through

  3. Write Through

  4. Write Back (Write Behind)

  5. Write Around

Cache Aside (Lazy Loading)

This is a very popular caching approach. In this strategy, the database sits aside, and the application requests data from the Cache Server first.

If the data is found, the request is returned to the Client with the data.

Otherwise, the request is forwarded to the Database Server for the data, and in return, the data is stored in the Cache Server for subsequent requests.

In this strategy, we need to note two important concepts: the Cache Hit and the Cache Miss.

When the data is found in the Cache Server, it is called a Cache Hit; when it is not found in the Cache Server and the request is forwarded to the Database Server, it is called a Cache Miss.

We can create an event and attach a listener to be triggered when either of these happens.

Here is an implementation:


public async lazyLoadingStrategy(key: string, minutes: number, callback: Function): Promise<any> {
  if (await Cache.has(key)) {
      const data = await Cache.get(key)
      Event.hit(key, data, minutes);
      return data
  } else {
      Event.miss(key, [], minutes)
      // Database Server is called outside the Cache Server.
      const data = await callback()
      await Cache.set(key, data, minutes)
      return data
  }
}

Looking at the implementation, you will notice we are trying to get the data from our Cache Server, and if found, we trigger the Event.hit().

Next, if the key is not found, we search for the data from our Database Server using the callback() function and trigger the Event.miss().

And lastly, we store the data on our Cache Server for subsequent requests.
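To make the cache-aside flow concrete outside of AdonisJS, here is a minimal, self-contained sketch of the same idea using an in-memory Map in place of the Cache Server and a loader function in place of the Database call (all names here are illustrative, not part of the Adonis Cache package):

```typescript
// Minimal in-memory cache-aside sketch; `store` stands in for the Cache Server
// and the `loadFromDb` callback for the Database Server call.
type Loader<T> = () => T;

const store = new Map<string, unknown>();
let hits = 0;
let misses = 0;

function cacheAside<T>(key: string, loadFromDb: Loader<T>): T {
  if (store.has(key)) {
    hits++; // Cache Hit: data served straight from the cache
    return store.get(key) as T;
  }
  misses++; // Cache Miss: fall through to the "database"
  const data = loadFromDb();
  store.set(key, data); // populate the cache for subsequent requests
  return data;
}

// First call misses and loads; second call hits the cache,
// so the second loader (with amount 999) is never executed.
const first = cacheAside("ticket_1", () => ({ id: 1, amount: 50 }));
const second = cacheAside("ticket_1", () => ({ id: 1, amount: 999 }));
console.log(first, second, { hits, misses });
```

Note that the second call returns the originally cached value, which also illustrates the main risk of cache aside: stale data until the entry expires or is invalidated.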

Read Through

This is the direct opposite of the Cache Aside Strategy: here, the Cache Server sits between the client request and the Database Server.

When a request comes in, it goes directly to the Cache Server. If there is a miss, the Cache Server is responsible for retrieving the data from the Database Server, updating itself for subsequent requests, and returning the data to the Client.

Here is an implementation:


public async readThrough(key: string, minutes: number): Promise<any> {
      const data = await Cache.get(key, minutes)
      return data
}

private async get(key: string, minutes: number) {
    const data = await Cache.find(key)
    if(data){
      Event.hit(key, data, minutes);
      return data
    }

    Event.miss(key, [], minutes)
    // Database Server is called from the Cache Server.
    const DBdata = await Database.find(key)
    await Cache.set(key, DBdata, minutes)
    return DBdata
}

At a glance, we can see that the Read Through Strategy is very similar to the Lazy Loading Strategy we discussed above. The only difference is that the Cache Server is responsible for calling the Database Server on any Cache Miss.
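That difference can be captured in a self-contained sketch: the cache object itself owns the database loader, so callers never touch the database directly. (This is an illustrative in-memory model, not the Adonis Cache package API.)

```typescript
// Illustrative read-through cache: the cache owns the database loader,
// so a Cache Miss is handled inside the cache, not by the caller.
class ReadThroughCache<T> {
  private store = new Map<string, T>();
  constructor(private loadFromDb: (key: string) => T) {}

  get(key: string): T {
    const cached = this.store.get(key);
    if (cached !== undefined) return cached; // Cache Hit
    // Cache Miss: the cache (not the caller) fetches from the database
    const data = this.loadFromDb(key);
    this.store.set(key, data); // update itself for subsequent requests
    return data;
  }
}

let dbCalls = 0;
const cache = new ReadThroughCache<string>((key) => {
  dbCalls++; // counts how often the "database" is actually hit
  return `row-for-${key}`;
});
console.log(cache.get("ticket_1"), cache.get("ticket_1"), dbCalls);
```

Repeated reads of the same key hit the database only once; the caller's code path is identical for hits and misses.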

Write Through

The Write Through Strategy is very similar to the Read Through Strategy: the Cache Server sits between requests and the Database Server.

Every Write Operation must go through the Cache Server before going to the Database Server.

Here is an implementation:


public async writeThrough(key: string, data: any, minutes: number): Promise<any> {
    const cacheData = await Cache.set(key, data, minutes)
    return cacheData
}

private async set(key: string, data: any, minutes: number){
    await Cache.set(key, data, minutes)
    // Database Server is called from the Cache Server.
    await Database.create(data)
    return data
}

From the implementation, we can see that it’s the Write Operation counterpart of the Read Through Strategy.

We could improve the implementation further by first checking whether the key already exists on the Cache Server and, if so, updating it with the new data.
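A self-contained sketch makes the invariant of write through clear: the cache and the database are updated in the same operation, so they never disagree. (The Map-backed stores below are stand-ins for the Cache Server and Database Server, not real APIs.)

```typescript
// Illustrative write-through: every write goes to the cache, which then
// writes the same data to the "database" before returning.
const cacheStore = new Map<string, number>();
const dbStore = new Map<string, number>();

function writeThrough(key: string, value: number): number {
  cacheStore.set(key, value); // cache is updated first...
  dbStore.set(key, value);    // ...then the database, in the same operation
  return value;
}

writeThrough("ticket_1_amount", 50);
writeThrough("ticket_1_amount", 75); // existing key: both copies are updated
console.log(cacheStore.get("ticket_1_amount"), dbStore.get("ticket_1_amount"));
```

The trade-off is write latency: each write pays for both stores, in exchange for reads that are always fresh.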

Write Back (Write Behind)

This strategy is a more advanced use of the Write Through Strategy, and it is also called Write Behind. It has the same structure, where every write operation goes through the Cache Server before going to the Database Server, but in the case of Write Back, there is a delay before the data is written to the Database Server.

So a client can send in 5 write requests; the Cache Server will store them all, and only flush the accumulated data to the Database Server at, say, a 1-minute (or longer) interval.

Here is an implementation:

private durationToFlush: number = 1 // flush interval in minutes
private tempDataToFlush: Array<{ duration: number; data: any }> = []

public async writeBack(key: string, data: any, minutes: number): Promise<any> {
    const cacheData = await Cache.set(key, data, minutes)
    return cacheData
}

private async set(key: string, data: any, minutes: number) {
    await Cache.set(key, data, minutes)
    this.storeForUpdates(data)
    return data
}

// Stores new data in a temp array for later flushing
private storeForUpdates(data: any) {
    this.tempDataToFlush.push({ duration: this.getFlushDeadline(), data })
}

// Returns the timestamp (in milliseconds) after which an entry is due for flushing
private getFlushDeadline(): number {
    const currentDate = new Date()
    const futureDate = new Date(currentDate.getTime() + this.durationToFlush * 60000)
    return futureDate.getTime()
}

// Called periodically to flush due entries to the Database Server.
public async updateDatabaseServer() {
    const now = new Date().getTime()
    const due = this.tempDataToFlush.filter((obj) => obj.duration <= now)
    for (const obj of due) {
        await Database.create(obj.data)
    }
    this.tempDataToFlush = this.tempDataToFlush.filter((obj) => obj.duration > now)
}

Set up a Cron Job to run every minute to update the Database Server by calling the updateDatabaseServer() method. You can read how to set up Cron Jobs in different ways with Laravel Cron: The Definitive Guide article.
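If you prefer not to use a cron runner, a plain setInterval can drive the flush. The sketch below pulls the "which buffered entries are due?" decision into a pure function so it is easy to test in isolation; all names here are illustrative, not part of any library:

```typescript
// Buffered writes carry a flush deadline, as in the Write Back sketch above.
interface BufferedWrite {
  duration: number; // timestamp (ms) after which the entry should be flushed
  data: unknown;
}

// Pure helper: split the buffer into entries due for flushing and the rest.
function splitDue(buffer: BufferedWrite[], now: number) {
  const due = buffer.filter((entry) => entry.duration <= now);
  const pending = buffer.filter((entry) => entry.duration > now);
  return { due, pending };
}

const now = Date.now();
const buffer: BufferedWrite[] = [
  { duration: now - 1000, data: { id: 1 } },  // deadline already passed
  { duration: now + 60000, data: { id: 2 } }, // due in a minute
];
const { due, pending } = splitDue(buffer, now);
console.log(due.length, pending.length);

// A driver could then flush the due entries on a timer, e.g.:
// setInterval(() => { /* write splitDue(buffer, Date.now()).due to the DB */ }, 60000)
```

Filtering into two new arrays also avoids the classic bug of splicing an array while iterating over it.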

So far, the implementation is a bit long and tedious, and the code design is far from perfect. For a production-ready Write Back Strategy, tempDataToFlush should be redesigned to use a proper queue data structure.

The above code is an untested sketch of the Write Back Strategy. Let me know if you notice any bugs or ways to improve the implementation.

Write Around

This strategy combines the Cache Aside and Read Through Strategies. Here, all Write Operations go directly to the Database Server, and only Read Operations update the Cache Server.

For example, if a user creates a new Post, the Post is stored directly in the Database Server. When the user reads the Post’s content for the first time, the Post is fetched from the Database Server and stored in the Cache Server for subsequent requests before being returned to the user.

Here is an implementation:

public async writeAround(key: string, data: any, minutes: number): Promise<any> {
    const storedData = await Database.create(data)
    await Cache.set(key, data, minutes)
    return storedData
}

public async readOperation(key: string, minutes: number) {
    const cacheData = await Cache.lazyLoadingStrategy(key, minutes, async () => {
        return await Database.find(key)
    })
    return cacheData
}

The Write Around Strategy combines different strategies and can be customized to fit the project and the operations performed on the data.

We have utilized the lazyLoadingStrategy() for the read operations, while Write Operations go straight to the Database Server before updating the Cache Server.
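In the classic Write Around formulation, writes skip the cache entirely and only the read path warms it. Here is a self-contained sketch of that stricter variant, again with Maps standing in for the Cache Server and Database Server (all names illustrative):

```typescript
// Illustrative write-around: writes go straight to the "database"; the cache
// is only populated lazily, on the first read (cache-aside on the read path).
const db = new Map<string, string>();
const cacheMap = new Map<string, string>();

function writeAround(key: string, value: string): void {
  db.set(key, value); // the write skips the cache entirely
}

function readWithCacheAside(key: string): string | undefined {
  if (cacheMap.has(key)) return cacheMap.get(key); // Cache Hit
  const data = db.get(key); // Cache Miss: read from the database
  if (data !== undefined) cacheMap.set(key, data); // warm the cache
  return data;
}

writeAround("post_1", "Hello Caching");
const firstRead = readWithCacheAside("post_1"); // miss, then cached
const cachedNow = cacheMap.has("post_1");
console.log(firstRead, cachedNow);
```

This variant avoids polluting the cache with data that is written but never read, at the cost of one slow first read per key.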

There are many more caching strategies for a RESTful API that we have not covered, and you can even design a custom Caching Strategy that best suits your project’s requirements.

Handling Caching in an AdonisJS 5 RESTful API

After setting up the AdonisJS 5 project by cloning our previous project and installing the caching package, we are ready to dive right into the code.

We will use the Cache Aside Strategy for all the Read Operations, and the Write Around and Write Through Strategies for the Write Operations.

Open your Controller file from our previous project; we already have the TicketsController file in the app/Controllers/Http folder.

First, we will look at the Read Operations:

import { HttpContextContract } from '@ioc:Adonis/Core/HttpContext'
import Ticket from 'App/Models/Ticket'
import Cache from '@ioc:Kaperskyguru/Adonis-Cache'
const Keygen = require('keygen')

export default class TicketsController {

  // Reading all Tickets available
  public async index({}: HttpContextContract) {
    const tickets = await Cache.remember('tickets', 60, async function () {
      // This code is only executed if there was a MISS
      return await Ticket.query().preload('user').preload('event')
    })
    return tickets
  }

  // Reading a single Ticket by ID
  public async show({ params }: HttpContextContract) {
    try {
      const ticket = await Cache.remember('ticket_id_' + params.id, 60, async function () {
        // This code is only executed if there was a MISS
        return await Ticket.find(params.id)
      })

      if (ticket) {
        await ticket.preload('user')
        await ticket.preload('event')
        return ticket
      }
    } catch (error) {
      console.log(error)
    }
  }

The Read operations are largely shown in the index and show methods responsible for retrieving all tickets and a single ticket by ID, respectively.

The Cache.remember function from the Cache Package implements the Cache Aside strategy for us out of the box.

From the code above, we try to read the content from our Cache Server first. If there is a HIT, we return the ticket.

Otherwise, if there is a MISS, we continue to retrieve the Ticket with the closure function.

Here is the implementation of the Cache.remember function to understand how it works under the hood.

public async remember(name: string, minutes: number, callback: Function): Promise<any> {
    if (await this.has(name)) {
        return await this.get(name)
    } else {
        const data = await callback()
        await this.set(name, data, minutes)
        return data
    }
}

Next, we will look at the Write Operations:

public async update({ request, params }: HttpContextContract) {

  // Tries to find a Ticket from Cache, if found returns the Ticket
  const ticket = await Cache.remember('ticket_id_' + params.id, 60, async function () {
    // If not found, Retrieves from Database and Save to Cache.
    return await Ticket.find(params.id)
  })


  if (ticket) {
    ticket.amount = request.input('amount')
    if (await ticket.save()) {

      // If updates successfully in database, then updates the Cache Server.
      await Cache.update('ticket_id_' + params.id, ticket, 60)
    
      await ticket.preload('user')
      await ticket.preload('event')
      return ticket
    }
    return // 422
  }
  return // 401
}


public async store({ auth, request }: HttpContextContract) {
  
  const user = await auth.authenticate()

  //Saves a new Ticket to Database
  const ticket = new Ticket()
  ticket.code = Keygen.hex(5)
  ticket.eventId = request.input('event_id')
  ticket.amount = request.input('amount')
  await user.related('tickets').save(ticket)
  
  // Stores a new Ticket to Cache
  await Cache.set('ticket_id_' + ticket.id, ticket, 60)

  return ticket
}

The Write Operations first go to the Database before updating the Cache Server, either lazily during a later Read Operation or immediately, to speed up subsequent Read Operations.

Testing the Performance

Now that we have implemented our caching strategy in our project, let’s run some tests and see how the performance of our API has improved.

We will compare both the previous API without Caching and with Caching to see the difference in seconds.

The request without a Caching System results in a 643ms response time.


Next, we will look at the same request with a Caching System in place.


At a glance, we can see a significant improvement in the response time of the two results. The request with a Caching System results in a 12ms response time.

Even though both requests are fast and well under a second, these results come from benchmarking against a local Database Server, and the payload contains fewer than three collections; expect larger differences in production.

Further Reading

You can learn more about Caching from the articles below:

  1. Things You Should Know About Database Caching.

  2. Caching strategies from AWS.

  3. How to implement Caching in AdonisJS 5.

Conclusion

Caching should be an integral part of the development process if we need to achieve good performance in any project.

This article has shown exactly how to implement these different Caching Strategies in your AdonisJS RESTful API projects.
