
DenoCacheQL

A lightweight caching solution for GraphQL with Deno


What is DenoCacheQL?

With DenoCacheQL, a developer can quickly and easily cache their GraphQL query results in their Redis server for more efficient queries. The DenoCacheQL playground lets a developer test their GraphQL queries and see each query's response, its response time, and its source (database or cache). We’ve also included a graph of response times for easy latency visualization.

📖 Getting Started 📖

How to set up DenoCacheQL

https://deno.land/x/denocacheql

To set up your server to use DenoCacheQL:

  • Import DenoCacheQL, your resolvers, and your typeDefs.
  • Make sure your Redis server is up and running.
  • Create a new instance of DenoCacheQL.
  • Configure the server to use DenoCacheQL routes.

Add this line to your Deno config file (or tsconfig.json):

 "include": ["dql_playground/**/*"],

Example setup:

//import DenoCacheQL, your resolvers, and your typeDefs
import DenoCacheQL from 'https://deno.land/x/denocacheql'
import { resolvers, typeDefs } from './schema.ts'

//create a new instance, pointing it at your Redis server
const dc = new DenoCacheQL({
  typeDefs,
  resolvers,
  redisInfo: {
    hostname: HOST_NAME,
    port: PORT,
    password: OPTIONAL_PASSWORD,
  },
})

//configure the server (app) to use the DenoCacheQL routes
app.use(dc.routes());
app.use(dc.allowedMethods());

//export the instance to use in your resolver logic
export { dc };
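
The setup above assumes a schema.ts file that exports your typeDefs and resolvers. A minimal hypothetical sketch, assuming typeDefs is an SDL string and using a placeholder Dog type:

//schema.ts (hypothetical example)
export const typeDefs = `
  type Dog {
    id: ID!
    name: String!
  }

  type Query {
    dogs: [Dog]!
  }
`;

export const resolvers = {
  Query: {
    //replace with your own data-fetching logic
    dogs: () => [{ id: '1', name: 'Rex' }],
  },
};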

How To Implement Caching Functionality

Once you’ve imported the module and created your DenoCacheQL instance, you can access the DenoCacheQL functions through the context argument. Because DenoCacheQL is modular, you can choose the specific resolvers in which you want to use caching. One easy way to implement caching is to wrap your resolver logic in a callback passed to the DenoCacheQL cache function, as demonstrated below.

const resolvers = {
  Query: {
    myQuery: async (parent, arg, context, info) => {
      return await context.dc.cache({ parent, arg, context, info }, async () => {
        //put your resolver logic here
        ...
      })
    }
  }
}

You can also destructure dc from context.

const resolvers = {
  Query: {
    myQuery: async (parent, arg, context, info) => {
      const { dc } = context
      return await dc.cache({ parent, arg, context, info }, async () => {
        //put your resolver logic here
        ...
      })
    }
  }
}
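
To make this more concrete, here is a sketch of what a cached resolver might look like with actual logic inside the callback. The dogs query and the db.query call are hypothetical placeholders for your own schema and data layer; the assumption is that the callback only runs on a cache miss and its return value is what gets cached.

const resolvers = {
  Query: {
    dogs: async (parent, arg, context, info) => {
      return await context.dc.cache({ parent, arg, context, info }, async () => {
        //hypothetical database call; runs when the data is not already cached
        const result = await db.query('SELECT * FROM dogs');
        return result.rows;
      })
    }
  }
}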

How to clear the cache

If you would like to mutate your data and clear the cache at the same time, so that stale data doesn’t remain in the cache, DenoCacheQL provides a flush function. Call this function whenever you would like to clear the cache, or run the FLUSHALL command in the Redis CLI.

Mutation: {
  myMutation: async (parent, arg, context, info) => {
    //put your mutation logic here
    ...
    //then clear the cache
    await context.dc.flush()
  }
}
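
As a fuller sketch, here is a hypothetical mutation that writes to the database and then flushes the cache so stale query results aren’t served afterward (addDog and db.query are placeholders for your own schema and data layer):

const resolvers = {
  Mutation: {
    addDog: async (parent, arg, context, info) => {
      //hypothetical write to your database
      await db.query('INSERT INTO dogs (name) VALUES ($1)', [arg.name]);
      //clear the Redis cache so subsequent queries refetch fresh data
      await context.dc.flush();
      return { name: arg.name };
    }
  }
}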

Using the Front-End Playground

Testing Queries and Making Mutations

(Animation of a query being run in the front-end playground)

To open the front-end playground, navigate to the /graphql endpoint on your server.

We made the front-end playground as intuitive as possible by letting developers write queries and mutations with the same syntax they expect from GraphQL. After a query is submitted, the response is displayed to the right of the query.

We built the bottom half of the playground to visualize caching times. Each query response is recorded in the table along with its source (the database or the cache) and its latency. A chart is also rendered and updated with each query to give a full picture of the cache’s efficiency.

🔮 Future Plans 🔮

  • Add client-side caching
  • Add an option for cache expiration
  • Expand the functionality of the playground to include:
    • A button to clear the query field
    • A button to clear the cache
    • Tab support in the query field

Want to Contribute?

We welcome contributions to our open-source project. Feel free to open a PR or reach out to us via LinkedIn. Happy coding!

Engineering Team

  • Jessica Balding
  • Han Li
  • Regina Kwan
  • Michelle Hindt

⚠️ Reporting Issues ⚠️

We are currently in beta and are listening for any feedback and issues you run into. If you experience any difficulty with this module, please open a GitHub issue. Thank you for your patience and ongoing support! 🙏