Supabase Part 4: Monorepo & Interoperable Backend

This is the fourth article in a series on Supabase, focusing on solutions to common problems, tips, and recommendations.

In this post, I will share tips and tricks on making edge functions interoperable with a Node.js backend, and on developer productivity.

Using Monorepo

Monorepos are now the norm for JavaScript projects, especially on the frontend. Using Supabase in a monorepo can be tricky because edge functions run on the Deno runtime, but it’s entirely possible. There are a few key things to keep in mind to ensure a smooth monorepo experience.

Deno and Editor Support

While major IDEs support Deno, the tooling still doesn’t match Node.js due to Deno’s module resolution architecture. Deno downloads and caches imported modules and dependencies for autocomplete and type checking, but this process can sometimes be slow and annoying. It also doesn’t integrate well with a monorepo setup when importing shared “node” libraries.

For example, even though VSCode has strong Deno support and can enable it for a specific path, it struggles with code imported from outside that path. It treats all imports as Deno-related and tries to lint and autocomplete based on Deno features. This creates a conflict, since the same files are also used in Node.js apps. Worse, cross-project refactoring is broken because VSCode fails to resolve linked dependencies: renaming a variable in a shared library, for example, may not rename it in Supabase functions.

The solution to this problem is to disable Deno in the editor and let it treat all files as Node.js scripts. Deno now has near out-of-the-box support for Node built-ins, and its compatibility keeps improving. Most npm packages can be imported through esm.sh or the npm: specifier format. If there is an issue with Deno, it will be caught during function deployment or local testing. If you need stronger guarantees, you can always write unit tests and run them with both Node.js and Deno.
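
With the official Deno extension for VSCode, this comes down to a one-line workspace setting. A minimal sketch (deno.enable already defaults to false; setting it explicitly documents the intent for the whole monorepo):

// .vscode/settings.json in the monorepo root
{
  // keep the Deno language server off so all files go through Node.js tooling
  "deno.enable": false
}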

💡
const isDeno = typeof Deno !== "undefined" can be used to detect whether the code is running on the Deno runtime.
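
Since the editor now type-checks everything as Node.js code, the Deno global is unknown to it. A minimal sketch that keeps the detection compiling cleanly on both sides (the declaration is type-only and erased at runtime; runtimeName is just an illustration):

// runtime.ts, a shared helper usable from both runtimes
declare const Deno: unknown; // keeps Node's type checker quiet

export const isDeno = typeof Deno !== "undefined";
export const runtimeName = isDeno ? "deno" : "node";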

Of course, using Deno-style URL imports directly in source files isn't possible in this setup, and it is not good practice anyway. Module imports should be abstracted through an import map, with package aliases matching their npm counterparts. This enables consistent imports like import { ... } from "@package/subpackage" across runtimes, keeping your code unified regardless of whether it runs on Deno or Node.js.

// import_map.json located in the supabase/functions folder
// different ways of importing packages:
{
  "imports": {
    "cohere-ai": "npm:cohere-ai",
    "@google/generative-ai": "https://esm.sh/@google/generative-ai@0.21.0",
    "lodash": "https://esm.sh/lodash@4.17.21",
    "openai": "https://deno.land/x/openai@v4.67.1/mod.ts",
    "openai/resources": "https://deno.land/x/openai@v4.67.1/resource.ts",
    "openai/streaming": "https://deno.land/x/openai@v4.67.1/streaming.ts",
    "fs": "node:fs",
    "path": "node:path"
  }
}

// package.json for installing the equivalent libraries on Node.js
{
  ...,
  "dependencies": {
    "cohere-ai": "^7.14.0",
    "@google/generative-ai": "^0.21.0",
    "lodash": "^4.17.21",
    "openai": "^4.67.1"
  }
}
💡
Importing via esm.sh is similar to using the npm: format. The latter is newer and supports all kinds of npm packages, whereas some packages (though rarely) may not work with esm.sh. As of this writing, the only downside to the npm: format is that packages are bundled differently, leading to a larger final size for the edge function.
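
Either way, the aliases keep application code runtime-agnostic. Here is a minimal sketch of a hypothetical shared module using the lodash alias defined above; Deno resolves it through import_map.json, while Node.js resolves it through package.json:

// libs/my-shared-lib/src/stats.ts (hypothetical example)
// "lodash" resolves via import_map.json on Deno and via node_modules on Node.js
import _ from "lodash";

export function summarize(values: number[]) {
  return { mean: _.mean(values), max: _.max(values) };
}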

Using Shared Libraries

The Supabase edge runtime supports importing files from outside the supabase folder. Any file can be imported through the import map, just like the npm packages mentioned earlier. For example:

// import_map.json
// relative paths are resolved from the directory containing import_map.json
{
  "imports": {
    ...
    "my-shared-lib/": "../../../libs/my-shared-lib/src/"
    ...
  }
}

// ts config (e.g. tsconfig.base.json in project root dir)
{
  "compilerOptions": {
    ...
    "paths": {
      "my-shared-lib/*": ["libs/my-shared-lib/src/*"]
    }
  }
}

// then import anywhere in the application
import { ... } from "my-shared-lib/path/to/file.ts"
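
To get the cross-runtime guarantee mentioned earlier, shared code can be smoke-tested under both runtimes using only node: built-ins, which both resolve. A minimal sketch, assuming a hypothetical slugify helper in the shared library:

// libs/my-shared-lib/src/slugify.check.ts (hypothetical smoke test)
// run with: deno run slugify.check.ts, or npx tsx slugify.check.ts on Node.js
import assert from "node:assert/strict";
import { slugify } from "./slugify.ts";

assert.equal(slugify("Hello World!"), "hello-world");
console.log("slugify: ok");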

Interoperable Backend

A monorepo setup with shared code, import maps, and no Deno-specific code opens up new opportunities. By designing the backend to be decoupled from Deno and edge functions, the same code can run in a traditional Node.js application with frameworks like Express.js or NestJS. This approach allows for cost-effective redundancy and scaling, and offers an alternative to edge functions.

For instance, imagine an edge function that queries a database and sends an email. If this function is part of a shared library, decoupled from the edge runtime, it can be used in both edge functions and a custom Node.js backend. You could run it solely on the edge, solely on the backend, or distribute it across both environments to optimize resource utilization. Since Supabase edge functions offer a generous number of free calls as part of the subscription, it makes sense to leverage them as much as possible. Being able to switch seamlessly between runtimes also enhances resilience and future-proofs the app in case one of them develops issues.
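
As a minimal sketch of that idea (all names here are hypothetical), the business logic depends only on a Supabase client and an email callback, and each runtime wraps it in a thin adapter:

// libs/my-shared-lib/src/welcome.ts (runtime-agnostic business logic)
import type { SupabaseClient } from "@supabase/supabase-js";

export type SendEmail = (to: string, subject: string, body: string) => Promise<void>;

export async function sendWelcomeEmail(
  db: SupabaseClient,
  sendEmail: SendEmail,
  userId: string,
): Promise<void> {
  // look up the user's email, then notify them
  const { data, error } = await db
    .from("users")
    .select("email")
    .eq("id", userId)
    .single();
  if (error) throw error;
  await sendEmail(data.email, "Welcome!", "Thanks for signing up.");
}

// supabase/functions/welcome/index.ts (thin Deno edge adapter)
import { createClient } from "@supabase/supabase-js";
import { sendWelcomeEmail } from "my-shared-lib/welcome.ts";

const supabase = createClient(
  Deno.env.get("SUPABASE_URL")!,
  Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!,
);
const sendEmail = async (to: string, subject: string, body: string) => {
  // call your email provider's API here
};

Deno.serve(async (req) => {
  const { userId } = await req.json();
  await sendWelcomeEmail(supabase, sendEmail, userId);
  return new Response("ok");
});

// apps/api/src/welcome.ts (the same logic behind an Express route;
// app, supabase, and sendEmail are set up the same way using process.env)
app.post("/welcome", async (req, res) => {
  await sendWelcomeEmail(supabase, sendEmail, req.body.userId);
  res.send("ok");
});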

💡
Note that database network restrictions do not apply to edge functions in the cloud. If you intend to restrict network access to the database, you must either use only the Supabase client in edge functions, or use a custom backend behind a whitelisted IP.

Conclusion

I think the right approach to developing with Supabase, or any serverless backend, especially for medium to large-scale applications, is to adopt a monorepo structure with a custom backend and edge functions, sharing as much code and logic as possible. Beyond code reusability and consistency across environments, it also ensures smoother scalability and resilience.
