Full stack TypeScript with Cloud Functions and React

Gareth Cronin
7 min read · Sep 22, 2023


I build small web applications with an ever-evolving stack and toolkit. Since I started writing about it a couple of years ago, I've settled into:

  • JavaScript on Node.js for front end and back end
  • Google Cloud Functions for serverless APIs
  • Google Sheets for master data
  • Firebase Firestore for user data
  • AWS API Gateway, Route 53, CloudFront, S3, Lambda@Edge for the fundamentals
  • React with Material UI for responsive web single page applications
  • GitHub Actions for CI/CD

I’ve written plenty about why I made these choices, but in short: it’s mainly about only paying for cloud infrastructure when it’s being used, and minimising the learning curve and boilerplate required to build something solid and scalable.

I cut my teeth as a developer in a strongly typed object-oriented language (Java) and when I left that world behind and started using vanilla ECMA I loved the freedom of bashing out code without worrying about AbstractFactoryFactoryProvider. Over the years though, I’ve come to appreciate that TypeScript is a pretty elegant half-way ground where the benefits of strong typing can be gained, without the pain!

In the last couple of projects I’ve built, I’ve used TypeScript at the front end. In my most recent project I decided it was time to do the same thing at the back end and get the benefits of sharing the types all the way up the stack.

Google Cloud Functions with TypeScript and Express

Although AWS is where I am most comfortable, I prefer Google Cloud Platform’s Cloud Functions over AWS Lambda. The cold start time is shorter, so apps with Cloud Functions in the middle tier tend to be snappier, even when usage is low. I also find the developer experience much more productive: the lightweight Functions Framework makes it easy to build and test functions locally with confidence that they will work when deployed, it’s simple to add the power of Express to the hosted functions, and it’s just as simple to use Firestore in admin mode with the default Firebase service account.

GCP provides a guide on setting up TypeScript as the language for Cloud Functions. The TypeScript support comes from adding a script to package.json that runs the TypeScript compiler tsc, and pointing the “main” entry at the dist directory where tsc puts the transpiled source:

{
  "name": "functions",
  "main": "dist/index.js",
  // ...
  "scripts": {
    "build": "tsc",
    "start": "functions-framework --target=v1 --source=dist/functions/src",
    "deploy": "gcloud functions deploy v1 --runtime nodejs18 --trigger-http",
    "prestart": "npm run build",
    "gcp-build": "npm run build"
  },
  // ...
}

That’s enough for the Cloud Functions deploy process to work. The scripts above let you run a local version of an exported function called “v1” with “start”, and push a deployed version with “deploy”.
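The guide leaves most of the tsconfig.json up to you. A minimal sketch that fits the scripts above might look like this (the specific compilerOptions values are my assumptions, not the guide’s; adjust outDir and your include paths to wherever your sources sit relative to the package root, since they determine the path you pass to --source):

```json
{
  "compilerOptions": {
    "target": "es2020",
    "module": "commonjs",
    "moduleResolution": "node",
    "esModuleInterop": true,
    "strict": true,
    "sourceMap": true,
    "outDir": "dist"
  },
  "include": ["src"]
}
```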

By adding Express to one exported function (in my case: v1), the URL scheme can be routed with Express’s pattern matching, and Express middleware can handle CORS and authentication (Google OAuth2 here) based on the URL. Here’s an abbreviated version of my index.ts:

import express from "express";
import { OAuth2Client } from "google-auth-library";

const app = express();
const oauthClient = new OAuth2Client();

// set CORS headers on every response
app.use((req: express.Request, res: express.Response, next: express.NextFunction) => {
  res.header('Access-Control-Allow-Origin', '*');
  res.header('Access-Control-Expose-Headers', 'Content-Range');
  res.header('Access-Control-Allow-Headers', 'Origin, X-Requested-With, Content-Type, Accept, Authorization');
  res.header('Access-Control-Allow-Methods', 'GET,PUT,POST,DELETE,OPTIONS');
  next();
});

// middleware to check user is authenticated by validating the bearer token
app.use('/:user/**', async (req: express.Request, res: express.Response, next: express.NextFunction) => {
  // let CORS preflight requests through without a token
  if (req.method === 'OPTIONS') {
    next();
    return;
  }
  const authHeader = req.headers.authorization;
  if (!authHeader) {
    res.status(401).send('Unauthorized');
    return;
  }
  const token = authHeader.split(' ')[1];
  if (!token) {
    res.status(401).send('Unauthorized');
    return;
  }
  try {
    const ticket = await oauthClient.verifyIdToken({
      idToken: token,
      audience: GOOGLE_CLIENT_ID,
    });
    const payload = ticket.getPayload();
    const userid = payload['email'];
    if (userid) {
      if (req.params.user !== userid) {
        console.log('user is', req.params.user, 'and token user is', userid);
        res.status(403).send('Unauthorized: bearer is not the requested user');
        return;
      }
      next();
    }
    else {
      res.status(401).send('Unauthorized');
    }
  }
  catch (err) {
    // verifyIdToken throws on an invalid, expired, or wrong-audience token
    res.status(401).send('Unauthorized');
  }
});

app.get('/:user/transactions', async (req: express.Request, res: express.Response) => {
  const user = req.params.user;
  //...
});

//...

exports.v1 = app;

Adding testing with Jest

When I build back ends I like to do it with strict TDD. I find it a lot faster to write a failing test and work to fulfil its contract, and only then plug the logic into the HTTP endpoint, than to attempt it the other way around. There’s a handy NPM package, ts-jest, that makes it easy to set up TypeScript testing for Jest:

npm install --save-dev ts-jest
npx ts-jest config:init
npm i --save-dev @types/jest
npm i --save-dev @types/node

The tsconfig.json also needs to be tweaked to include the types:

{
  "compilerOptions": {
    //...
    "types": [
      "node",
      "jest",
      "ts-jest"
    ]
  },
  //...
}

I put my tests in a tests directory alongside src. It’s worth adding the tests directory to .gcloudignore so they don’t get pushed to the Cloud Functions source directory on a deployment.
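The ignore file addition is a single line. A sketch (gcloud typically generates a default .gcloudignore, covering things like .git and node_modules, the first time you deploy, so this just extends it):

```
# .gcloudignore (excerpt) — keep test sources out of the deployed bundle
tests/
```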

Sharing types with the front end

To keep things simple, I added all the types to a single source file. This project deals with accounting codes for transactions. Here’s an excerpt from it:

export interface Transaction {
  id: string;
  date: Date;
  description: string;
  amount: number;
  code?: string;
  fsid?: string;
  manual: boolean;
}

//...

The pattern I’ve adopted in the back end is to return a result/error wrapper from the business and database logic, then unpack that and return its JSON representation from the HTTP GET handler (or vice versa, using the types in application/json bodies for POST and PUT requests).
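The wrapper is just a plain object with optional result and error fields. A hypothetical generic version of it, with the status-mapping logic the endpoints use, looks like this (self-contained sketch; names are mine, not from the project):

```typescript
// Hypothetical generic form of the result/error wrapper pattern
interface FirestoreError {
  code: string;
}

type Wrapped<T> = { result?: T; error?: FirestoreError };

// map the wrapper onto an HTTP status code the way the endpoints do
function toHttpStatus(wrapped: Wrapped<unknown>): number {
  if (!wrapped.error) {
    return 200;
  }
  return wrapped.error.code === 'usernotfound' ? 404 : 500;
}

console.log(toHttpStatus({ result: [] }));                      // 200
console.log(toHttpStatus({ error: { code: 'usernotfound' } })); // 404
console.log(toHttpStatus({ error: { code: 'other' } }));        // 500
```

Because the wrapper is a type rather than a convention, the compiler flags any handler that forgets to check the error branch before reading result.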

Here’s an excerpt from the business and persistence layer:

export const getTransactions = async (user: string):
  Promise<{ result?: Transaction[], error?: FirestoreError }> => {
  try {
    const db = admin.firestore();
    // query the users collection to find where the userid property is equal to the value of :user
    const userRef = db.collection('users').where('userid', '==', user);

    //... (check the user exists, then fetch and map their transactions)

        return { result: transactionsWithDatesAndIds };
      }
      else {
        log('User exists with no transactions');
        return { result: [] };
      }
    }
    else {
      log('User does not exist');
      return { error: { code: 'usernotfound' } };
    }
  } catch (err: any) {
    return { error: handleFirestoreError(err) };
  }
};

And here’s the corresponding endpoint in index.ts:

app.get('/:user/transactions', async (req: express.Request, res: express.Response) => {
  const user = req.params.user;
  const transactions = await getTransactions(user);
  if (transactions.error) {
    if (transactions.error.code === 'usernotfound') {
      res.status(404).send(transactions.error);
    }
    else {
      res.status(500).send(transactions.error);
    }
  }
  else {
    res.send(transactions.result);
  }
});

My first instinct was to put the types in a shared directory at the same level as my back end and front end source directories:

functions/
  src/
  package.json
  tsconfig.json
shared/
  src/
    types.ts
web/
  src/
  package.json
  tsconfig.json
I added includes to the tsconfig.json files in functions and web to reference types.ts.

This worked fine on my own machine, but I ran into trouble when I tried to deploy to Google Cloud Functions. Cloud Functions copies everything in its root directory (in this case that is the functions directory) and then runs the build. There is no facility to include a directory that isn’t in the root. That’s a bit annoying, but the compromise is to leave the types in the back end source and link to them from the front end:

functions/
  src/
    types.ts
  package.json
  tsconfig.json
web/
  src/
  package.json
  tsconfig.json

The tsconfig.json on the front end (in web, so that the relative path resolves) then needs the include:

//...

"include": [
  "src",
  "../functions/src/types.ts"
],

//...

I used Vite rather than my past go-to of Create React App (CRA) on this project, and it happily pulls in the shared code for the build. I’d been having trouble with dependency conflicts in CRA, and Vite was trouble-free: noticeably faster, and close enough to a drop-in replacement that I’ll be using it as my default from here on out.
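For reference, the Vite side needs almost no configuration. A minimal vite.config.ts for a React SPA looks something like this (a sketch, assuming the official @vitejs/plugin-react plugin):

```typescript
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

// minimal config; imports that reach outside the project root
// (like ../functions/src/types.ts) are bundled at build time
export default defineConfig({
  plugins: [react()],
});
```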

The client-side code that calls the API can then coalesce the parsed JSON from the HTTP request into the same type used on the server:

export const getAllTransactions = async (user: User):
  Promise<{ result?: Transaction[], error?: ApiError }> => {
  try {
    const apiResult = await fetch(`${API_URL}/${user.email}/transactions`, {
      headers: {
        Authorization: `Bearer ${user.token}`
      }
    });
    if (!apiResult.ok) {
      return handleApiError(apiResult);
    }
    const jsonResult = await apiResult.json();
    return { result: jsonResult as Transaction[] };
  }
  catch (error) {
    return handleApiError(error);
  }
};
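One caveat with the "as Transaction[]" cast: JSON.parse never produces Date objects, so the date field actually arrives as an ISO string at runtime. A hypothetical reviver (self-contained here, repeating the Transaction interface from earlier) keeps the cast honest:

```typescript
interface Transaction {
  id: string;
  date: Date;
  description: string;
  amount: number;
  code?: string;
  fsid?: string;
  manual: boolean;
}

// the over-the-wire shape: every Date field becomes a string
type Wire<T> = { [K in keyof T]: T[K] extends Date ? string : T[K] };

// rebuild real Date objects from the parsed JSON
function reviveTransaction(raw: Wire<Transaction>): Transaction {
  return { ...raw, date: new Date(raw.date) };
}

const wire: Wire<Transaction> = {
  id: 't1',
  date: '2023-09-22T00:00:00.000Z',
  description: 'coffee',
  amount: 4.5,
  manual: false,
};
console.log(reviveTransaction(wire).date instanceof Date); // true
```

Mapping the result through the reviver before returning it means the shared type tells the truth on both sides of the wire.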

CI/CD

GitHub Actions hasn’t let me down yet! I used one YAML file for the back end and one for the front. Here are the abbreviated versions:

name: Functions build

on:
  push:
    branches: [ main ]
    paths:
      - 'functions/**'
      - '.github/workflows/functions.yml'

jobs:
  deploy:
    name: Deploy function
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - id: 'auth'
        uses: 'google-github-actions/auth@v1'
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}
      - name: Deploy function
        uses: google-github-actions/deploy-cloud-functions@main
        with:
          name: v1
          runtime: nodejs18
          source_dir: functions
          region: us-central1
And the front-end workflow:

name: Production web build

on:
  push:
    branches: [ main ]
    paths:
      - 'web/**'
      - '.github/workflows/web-prod.yml'

jobs:
  build:
    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [18.x]

    steps:
      - uses: actions/checkout@v2

      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v1
        with:
          node-version: ${{ matrix.node-version }}
      - name: NPM install
        run: |
          npm install
        working-directory: web
      - name: Build
        run: |
          npm run build
        working-directory: web
        env:
          CI: ""
      - name: Deploy to S3
        uses: jakejarvis/s3-sync-action@master
        #...
      - name: Invalidate Cloudfront cache
        uses: muratiger/invalidate-cloudfront-and-wait-for-completion-action@master
        #...

Conclusion

I’ve found that sharing types between the front end and the back end gives me a big step up in catching problems early, often just thanks to the linting in VS Code. It cuts down the duplication in munging data and makes IntelliSense a lot more “intelli”. I won’t be going back!

Written by Gareth Cronin

Technology leader in Auckland, New Zealand: start-up founder, father of two, maker of t-shirts and small software products
