WebAssembly Serverless Functions in Netlify

Aug 11, 2021 • 7 minutes to read

Netlify is a leading platform for developing and hosting Jamstack applications. In fact, the term Jamstack was coined by Netlify founder Mathias Biilmann in 2015. Netlify also hosts the annual Jamstack Conf.

A Jamstack application consists of a static UI (in HTML and JavaScript) and a set of serverless functions that support dynamic UI elements via JavaScript. The Jamstack approach has many benefits, but perhaps one of the most significant is performance: since the UI is no longer generated at runtime by a central server, there is much less load on the server, and the UI can be deployed via edge networks such as CDNs.

However, the edge CDN only solves the problem of distributing the static UI files. The backend serverless functions could still be slow. In fact, popular serverless platforms have well-known performance issues, such as slow cold starts, especially for interactive applications. That's where WebAssembly can help.

With WasmEdge, a cloud-native WebAssembly runtime hosted by the CNCF, developers can write high-performance serverless functions deployed on the public cloud or on edge computing nodes. In this article, we will explore how to use WasmEdge functions, written in Rust and in other languages, to power a Netlify application backend.

Why WebAssembly in Netlify Functions?

The Netlify platform already has a very easy-to-use serverless framework for deploying functions. As we discussed above, the reason to use WebAssembly, and WasmEdge, is to further improve performance. High-performance functions written in C/C++, Rust, Swift, and other languages can be easily compiled into WebAssembly. Those WebAssembly functions are much faster than the JavaScript or Python code commonly used in serverless functions.

However, if raw performance were the only goal, why not just compile those functions to machine-native executables (sometimes called native clients or NaCl)? After all, Netlify already runs these functions safely in AWS Lambda with the Firecracker microVM.

Our vision for the future is to run WebAssembly as an alternative lightweight runtime side-by-side with Docker and microVMs in cloud native infrastructure. WebAssembly offers much higher performance and consumes far fewer resources than Docker-like containers or microVMs. But for now, Netlify only supports running WebAssembly inside a microVM.

Running WebAssembly functions inside a microVM offers advantages over running containerized NaCl programs.

For starters, WebAssembly provides fine-grained runtime isolation for individual functions. A microservice could have multiple functions and support services running inside a microVM. WebAssembly can make the microservice more secure and more stable.

Second, the WebAssembly bytecode is portable. Developers only need to build it once and do not need to worry about changes or updates to the underlying Netlify serverless runtime. It also allows developers to reuse the same WebAssembly functions in other cloud environments.

Third, WebAssembly apps are easy to deploy and manage. They have far fewer platform dependencies and less complexity compared with NaCl dynamic libraries and executables.

Finally, the WasmEdge TensorFlow API provides the most ergonomic way to execute TensorFlow models in the Rust programming language. WasmEdge installs the correct combination of TensorFlow dependency libraries and provides a unified API for developers.

Enough with the concepts and explanations. Let's jump into the example apps!

Prerequisites

Since our demo WebAssembly functions are written in Rust, you will need a Rust compiler. Make sure you add the wasm32-wasi compiler target as follows, so that the build generates WebAssembly bytecode.

$ rustup target add wasm32-wasi

The demo application's front end is written in Next.js and deployed on Netlify. We assume that you already have basic knowledge of how to work with Next.js and Netlify.

Example 1: Image processing

Our first demo application allows users to upload an image and then invoke a serverless function to turn it into black and white. A live demo deployed on Netlify is available.

Fork the demo application’s GitHub repo to get started. To deploy the application on Netlify, just add your GitHub repo to Netlify.

This repo is a standard Next.js application for the Netlify platform. The backend serverless function is in the api/functions/image-grayscale folder. The src/main.rs file contains the Rust program’s source code. The Rust program reads image data from STDIN, and then writes the black-and-white image to STDOUT.

use std::io::{self, Read, Write};
use image::{ImageFormat, ImageOutputFormat};

fn main() {
  // Read the uploaded image data from STDIN
  let mut buf = Vec::new();
  io::stdin().read_to_end(&mut buf).unwrap();

  // Detect the input format and convert the image to grayscale
  let image_format_detected: ImageFormat = image::guess_format(&buf).unwrap();
  let img = image::load_from_memory(&buf).unwrap();
  let filtered = img.grayscale();

  // Write the result to STDOUT, keeping GIFs as GIFs and everything else as PNG
  let mut buf = vec![];
  match image_format_detected {
    ImageFormat::Gif => {
        filtered.write_to(&mut buf, ImageOutputFormat::Gif).unwrap();
    },
    _ => {
        filtered.write_to(&mut buf, ImageOutputFormat::Png).unwrap();
    },
  };
  io::stdout().write_all(&buf).unwrap();
  io::stdout().flush().unwrap();
}

You can use Rust’s cargo tool to build the Rust program into WebAssembly bytecode or native code.

$ cd api/functions/image-grayscale/
$ cargo build --release --target wasm32-wasi 
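
If you are building the function from scratch rather than from the forked repo, the crate's Cargo.toml needs to pull in the image crate. Here is a minimal sketch; the crate version and edition are assumptions, so check the repo for the exact manifest.

[package]
name = "grayscale"
version = "0.1.0"
edition = "2018"

[dependencies]
image = "0.23"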

Copy the build artifacts to the api folder.

$ cp target/wasm32-wasi/release/grayscale.wasm ../../
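
Before deploying, you can optionally give the function a quick local test. This assumes you have installed the WasmEdge CLI on your own machine, and test.jpg here is a placeholder for any sample image:

$ wasmedge grayscale.wasm < test.jpg > test.png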

The Netlify function runs api/pre.sh when it sets up the serverless environment. The script installs the WasmEdge runtime, and then compiles each WebAssembly bytecode program into a native .so library for faster execution.
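
The authoritative script lives in the repo, but it boils down to something like the following sketch. The install one-liner follows the WasmEdge install docs, and wasmedgec is WasmEdge's ahead-of-time (AOT) compiler; treat the exact URL and flags as assumptions.

$ curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh | bash
$ wasmedgec grayscale.wasm grayscale.so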

The api/hello.js script loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes the uploaded image data via STDIN. Notice that api/hello.js runs the compiled grayscale.so file generated by api/pre.sh for better performance.

const { spawn } = require('child_process');
const path = require('path');

module.exports = (req, res) => {
  // Start the AOT-compiled WebAssembly program in the WasmEdge runtime
  const wasmedge = spawn(
      path.join(__dirname, 'wasmedge'),
      [path.join(__dirname, 'grayscale.so')]);

  // Collect the grayscale image data from the program's STDOUT
  let d = [];
  wasmedge.stdout.on('data', (data) => {
    d.push(data);
  });

  // Once the program exits, send the image back to the client
  wasmedge.on('close', (code) => {
    let buf = Buffer.concat(d);

    res.setHeader('Content-Type', req.headers['image-type']);
    res.send(buf);
  });

  // Pass the uploaded image to the program via STDIN
  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
}
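
Once deployed, you can exercise the endpoint directly over HTTP. A hypothetical curl test follows; the site URL and endpoint path are placeholders, and the image-type header is the one hello.js echoes back as the response Content-Type:

$ curl -X POST --data-binary '@test.png' -H 'image-type: image/png' https://your-site.netlify.app/api/hello --output grayscale.png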

That's it. Deploy the repo to Netlify and you now have a Netlify Jamstack app with a high-performance Rust and WebAssembly based serverless backend.

Example 2: AI inference

The second demo application allows users to upload an image and then invoke a serverless function to classify the main subject in the image.

It is in the same GitHub repo as the previous example, but in the tensorflow branch. The backend serverless function for image classification is in the api/functions/image-classification folder of the tensorflow branch. The src/main.rs file contains the Rust program’s source code. The Rust program reads image data from STDIN, and then writes its text output to STDOUT. It utilizes the WasmEdge TensorFlow API to run the AI inference.

use std::io::{self, Read};

pub fn main() {
    // Step 1: Load the TFLite model and its label file
    let model_data: &[u8] = include_bytes!("models/mobilenet_v1_1.0_224/mobilenet_v1_1.0_224_quant.tflite");
    let labels = include_str!("models/mobilenet_v1_1.0_224/labels_mobilenet_quant_v1_224.txt");

    // Step 2: Read image from STDIN
    let mut buf = Vec::new();
    io::stdin().read_to_end(&mut buf).unwrap();

    // Step 3: Resize the input image for the tensorflow model
    let flat_img = wasmedge_tensorflow_interface::load_jpg_image_to_rgb8(&buf, 224, 224);

    // Step 4: AI inference
    let mut session = wasmedge_tensorflow_interface::Session::new(&model_data, wasmedge_tensorflow_interface::ModelType::TensorFlowLite);
    session.add_input("input", &flat_img, &[1, 224, 224, 3])
           .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Reshape_1");

    // Step 5: Find the label that corresponds to the highest probability in res_vec
    // (elided here; this step sets max_index, max_value, and confidence)
    // ... ...
    let mut label_lines = labels.lines();
    for _i in 0..max_index {
      label_lines.next();
    }

    // Step 6: Generate the output text
    let class_name = label_lines.next().unwrap().to_string();
    if max_value > 50 {
      println!("It {} a <a href='https://www.google.com/search?q={}'>{}</a> in the picture", confidence.to_string(), class_name, class_name);
    } else {
      println!("It does not appear to be any food item in the picture.");
    }
}
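
The Step 5 logic is elided in the listing above. Below is a minimal sketch of one way to implement it. The variable names match the listing, but the exact confidence thresholds are illustrative assumptions:

    // Scan the quantized probability vector for the strongest class
    let mut max_index: usize = 0;
    let mut max_value: u8 = 0;
    for (i, &v) in res_vec.iter().enumerate() {
        if v > max_value {
            max_value = v;
            max_index = i;
        }
    }

    // Map the raw score (0-255) to a rough confidence phrase (thresholds are assumptions)
    let confidence = if max_value > 200 {
        "is very likely"
    } else if max_value > 125 {
        "is likely"
    } else {
        "could be"
    };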

You can use the cargo tool to build the Rust program into WebAssembly bytecode or native code.

$ cd api/functions/image-classification/
$ cargo build --release --target wasm32-wasi

Copy the build artifacts to the api folder.

$ cp target/wasm32-wasi/release/classify.wasm ../../
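
As with the first example, you can sanity-check the program locally before deploying. This assumes the TensorFlow-enabled WasmEdge CLI (the same wasmedge-tensorflow-lite binary that pre.sh installs) is available on your machine, with test.jpg as a placeholder sample image:

$ wasmedge-tensorflow-lite classify.wasm < test.jpg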

Again, the api/pre.sh script installs the WasmEdge runtime and its TensorFlow dependencies for this application. It also compiles the classify.wasm bytecode program into the classify.so native shared library at deployment time.

The api/hello.js script loads the WasmEdge runtime, starts the compiled WebAssembly program in WasmEdge, and passes the uploaded image data via STDIN. Notice that api/hello.js runs the compiled classify.so file generated by api/pre.sh for better performance.

const { spawn } = require('child_process');
const path = require('path');

module.exports = (req, res) => {
  // Start the AOT-compiled program in the TensorFlow-enabled WasmEdge runtime.
  // LD_LIBRARY_PATH points at the TensorFlow shared libraries installed by pre.sh.
  const wasmedge = spawn(
    path.join(__dirname, 'wasmedge-tensorflow-lite'),
    [path.join(__dirname, 'classify.so')],
    {env: {'LD_LIBRARY_PATH': __dirname}}
  );

  // Collect the classification text from the program's STDOUT
  let d = [];
  wasmedge.stdout.on('data', (data) => {
    d.push(data);
  });

  // Once the program exits, return the text result to the client
  wasmedge.on('close', (code) => {
    res.setHeader('Content-Type', `text/plain`);
    res.send(d.join(''));
  });

  // Pass the uploaded image to the program via STDIN
  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
}

You can now deploy your forked repo to Netlify and have a web app for subject classification.

What's next

Running WasmEdge from Netlify’s current serverless container is an easy way to add high-performance functions to Netlify applications. Going forward, an even better approach is to use WasmEdge as the container itself: with no Docker and no Node.js needed to bootstrap WasmEdge, we can run serverless functions with much higher efficiency. WasmEdge is already compatible with Docker tools. If you are interested in joining WasmEdge and CNCF for this exciting work, let us know!

Join us in the WebAssembly revolution!

👉 Slack Channel: #wasmedge on the CNCF Slack

👉 Mailing list: Send an email to WasmEdge@googlegroups.com

👉 Be a contributor: check out our wish list to start contributing!

👉 Twitter: WasmEdge

Tags: Rust, WebAssembly, getting-started, cloud-native, serverless