Serverless AI functions on Tencent Cloud


Tencent is a leading serverless infrastructure provider in the public cloud. With the SSVM custom runtime for Tencent Serverless Cloud Function (SCF), you can write a few lines of simple code to turn any TensorFlow model into a serverless function and then offer it as a web service. A template project containing source code and configuration files is available here. Out of the box, you can deploy the template to Tencent Cloud and have a working web app for image classification.

Screencast | Live demo!

You can simply fork and make changes to the template – changing to another TensorFlow model, customizing the interpretation and display of inference results, updating the web UI, etc. – and then deploy to Tencent Cloud in minutes using the Serverless Framework. In this tutorial, I will show you how to make these changes.

Prerequisites

Follow these simple instructions to install Rust, ssvmup, and the Serverless Framework. Make sure that you install ssvmup with the --enable-aot extension.

Alternatively, you can use GitHub Codespaces or even just Docker to build and run the template.

Changing to a different TensorFlow model

The src/main.rs file in the template is a Rust program that takes the input image and then executes the TensorFlow Lite (TFLite) model against the image data.

Rust has been voted the most loved programming language by Stack Overflow users for five years in a row. It may look complicated at first glance, but as you can see from the examples, the Second State VM's Rust API is straightforward and easy to get started with.

The relevant code is as follows.

fn main() {
    // 1. Load the TFLite model file and the probability label file
    let model_data: &[u8] = include_bytes!("lite-model_aiy_vision_classifier_food_V1_1.tflite");
    let labels = include_str!("aiy_food_V1_labelmap.txt");

    // 2. Load the uploaded image into the img_buf vector
    ... ...
    
    // 3. Resize the img_buf to the size required by the TensorFlow model's input tensor
    let flat_img = ssvm_tensorflow_interface::load_jpg_image_to_rgb8(&img_buf, 192, 192);

    // 4. Use the image as an input tensor, run the model, and retrieve the output tensor.
    let mut session = ssvm_tensorflow_interface::Session::new(&model_data, ssvm_tensorflow_interface::ModelType::TensorFlowLite);
    session.add_input("input", &flat_img, &[1, 192, 192, 3])
           .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Softmax");

    // 5. The output tensor is a list of probabilities (0 to 255) for each label in the `labelmap.txt` file.
    ... ...
    let mut label_lines = labels.lines();
    for _i in 0..max_index {
      label_lines.next();
    }

    let class_name = label_lines.next().unwrap().to_string();
    ... ...
}
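The elided part of step 5 – locating the most probable label in the output tensor – can be sketched in plain Rust. The helper below is an illustrative stand-in, not the template's exact code: `res_vec` holds the 0-255 probability bytes and `labels` holds the contents of the label map file.

```rust
// Illustrative sketch of step 5: scan the 0-255 probability bytes from the
// output tensor and look up the label with the highest score. The function
// name and signature are hypothetical, not from the template.
fn best_label(res_vec: &[u8], labels: &str) -> (usize, u8, String) {
    let mut max_index = 0;
    let mut max_value = 0u8;
    for (i, &v) in res_vec.iter().enumerate() {
        if v > max_value {
            max_value = v;
            max_index = i;
        }
    }
    // The label map has one label per line, in tensor order.
    let class_name = labels
        .lines()
        .nth(max_index)
        .unwrap_or("unknown")
        .to_string();
    (max_index, max_value, class_name)
}

fn main() {
    let labels = "background\napple pie\nbaklava";
    let probs = vec![3u8, 200, 10];
    let (idx, val, name) = best_label(&probs, labels);
    println!("{} {} {}", idx, val, name); // prints: 1 200 apple pie
}
```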

To use another TensorFlow model, you need to do the following.

  1. Load your own TensorFlow model file and its associated data file. We support both TFLite and TF frozen model files. You could load another MobileNet image classification model you trained yourself, or a completely different model.
  2. Load the uploaded image or other model input. See the next section for more details.
  3. Prepare the input data to match the model's input tensor requirements. In the case of MobileNet models, we resize the image and load its pixel values into a vector.
  4. Pass the input data, input tensor names, and output tensor names to the model.
  5. Capture the output tensor values in a vector and process them to generate a human-readable result. In the case of MobileNet models, the output values correspond to the classification probability of each label in the label map. We output the label with the highest probability.

To see more, check out the following examples.

Input and output

Since the TensorFlow serverless function runs in the Tencent Cloud infrastructure, it needs to interact with Tencent Cloud's API gateway to handle web requests. The Tencent API gateway wraps the entire incoming HTTP request in a JSON object and sends it to the function via STDIN. Therefore, the src/main.rs function needs to read data from STDIN, parse the JSON object's body field for the base64 encoded image data, and then load the image into the img_buf vector.

You typically do NOT need to change this part of the template, but it is useful to understand how it works.

fn main() {
    ... ...
    let mut buffer = String::new();
    io::stdin().read_to_string(&mut buffer).expect("Error reading from STDIN");
    let obj: FaasInput = serde_json::from_str(&buffer).unwrap();
    let img_buf = base64::decode_config(&(obj.body), base64::STANDARD).unwrap();
    ... ...
}

#[derive(Deserialize, Debug)]
struct FaasInput {
    body: String
}
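To make the parsing step concrete without pulling in crates, here is a simplified, dependency-free sketch of what serde_json does for the template: pulling the "body" string out of the gateway's JSON wrapper. This hand-rolled version only handles the happy path (no escaped quotes); real code should keep using serde's typed FaasInput struct as shown above.

```rust
// Simplified, illustrative stand-in for serde_json: extract the "body"
// field from the API gateway's JSON wrapper. Returns None if the field
// is missing. Does not handle escaped quotes inside the value.
fn extract_body(json: &str) -> Option<&str> {
    let start = json.find("\"body\"")? + "\"body\"".len();
    let rest = &json[start..];
    let open = rest.find('"')? + 1;
    let close = rest[open..].find('"')? + open;
    Some(&rest[open..close])
}

fn main() {
    let wrapped = r#"{"body": "aGVsbG8gd29ybGQ="}"#;
    println!("{:?}", extract_body(wrapped)); // prints: Some("aGVsbG8gd29ybGQ=")
}
```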

The function returns the inference results to STDOUT using the println! statement.

if max_value > 50 && max_index != 0 {
    println!("It {} a <a href='https://www.google.com/search?q={}'>{}</a> in the picture", confidence.to_string(), class_name, class_name);
} else {
    println!("There does not appear to be any food item in the picture.");
}
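The `confidence` variable in the snippet above is not shown being computed. One plausible way to derive it, sketched below with hypothetical thresholds that may differ from the template's actual values, is to map the model's 0-255 probability byte to a phrase.

```rust
// Hypothetical mapping from the model's 0-255 probability byte to the
// `confidence` wording interpolated into the println! output. The
// thresholds here are illustrative only, not the template's.
fn confidence_phrase(max_value: u8) -> &'static str {
    if max_value > 200 {
        "is very likely"
    } else if max_value > 125 {
        "looks like"
    } else {
        "might be"
    }
}

fn main() {
    println!("{}", confidence_phrase(230)); // prints: is very likely
}
```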

The web app

The optional web UI for the serverless function is available in the website/content/index.html file in the template. You can change it to suit your own application needs. The key part of this UI is the JavaScript code to convert a selected image file into a base64 text string and then do an HTTP POST to send this base64 text string to the serverless function's API gateway URL.

var reader = new FileReader();
reader.readAsDataURL(document.querySelector('#select_file').files[0]);
reader.onloadend = function () {
    $.ajax({
        url: window.env.API_URL,
        type: "post",
        data : reader.result.split("base64,")[1],
        dataType: "text",
        success: function (data) {
            document.querySelector('#msg').innerHTML = data;
        },
        error: function(jqXHR, exception){
            document.querySelector('#msg').innerHTML = 'Sorry, there is a problem. Try later';
        }
    });
};

As we discussed, the Tencent serverless runtime will turn that POST string into the body field in a JSON object and then pass the JSON object to the Rust function.

Build and deploy

To build the application, we use the ssvmup tool to create a .so file. It is an AOT-compiled WebAssembly function that is both high-performance and safe. The default file name is pkg/scf.so.

$ ssvmup build --enable-ext --enable-aot
$ cp pkg/scf.so scf/

The template is organized according to the Serverless Framework structure.

  • The layer project adds Tensorflow and SSVM libraries to the SCF custom runtime.
  • The scf project creates the custom serverless runtime and its associated API gateway.
  • The website project creates a web site with the content files. The deployment script automatically connects the web page JavaScript to the SCF API gateway URL from the previous step.

To deploy the application on Tencent Cloud, just run the following command from the Serverless Framework.

$ sls deploy
... ...
  website:       https://sls-website-ap-hongkong-kfdilz-1302315972.cos-website.ap-hongkong.myqcloud.com
  vendorMessage: null

63s › Tencent-TensorFlow-SCF › "deploy" ran for 3 apps successfully.

You can now access the web application from the deployed website URL.

What's next

There are other SSVM-based TensorFlow-functions-as-a-service examples on Tencent Cloud. Check them out here.
