I have recently started digging deeper into Amazon Web Services (AWS), and particularly into serverless lambda functions. After a few successful experiments with Python, I started wondering how hard it would be to get this stack running with Rust. Turns out, it wasn’t too hard once I had overcome a few initial hurdles, which mostly consisted of figuring out how the different parts work together under the hood. This article is intended to serve as a guide on how to write serverless AWS lambda functions in Rust and how to deploy them to Amazon Web Services.
Rust is one of only a handful of programming languages that make coding a really enjoyable experience for me: an expressive type system, strong memory-safety guarantees without a GC, high performance, and an awesome community. Whenever I can come up with an excuse to use Rust, I do just that — and this is no different.
Getting Started
For the purpose of this article, I assume you already have some experience with Amazon Web Services. You’ll need an account and properly configured access tokens on your system to get this thing running online. The Free Tier should work just fine, and testing locally is possible without an account.
I also assume you’ve already got Rust/Cargo installed, as well as NodeJS 12. If not, please follow the guides for rustup and/or NodeJS respectively. Additionally, Docker is required to build the project.
The complete source code is available on my GitHub, and I recommend cloning/forking the repository to follow along with the code while reading this guide.
What is an AWS Lambda Function?
Simply put, a lambda function is a regular (Rust) function that is called whenever a client makes a request. The function receives the request payload as an argument, does some computation, and returns a response payload that is then sent back to the client. This function can be packaged and made accessible from the Internet without managing your own server infrastructure. Neat!
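Conceptually, that shape can be sketched as a plain function from request to response (the types here are purely illustrative; the real handler signature appears later in this article):

```rust
// A lambda is conceptually just a function from a request payload to a
// response payload; the runtime invokes it once per incoming request.
// (Illustrative sketch only, not the actual AWS handler signature.)
fn handle(request: &str) -> String {
    format!("echo: {}", request)
}
```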
In this guide, we’ll be creating a function called `lucky_numbers`, which will generate a user-defined number of random numbers and return them to the client, along with a short message.
Setting Up Serverless Deployment
So how do we get this function to be called automatically? Before we dig into the Rust code, we’ll set up everything else first so that we can build, test, and deploy our Rust function once we get to it. This mostly happens automatically once everything is configured. We’ll be using the Serverless Framework, based on NodeJS. Let’s start by creating the project directory:
```shell
$ mkdir rust-lambda
```
Then follow these steps:
- Run `npm init` and follow the prompts to create a Node project.
- Install the Serverless Framework: `npm install -g serverless`.
- Run `npm install --save-dev serverless-rust` to install the Rust plugin.
Finally, we’ll create the `serverless.yml` config file. This file describes our AWS setup and how to deploy it (Infrastructure as Code, IaC). There’s a lot going on, so please take a moment to read through the comments:
```yaml
# The name of the service.
```
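The full file is in the repository and worth reading in its entirety; a minimal sketch of what it sets up might look like this (the service name, stage, and packaging options are illustrative; the `rust` runtime and the `/lucky_numbers` endpoint follow this article):

```yaml
# The name of the service.
service: rust-lambda

provider:
  name: aws
  # Custom runtime identifier provided by the serverless-rust plugin.
  runtime: rust
  stage: dev

plugins:
  # Builds the Rust crates inside Docker during packaging.
  - serverless-rust

package:
  # Package each function separately.
  individually: true

functions:
  lucky_numbers:
    # The workspace crate to build for this function.
    handler: lucky_numbers
    events:
      # Expose the function via API Gateway under POST /lucky_numbers.
      - http:
          path: lucky_numbers
          method: POST
```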
With the IaC configuration done, we can now turn our attention to the actual Rust function. Let’s create a folder for the crate by running `mkdir lucky_numbers` and create a `Cargo.toml` file to associate the crate with a workspace:
```toml
[workspace]
members = ["lucky_numbers"]
```
Sweet! Let’s get started with Rust!
Oxidizing the Lambda Function
Unfortunately, quite a bit of boilerplate code is needed for the lambda to work properly, so I recommend copying over the content of the `lucky_numbers` directory from the GitHub repository. I’ll go through each relevant part of the code.
This should be the project structure:
- node_modules/
- lucky_numbers/
  - src/
    - main.rs
    - lambda_gateway.rs
  - Cargo.toml
- Cargo.toml
- package.json
- serverless.yml
The file `Cargo.toml` only lists a bunch of required dependencies, nothing special here. `lambda_gateway.rs`, on the other hand, contains the boilerplate code mentioned before: some magic that connects our AWS Lambda function with the AWS API Gateway, an integration layer, in other words. This module doesn’t concern us too much at the moment, but I recommend checking it out later.
The Lambda Handler
Now this is where things finally get interesting! Let’s dive in.
As I briefly mentioned before, a lambda function essentially receives the HTTP request’s (JSON) payload, computes the result, and returns it as an HTTP response (JSON) payload back to the client. To make this easy to accomplish, we define two structs and use serde for serialization/deserialization.
```rust
/// This is the JSON payload we expect to be passed to us by the client accessing our lambda.
// ...
```
Essentially, we take the `name` of the client and the requested number `count`, and return a friendly `message` along with the randomly generated lucky `numbers`.
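Based on the fields just described, the two payload structs might look roughly like this (a sketch; the exact field types and the serde derive attributes live in the repository):

```rust
/// The request payload the client sends us.
/// (In the real crate this additionally derives serde::Deserialize.)
pub struct InputPayload {
    pub name: String,
    pub count: u8,
}

/// The response payload we send back.
/// (In the real crate this additionally derives serde::Serialize.)
pub struct OutputPayload {
    pub message: String,
    pub numbers: Vec<u8>,
}
```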
As usual, the main entry point of a Rust program is the `main` function:
```rust
fn main() -> Result<(), Box<dyn std::error::Error>> {
    // ...
}
```
The `main` function simply sets up logging (which can later be viewed in AWS CloudWatch and is incredibly helpful for debugging) and registers the lambda handler function, which will be called for each incoming request to the `/lucky_numbers` endpoint we previously defined in `serverless.yml`. Here it is:
```rust
fn lambda_handler(
    // ... (full signature and body in the repository)
```
As we can see from the function signature, we receive the `InputPayload` and return the `OutputPayload` we previously defined. We do some basic exemplary computation (transforming the name to uppercase and limiting the number of rolls to the range `2..=20`), then generate the response `OutputPayload` and return it with HTTP status code 200 OK back to the client. Done.
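The pure part of that computation can be sketched on its own (the function name and message text here are mine, not the repository’s; the real handler additionally fills in the random numbers via the `rand` crate):

```rust
/// The handler's pure core: shout the name back and clamp the roll count.
/// (Illustrative sketch; the real handler also generates the random numbers.)
fn shape_response(name: &str, count: u8) -> (String, u8) {
    // Transform the client's name into uppercase.
    let shouted = name.to_uppercase();
    // Limit the number of rolls to the range 2..=20.
    let rolls = count.clamp(2, 20);
    (format!("Hi, {}!", shouted), rolls)
}
```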
Testing Locally
Let’s test the function we’ve just written and see if it works. The Serverless Framework provides an easy way to do just that; we pass an example request as the `-d` payload:

```shell
$ serverless invoke local -f lucky_numbers -d \
    '{"name": "Ferris", "count": 5}'
```
This may take some time to build the Docker image and compile all the Rust crates. Once everything is built and running, the output should look something like this:
```json
{
    "statusCode": 200,
    "body": "{\"message\":\"...\",\"numbers\":[...]}"
}
```
Awesome! Do note that the body is a JSON-encoded string; this is how it’s supposed to be. The entire response is passed back through AWS API Gateway, which then strips away the status fields and creates the correct HTTP response that is sent back to the client.
Deploying onto Amazon Web Services
If you have already created an AWS account and properly configured your local user, you should be able to deploy the service with `serverless deploy`. Assuming the build and deployment were successful, you should see the API endpoint printed in the terminal. The URL should look something like this:
https://wkawb52awb.execute-api.us-east-1.amazonaws.com/dev/lucky_numbers
Once it’s deployed and we have that URL, everything should be up and running. Let’s try calling our Rust lambda with curl:
```shell
$ curl -X POST \
    -H 'Content-Type: application/json' \
    -d '{"name": "Ferris", "count": 5}' \
    https://wkawb52awb.execute-api.us-east-1.amazonaws.com/dev/lucky_numbers
```
There we go, we got our lucky numbers!
```json
{
    "message": "...",
    "numbers": [...]
}
```
That’s it, Rust running in the cloud! In case anything went wrong, check AWS CloudWatch error logs to find out what happened.
From Here Onwards
Even though it wasn’t too difficult to get working initially, it has been a rather clunky experience. This is still a novel realm for Rust, and documentation, examples, and tooling lack a certain level of polish at the moment. It works, but you probably shouldn’t use it in production just yet. Nonetheless, it’s fun to experiment with, and there are great things to come!