“Writing AWS Lambda Functions in Rust”

December 29, 2019   /   ~7 minute read

I have recently started digging deeper into Amazon Web Services (AWS), and particularly into serverless lambda functions. After a few successful experiments with Python, I started wondering how hard it would be to get this stack running with Rust. Turns out, it wasn’t too hard once I had overcome a few initial hurdles, most of which consisted of figuring out how the different parts work together under the hood. This article is intended to serve as a guide on how to write serverless AWS lambda functions in Rust and how to deploy them onto Amazon Web Services.

Cool. But… why Rust?

Rust is one of only a handful of programming languages that make coding a really enjoyable experience for me: an expressive type system, strong memory-safety guarantees without a GC, high performance, and an awesome community. Whenever I can come up with an excuse to use Rust, I do just that — and this is no different.


Getting Started

For the purpose of this article, I assume you already have some experience with Amazon Web Services. You’ll need to get an account and properly configure your access tokens on your system to get this thing running online. The Free-Tier should work just fine and testing locally is possible without an account.

I also assume you’ve already got Rust/Cargo installed, as well as NodeJS 12. If not, please follow the guides for rustup and/or NodeJS respectively. Additionally, Docker is required to build the project.

The complete source code is available on my Github and I recommend cloning/forking the repository to follow the code while reading this guide.

What is an AWS Lambda Function?

Simply put, a lambda function is a regular (Rust) function that is called when a client makes a request. The function receives the request payload as an argument, does some computation, and returns a response payload that is then sent back to the client. This function can be packaged and made accessible from the Internet without you having to manage your own server infrastructure. Neat!

In this guide, we’ll be creating a function called lucky_numbers which will generate a user-defined number of random numbers and return them to the client, along with a short message.

Setting Up Serverless Deployment

So how do we get this function to be called automatically? Before we dig into the Rust code, we’ll set up everything else first so that we can build, test, and deploy our Rust function once we get to it. This mostly happens automatically once everything is configured. We’ll be using the Serverless Framework, based on NodeJS. Let’s start by creating the project directory:

$ mkdir rust-lambda
$ cd rust-lambda

Then follow these steps:

  1. Run npm init and follow the prompts to create a Node project.
  2. Install the Serverless Framework: npm install -g serverless.
  3. Run npm install --save-dev serverless-rust to install the Rust plugin.

Finally, we’ll create the serverless.yml config file. This file describes our AWS setup and how to deploy it (Infrastructure as Code, IaC). There’s a lot going on, so please take a moment to read through the comments:

# The name of the service.
service: rust-lambda

# We want to host the project on AWS using Rust.
provider:
  name: aws
  runtime: rust

# To automate building and deployment, we'll use a plugin specifically made for Rust.
plugins:
  - serverless-rust

# This tells the framework to package each function separately with its own container.
package:
  individually: true

# Here is where we define all our functions (each living in a separate Cargo crate).
# In this project, we only have one lambda function called `lucky_numbers`.
# The name of the handler must match the name of the crate (not the actual function defined within).
functions:
  lucky_numbers:
    handler: lucky_numbers
    # This tells AWS when to trigger our function.
    # The event we're interested in is an incoming HTTP request.
    events:
      - http:
          # The function will be called for every POST request that is made
          # onto the `/lucky_numbers` endpoint.
          path: /lucky_numbers
          method: POST

With the IaC configuration done, we can now turn our attention towards the actual Rust function. Let’s create a folder for the crate by running mkdir lucky_numbers and create a top-level Cargo.toml file to associate the crate with a workspace:

[workspace]
members = ["lucky_numbers"]

Sweet! Let’s get started with Rust!

Oxidizing the Lambda Function

Unfortunately, quite a bit of boilerplate code is needed for the Lambda to work properly. So I recommend copying over the content of the lucky_numbers directory from the Github repository. I’ll go through each relevant part of the code.

This should be the project structure:

- node_modules/
- lucky_numbers/
  - Cargo.toml
  - src/
    - lambda_gateway.rs
    - main.rs
- Cargo.toml
- package.json
- package-lock.json
- serverless.yml

The file Cargo.toml only lists a bunch of dependencies that are required, nothing special here. lambda_gateway.rs, on the other hand, contains the boilerplate code mentioned before — some magic that connects our AWS Lambda Function with the AWS API Gateway — an integration layer, in other words. For our Rust lambda function, this module doesn’t concern us too much at the moment but I recommend checking it out later.
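For reference, the crate’s Cargo.toml looks roughly like the following sketch. The exact crate versions and the `[[bin]]` section are assumptions on my part (this was written against the 2019-era ecosystem), so treat the repository as the authoritative source:

```toml
[package]
name = "lucky_numbers"
version = "0.1.0"
edition = "2018"
autobins = false

# AWS Lambda's custom-runtime convention expects the compiled executable to be
# named `bootstrap`; the serverless-rust plugin works with this convention.
[[bin]]
name = "bootstrap"
path = "src/main.rs"

[dependencies]
lambda_runtime = "0.2"    # The AWS Lambda runtime for Rust.
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
log = "0.4"
simple_logger = "1.3"
rand = "0.7"
```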

The Lambda Handler

Now this is where things finally get interesting! Let’s dive in.

As I have briefly mentioned before, a lambda function essentially receives the HTTP request’s JSON payload, computes a result, and returns it to the client as an HTTP response JSON payload. To make this easy to accomplish, we define two structs and use serde for serialization/deserialization.

/// This is the JSON payload we expect to be passed to us by the client accessing our lambda.
#[derive(Deserialize, Debug)]
struct InputPayload {
    name: String,
    count: u32,
}

/// This is the JSON payload we will return back to the client if the request was successful.
#[derive(Serialize, Debug)]
struct OutputPayload {
    message: String,
    numbers: Vec<u8>,
}
Essentially, we take the client’s name and the requested count of numbers, and return a friendly message along with the randomly generated lucky numbers.

As usual, the main entry point of a Rust program is the main function:

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Initialize logging so that `log` output shows up in AWS CloudWatch.
    simple_logger::init_with_level(log::Level::Debug)?;

    // Register the handler; the Lambda runtime invokes it for each request.
    lambda!(lambda_handler);

    Ok(())
}

The main function simply sets up logging (which can later be viewed on AWS CloudWatch and is incredibly helpful for debugging) and registers the lambda handler function which will be called for each incoming request onto the /lucky_numbers endpoint we have previously defined in serverless.yml. Here it is:

fn lambda_handler(
    e: LambdaRequest<InputPayload>,
    _c: Context,
) -> Result<LambdaResponse, HandlerError> {
    let payload = e.body();
    let name = &payload.name.to_uppercase();
    let count = std::cmp::min(std::cmp::max(2, payload.count), 20);

    let response = LambdaResponseBuilder::new()
        .with_status(200)
        .with_json(OutputPayload {
            message: format!("Hi, '{}'. Your lucky numbers are:", name),
            numbers: (1..=count)
                .map(|_| thread_rng().gen_range(1, 42))
                .collect(),
        })
        .build();

    Ok(response)
}

As we can see from the function signature, we receive the InputPayload and return the OutputPayload that we have previously defined. We do some basic exemplary computation (transforming the name into uppercase and clamping the requested count to the range 2..=20), then generate the response OutputPayload and return it with HTTP status code “200 OK” back to the client. Done.
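The clamping step is worth a second look: it guarantees the client can never request fewer than 2 or more than 20 numbers, regardless of the input. A tiny standalone sketch of that logic (the function name is mine, for illustration):

```rust
// Clamp the requested count into the inclusive range 2..=20.
// Equivalent to the std::cmp::min/max combination used in the handler.
fn clamp_count(requested: u32) -> u32 {
    std::cmp::min(std::cmp::max(2, requested), 20)
}

fn main() {
    println!("{}", clamp_count(0));   // too small -> 2
    println!("{}", clamp_count(10));  // in range  -> 10
    println!("{}", clamp_count(999)); // too large -> 20
}
```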

Testing Locally

Let’s test the function we’ve just written and see if it works. Serverless Framework provides an easy way to do just this:

serverless invoke local -f lucky_numbers -d \
'{"body": "{\"name\":\"SilentByte\", \"count\": 10}"}'

This may take some time while Docker builds the image and compiles all the Rust crates. Once everything is built and running, the output should look something like this:

{
    "isBase64Encoded": false,
    "statusCode": 200,
    "headers": {
        "content-type": "application/json"
    },
    "body": "{\"message\":\"Hi, 'SILENTBYTE'. Your lucky numbers are:\",\"numbers\":[5,15,6,37,14,20,19,19,32,25]}"
}
Awesome! Do note that the body is a JSON-encoded string; this is how it is supposed to be. The entire response is passed back through the AWS API Gateway, which strips away the status fields and creates the correct HTTP response that is then sent back to the client.

Deploying onto Amazon Web Services

If you have already created an AWS account and properly configured your local user, you should be able to deploy the service with serverless deploy. Assuming the build and deployment were successful, the endpoint URL of the freshly deployed API should be printed in the terminal.

Once it’s deployed and we have that URL, everything should be up and running. Let’s try calling our Rust Lambda with curl:

curl -X POST \
    -H "Content-Type: application/json" \
    -d '{"name":"SilentByte","count": 10}' \
    'https://<your-api-id>.execute-api.<your-region>.amazonaws.com/dev/lucky_numbers'
There we go, we got our lucky numbers!

{
    "message": "Hi, 'SILENTBYTE'. Your lucky numbers are:",
    "numbers": [8, 23, 26, 21, 32, 24, 8, 40, 34, 2]
}

That’s it, Rust running in the cloud! In case anything went wrong, check the AWS CloudWatch logs to find out what happened.

From Here Onwards

Even though it wasn’t too difficult to get it working initially, it has been a rather clunky experience. This is still a novel realm for Rust, and documentation, examples, and tooling are lacking a certain level of polish at the moment. It works — but you probably shouldn’t use this in production at the moment. Nonetheless, it’s fun to experiment and there are great things to come!

…oxidization successful!