Ruby Layers with Serverless

After showing how easy it is to write AWS Lambda functions in Ruby, in this post we will work on a way to build Layers with external dependencies or shared data.

Lambda has been offering Layers since the end of 2018, which allow you to create libraries of shared code. These layers have to be created and uploaded first; then you can wire them up to your different functions. This improves reuse and is a nice way to add even big, complex dependencies (as Lambda deployment packages are limited in size).

Layers and Serverless

In addition to the setup from our Ruby Lambda post, we will need to add a few things. While Serverless knows how to use Layers, we still need to build them. And as we might have dependencies with natively compiled libraries, everything needs to match the Lambda execution environment.

To achieve these goals, we can use the serverless-hooks-plugin, which allows us to bolt custom steps onto the Serverless workflow, and we will use Docker to load a suitable environment for bundling everything together. If you do not have Docker installed, the Docker documentation is pretty solid.

Putting it all together

Installing a Serverless plugin is easy: first, create an initial package.json file, which is the NodeJS-world equivalent of a Ruby Gemspec file. Then, installing NodeJS modules with the --save option (which is the default from NPM 5 on) will also add them to that JSON file.

npm init --yes
npm install serverless-hooks-plugin --save
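
After these two commands, the package.json lists the plugin as a dependency. The name and version fields below are just illustrative; yours will reflect your directory name and the version NPM resolves:

{
  "name": "ruby-layers-example",
  "version": "1.0.0",
  "dependencies": {
    "serverless-hooks-plugin": "^1.1.0"
  }
}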

To prepare our Lambda Layer, we need to mkdir --parents ./layers/example/ and then create a layers/example/Gemfile for the dependencies in there:

source 'https://rubygems.org'

gem 'hello-world'

That is not quite it, because we also need a specific build script ./layers/example/build.sh with the right setup for Bundler:

#!/bin/bash

# Vendor all gems from the Gemfile into the current directory
bundle install --path .

# Remove Bundler's local configuration so it does not end up in the layer
rm -rf .bundle/

Of course, you could also add steps for downloading binary files right after this, or move everything into a Rakefile if you like.
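
If you prefer Rake, a minimal sketch of such a Rakefile could look like the following (the build task name is just an assumption; in that case the Docker hook below would call rake build instead of ./build.sh):

desc 'Bundle gems for the Lambda Layer'
task :build do
  # Same steps as build.sh: vendor the gems, then drop Bundler's local config
  sh 'bundle install --path .'
  rm_rf '.bundle'
end

task default: :build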

Finally, Serverless needs to know about the Layer and how to build it in our serverless.yml:

plugins:
  - serverless-hooks-plugin

layers:
  example:
    path: layers/example
    description: Demo Layer
    compatibleRuntimes:
      - ruby2.5

custom:
  hooks:
    package:compileLayers:
      - docker run -v `pwd`/layers/example:`pwd` -w `pwd` lambci/lambda:build-ruby2.5 ./build.sh

functions:
  my_ruby_lambda:
    memorySize: 256
    timeout: 60
    handler: src/my_ruby_layer.main
    layers:
      - {Ref: ExampleLambdaLayer}

Notice the custom section, which hooks right into packaging: it starts the lambci/lambda:build-ruby2.5 Docker image, which closely matches the AWS Lambda environment, maps our layer directory into it and then executes the build script from before. This sets up everything in the layers/example directory so the Serverless layers section can zip it up and do the necessary upload.

At the bottom of the snippet, our function gets wired up with the new layer. This statement is a CloudFormation intrinsic function (Ref), in which we capitalize the layer name and append LambdaLayer to match the logical resource name Serverless generates in the background.
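
For completeness, a minimal sketch of the handler referenced above (src/my_ruby_layer.rb) might look like this; requiring hello-world is just an assumption to show that gems from the layer are available, and depending on how Bundler laid out the gems you may need to adjust GEM_PATH or $LOAD_PATH first:

# src/my_ruby_layer.rb
require 'json'
# Layers are extracted under /opt at runtime; this require assumes the
# gems are resolvable via the runtime's gem path.
require 'hello-world'

def main(event:, context:)
  { statusCode: 200, body: JSON.generate('Hello from a layered gem') }
end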

Now, you can sls deploy your function as before but have the full arsenal of Gems and binary files at your disposal as well.
