AWS Lambda Ruby Support at Native Speed with Jets
Update 2018/12/12: Official Ruby Support was announced at AWS re:Invent 2018 on Nov 29! Jets has switched over to it: Official AWS Ruby Support for Jets. This article is now out-of-date and kept around only for posterity.
AWS Lambda does not yet support Ruby, though there are plenty of rumors that AWS is working on it. I'm pretty excited for the day when AWS releases official support for Ruby. Until that day arrives, we must use a shim to add Ruby support to AWS Lambda. A shim is a function written in a language that AWS Lambda natively supports, which calls out to Ruby. Jets uses a node shim to add Ruby support to AWS Lambda. The neat thing is that Jets adds Ruby support to AWS Lambda at pretty much native speed.
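To make the shim idea concrete, here is a minimal, hypothetical sketch of the Ruby side: the node function shells out to a script like this, passing the Lambda event as JSON and reading the JSON response back from stdout. This is purely illustrative, not Jets' actual implementation, and the file and handler names are made up.

# handler_shim.rb - hypothetical Ruby side of a node shim (illustrative only).
# A node function would invoke it per request, for example:
#   ruby handler_shim.rb '{"path":"/ruby_example"}'
# and read the JSON response printed to stdout.
require "json"

def handler(event:, context:)
  { statusCode: 200, body: JSON.generate(message: "Hello from Ruby", path: event["path"]) }
end

event = JSON.parse(ARGV[0] || STDIN.read)
puts JSON.generate(handler(event: event, context: {}))

Spawning a fresh Ruby process like this on every request is exactly the overhead problem described below; the point of Jets' approach is to avoid it.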
Performance Comparison
Right off the bat, here's a performance comparison.
Ruby function speed:
time curl -so /dev/null https://1192eablz8.execute-api.us-west-2.amazonaws.com/dev/ruby_example
real 0m0.164s
user 0m0.039s
sys 0m0.063s
Python function speed:
time curl -so /dev/null https://1192eablz8.execute-api.us-west-2.amazonaws.com/dev/python_example
real 0m0.178s
user 0m0.047s
sys 0m0.054s
In the case above, the Ruby function happened to be faster than the Python function. Generally, the response times are about the same.
To understand how Jets achieves this level of performance for Ruby support, it is useful to understand a little bit of how AWS Lambda works.
Cold Start Issues
The main issue with shims is that their overhead amounts to a cold start on every invocation. Cold starts are a well-known problem and have been documented by many people:
- Understanding AWS Lambda Coldstarts
- Solving the Cold Start Problem
- Dealing with cold starts in AWS Lambda
- Everything you need to know about cold starts in AWS Lambda
- Cold starting AWS Lambda functions
If your Lambda function is allocated only 128MB, the cold start overhead can be a few seconds. Even with officially supported languages like Java it can take longer, because loading the JVM runtime takes a little time. Worse yet, if your Lambda functions are connected to a VPC, we're talking about a 10+ second overhead penalty 🤦🏻‍♂️. This is one of the reasons it is recommended to use Lambda without the VPC feature when possible. Generally, AWS Lambda developers remedy the cold start problem by prewarming their application. Prewarming essentially takes advantage of how the AWS Lambda Execution Context works.
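Here is a minimal sketch of the general prewarming idea using the AWS SDK for Ruby. The function name and payload marker are made up, and the scheduling that would run this on an interval (for example, a CloudWatch Events rule) is left out; Jets' own built-in prewarming is covered further below.

# prewarm.rb - minimal prewarming sketch (illustrative; names are hypothetical).
require "json"
require "aws-sdk-lambda"

client = Aws::Lambda::Client.new(region: "us-west-2")
client.invoke(
  function_name: "demo-dev-ruby_example",   # hypothetical function name
  invocation_type: "Event",                 # async, fire-and-forget
  payload: JSON.generate({ _prewarm: "1" }) # marker so the handler can no-op
)

A scheduled ping like this keeps the Execution Context around so real requests skip the cold start.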
AWS Lambda Execution Context
What is the AWS Lambda Execution Context?
AWS Lambda functions execute in what is called the Execution Context. From the official AWS docs:
Execution Context is a temporary runtime environment that initializes any external dependencies of your Lambda function code, such as database connections or HTTP endpoints. This affords subsequent invocations better performance because there is no need to "cold-start" or initialize those external dependencies.
It takes time to set up an Execution Context and do the necessary "bootstrapping", which adds some latency each time the Lambda function is invoked. You typically see this latency when a Lambda function is invoked for the first time or after it has been updated because AWS Lambda tries to reuse the Execution Context for subsequent invocations of the Lambda function.
After a Lambda function is executed, AWS Lambda maintains the Execution Context for some time in anticipation of another Lambda function invocation.
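To see that reuse in action, here is a tiny, self-contained handler (illustrative only, not Jets-specific). Top-level code runs once per cold start, and its state survives across warm invocations:

# Code at the top level runs once per Execution Context.
require "json"

BOOTED_AT = Time.now     # set once per cold start
COUNTER = { count: 0 }   # survives across warm invocations

def handler(event:, context:)
  COUNTER[:count] += 1
  { statusCode: 200, body: JSON.generate(booted_at: BOOTED_AT, invocations: COUNTER[:count]) }
end

On warm invocations, booted_at stays the same and the counter keeps climbing; after a cold start, both reset.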
Now that cold starts and the Lambda Execution Context have been covered, you can start to see how Jets achieves a high level of performance for Ruby support.
Ruby Support at Native Speed
Jets takes advantage of how AWS Lambda works and its Execution Context. Jets loads and keeps the Ruby interpreter in the Execution Context's memory, essentially giving Ruby native-like speed. Additionally, Jets has built-in Prewarming Support. This makes running Ruby on AWS Lambda pretty much native speed.
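In contrast to the one-shot shim script sketched earlier, here is a conceptual sketch of a resident Ruby process (again, not Jets' actual shim protocol). The interpreter and application code load once per Execution Context, so each warm invocation costs only a JSON round trip:

# resident_ruby.rb - conceptual sketch only, not Jets' actual shim protocol.
# The node shim would start this process once per Execution Context and keep
# it running, feeding it one event per line and reading one response per line.
require "json"

def handler(event:, context:)
  { statusCode: 200, body: JSON.generate(message: "Hello from resident Ruby") }
end

$stdout.sync = true
while (line = $stdin.gets)
  event = JSON.parse(line)
  puts JSON.generate(handler(event: event, context: {}))
end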
Check out the Live Demo.
More info
- For a Jets introduction: Introducing Jets: A Ruby Serverless Framework.
- More info is available at the Jets documentation site.
Jets Links and Tutorial Series
- Introducing Jets: A Ruby Serverless Framework
- Toronto Serverless Presentation: Jets Framework
- Jets Afterburner: Serverless Rails in 5 Minutes
- Mega Mode: Rails on AWS Lambda
- An Introductory CRUD App Part 1
- Deploy to AWS Lambda Part 2
- Debugging Logs Part 3
- Background Jobs Part 4
- IAM Policies Part 5
- Function Properties Part 6
- Extra Environments Part 7
- Different Environments Part 8
- Polymorphic Support Part 9
- Jets Delete Tutorial
- Jets Image Uploads Tutorial with CarrierWave
- Cron Job Tutorial: Backup Route53
- Build an API with the Jets Ruby Serverless Framework
Thanks for reading this far. If you found this article useful, I'd really appreciate it if you shared it so others can find it too. Thanks! Also follow me on Twitter.
Got questions? Check out BoltOps.