faastRuby 0.5 is out!

Build, test and run serverless applications in Ruby & Crystal with faastRuby Local.

The Challenge

Serverless is the next evolution of compute, but first-generation offerings have a major problem:


Existing serverless offerings suffer from the “Cold Start” problem, limiting use cases to glue code and APIs

The Solution

faastRuby provides developers with everything they need to build serverless apps

No Cold Starts!

Your code is executed on demand inside always-warm "Runners," meaning you can build full-featured web applications with functions.
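As a sketch of what building with functions looks like, here is a minimal Ruby handler. This is illustrative only: the `render` helper is stubbed so the sketch runs standalone, and the event hash shape is an assumption, not the platform's exact API.

```ruby
# Stand-in for the platform's render helper so this sketch runs on its
# own; the real helper's signature may differ.
def render(text: nil, status: 200)
  { status: status, body: text }
end

# handler.rb -- the entry point invoked for each HTTP request.
# The event hash shape here is illustrative.
def handler(event)
  name = (event[:query_params] || {})[:name] || "world"
  render text: "Hello, #{name}!"
end
```

Because each function is just a handler method, a web application becomes a set of small, independently deployed handlers.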

Focus only on code

Drop your code into function templates on your local machine, and it synchronizes to the cloud in seconds

Simple Billing

Simple, DigitalOcean-style pricing: $5 per Runner per month

Serverless Software Development Platform

faastRuby is a Serverless Software Development Platform that dramatically accelerates the development of serverless functions. The platform provides everything developers need to develop, test and run their functions in the cloud.

Developers can work faster by focusing on code and business logic instead of wasting time with configuration, web servers, containers, API gateways or complicated frameworks.

Engineering Leaders can increase development velocity by removing virtually all of the operational impediments that traditionally hinder development team productivity. Ideas can become live prototypes in minutes.



Develop and test on your local machine from your favorite code editor. Optionally synchronize your local code changes to cloud-based staging environments instantly.


Deploy your functions to a "workspace" in the cloud. Workspaces are groups of your functions that can be used to mimic environments or to compose a set of routes for a distributed app built with functions.


When you invoke your functions via an HTTP request, the code is executed in a self-contained, isolated sandbox inside of an always-warm "Runner."

Runners are secure, stateless, always-running compute units capable of accessing any function code instantly at the moment of execution, eliminating cold start times.
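Since invoking a function is a plain HTTP request, calling one from Ruby needs nothing beyond the standard library. A sketch, where the base URL is a hypothetical placeholder for your workspace's actual endpoint (the `Benchmark: true` header, described in the FAQ below, asks the platform to report execution time):

```ruby
require "net/http"
require "uri"

# Build a GET request for a deployed function. The base_url here is a
# placeholder -- substitute your workspace's real endpoint.
def build_invocation(base_url, function_path, benchmark: false)
  uri = URI.join(base_url, function_path)
  request = Net::HTTP::Get.new(uri)
  # Optional header to have the platform report execution time,
  # useful when estimating how many Runners you need.
  request["Benchmark"] = "true" if benchmark
  request
end
```

Sending it is then the usual `Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }`.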

faastRuby Local

Everything you need to build & test serverless functions locally

FREE and OPEN SOURCE, faastRuby Local provides developers with everything they need to develop and test serverless functions on their local machine, from within their favorite code editor. Local code changes can be synchronized with live staging environments in the cloud, so you'll know exactly how your code will behave in production. When you're done, run:

$ faastruby deploy


Local Testing - Testing functions locally speeds up development and reduces errors

Cloud Sync - Changes you make to your code locally are deployed to the cloud in real time, so you can develop directly against live, production-like environments

Live Compile - faastRuby Local triggers a compile for Crystal functions every time you change their files

Integrations - Integrate faastRuby Local with your favorite code editors like VS Code, Vim and Atom, and with your preferred private or public Git providers

faastRuby Cloud

Manages the availability and scale of your functions in the cloud so that you don't have to

faastRuby is a Function-as-a-Service (FaaS) platform that autonomously manages the availability and scale of your functions in the cloud. Functions run inside of runners: secure, stateless, always-running compute units capable of accessing any function code instantly at the moment of execution, eliminating cold start times.


No Ops Required - Your functions are available via HTTP endpoints that are fully managed by the platform, so you don't have to think about API gateways.

No Cold Starts - Runners are always available to handle requests, so functions are executed instantaneously.

Schedule Periodic Runs - The platform has a built-in scheduler. Write the schedule in plain English and faastRuby will run your functions for you.

Self Hosted

Run the faastRuby Platform on your own infrastructure

faastRuby Self Hosted brings the speed and power of faastRuby Cloud to your own infrastructure. Via native integrations with popular container orchestration engines, self-hosted faastRuby reproduces the same developer experience as faastRuby Cloud, but in your data center or public cloud VPC.


Your Infrastructure, Optimized - Leverage existing contractual arrangements with IaaS clouds, or deploy in your own data center and benefit from faastRuby's highly efficient resource consumption.

Bring Your Own FaaS - Bring the power of faastRuby to your favorite infrastructure provider(s) for a true multi-cloud experience

Use Your Favorite Orchestrator - faastRuby integrates with popular container orchestration systems like Docker Swarm, with Kubernetes and OpenShift support coming soon



Free Tier

1 FREE runner / month


faastRuby Local

Cloud Sync

Managed Infrastructure

Community Support


$5 / runner / month


faastRuby Local

Cloud Sync

Managed Infrastructure

Priority Email Support

Self Hosted

Contact us


faastRuby Local

Cloud Sync

Self Managed Infrastructure

Priority Email & Chat Support


How does hourly pricing work?

Runners are billed hourly up to a monthly cap of 720 hours. At least one Runner must always be assigned to your account in order for your functions to be called.

If you use an additional Runner for fewer than 720 hours during the month, you will be billed for each hour you used it. If you use a Runner for more than 720 hours that month, you will be billed the monthly price.

We calculate Runner pricing this way so that consistent month-to-month usage results in a consistent invoice.
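The rule above can be sketched in a few lines of Ruby. The hourly rate is assumed here to be the $5 monthly price spread over the 720-hour cap; the page does not state the exact per-hour figure, so treat it as an illustration of the capping rule rather than the platform's billing code.

```ruby
# Hourly billing with a monthly cap, as described above.
MONTHLY_CAP_HOURS = 720
MONTHLY_PRICE     = 5.00
HOURLY_RATE       = MONTHLY_PRICE / MONTHLY_CAP_HOURS  # assumed: $5 / 720 hours

# Charge for one Runner given the hours it was in use this month.
def runner_charge(hours_used)
  billable = [hours_used, MONTHLY_CAP_HOURS].min  # never billed past the cap
  (billable * HOURLY_RATE).round(2)
end
```

So a Runner used for half the month costs $2.50, and one used all month costs the flat $5, which is why consistent usage yields a consistent invoice.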

How many Runners will I need for my functions?

Runners execute functions as soon as they are triggered. You should try to keep your functions slim - the faster they are, the more requests per second you can serve with a single Runner. To calculate how many Runners you will need, first deploy your functions to a cloud workspace and watch their execution time by making requests with the header Benchmark: true. Then estimate how many requests a Runner could serve per second based on the execution time of each function. Divide the number of requests per second you would like to be able to serve by the number you calculated previously and that will give you the number of Runners you need.

For example, say you have a function that executes in 100ms. That means one Runner can serve that function at a rate of up to 10 req/s. If you want to serve it at up to 20 req/s, you will need at least 2 Runners.
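The sizing formula above is simple enough to capture directly; this helper just restates the arithmetic from the example:

```ruby
# Back-of-the-envelope Runner sizing: divide the target request rate by
# the rate one Runner can sustain for a function of the given duration.
def runners_needed(exec_time_ms:, target_rps:)
  per_runner_rps = 1000.0 / exec_time_ms  # e.g. 100ms => 10 req/s per Runner
  (target_rps / per_runner_rps).ceil      # round up to whole Runners
end

runners_needed(exec_time_ms: 100, target_rps: 20)  # => 2
```

Measure `exec_time_ms` with the `Benchmark: true` header as described above, using the slowest function on each route you care about.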

What happens when all my Runners are busy and other requests arrive?

Runners will do the best they can to serve all your requests. When all your Runners are busy, each subsequent request is stored in a temporary queue and discarded if it waits more than 10 seconds for a free Runner. When the 10-second timeout is reached, the request fails with a 429 - Too Many Requests.
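Clients should therefore be prepared to retry on 429. A minimal client-side sketch (not part of the platform; `invoke` is any callable returning an HTTP status code, which in real use would perform the request against your function's endpoint):

```ruby
# Retry a function call that was shed with 429 - Too Many Requests,
# backing off exponentially between attempts.
def call_with_retry(invoke, attempts: 3, base_delay: 0.5)
  attempts.times do |i|
    status = invoke.call
    return status unless status == 429
    # Requests already queue for up to 10s server-side, so a short
    # client-side pause gives a Runner time to free up.
    sleep(base_delay * (2**i)) if i < attempts - 1
  end
  429  # still saturated after all attempts
end
```

If 429s are frequent rather than occasional, that is the signal to add Runners instead of retrying harder.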

Product 0.5.0