Serverless computing, or FaaS (Functions-as-a-Service), lets developers focus on building event-based applications on a function-by-function basis while the platform takes care of deploying, running and scaling the code. It doesn't mean that no servers are involved, but rather that developers don't need to worry about traditional infrastructure issues and operational details, such as provisioning, maintaining and scaling servers.
A recent DevOps Pulse Survey by Logz.io showed that only 30% of respondents were currently using serverless. Compared to Docker, which was quickly put to use and changed how developers built code, serverless is taking more time to catch on. Nonetheless, the general usage trend is pointing upwards.
There are various serverless products or frameworks in the market, and in this post, we’ll do a close comparison of the three major cloud platforms.
The "serverless revolution" started with the launch of Amazon's Lambda in 2014, and AWS is often thought of as synonymous with the idea of serverless itself. It wasn't until 2016 that Google introduced Cloud Functions, and shortly thereafter, Microsoft released Azure Functions, although Microsoft is now ahead of Google in terms of its development curve.
These are the three main providers of serverless platforms, and their offerings are being put to use across the entire backend services of massive IT organizations like Netflix and Dropbox.
AWS Lambda
AWS is seen as the originator of the serverless concept, and has been the most successful at completely integrating it into its cloud portfolio. To activate Lambda, according to its site, “just upload your code and Lambda takes care of everything required to run and scale your code with high availability”.
Lambda offers native support for a range of runtime environments, including Node.js (JavaScript), Python and C#, and lets you write a wrapper around Go, PHP, or Ruby projects so that their code can also be executed when triggered, scaling precisely with the size of the workload.
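As a minimal sketch of what that looks like in practice (the handler name and response below are illustrative placeholders, not taken from the article), a Node.js Lambda function is simply an exported handler that receives the triggering event:

```javascript
// index.js: a minimal Node.js Lambda handler (illustrative sketch).
// Lambda invokes the exported handler with the triggering event, a context
// object, and a callback used to return a result or an error.
exports.handler = (event, context, callback) => {
  console.log('Received event:', JSON.stringify(event));

  // Do whatever work the event calls for, then hand back a response.
  callback(null, { message: 'Hello from Lambda' });
};
```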
Lambda acts as a gateway to almost every other cloud service Amazon offers. Integration with S3 and Kinesis permits log analysis, on-the-fly filtering, video transcoding and backup, all set off by activity in those AWS services. The DynamoDB integration supports an additional layer of triggers for operations completed outside the real-time ecosystem, allowing you to use Lambda to perform data validation, filtering, sorting or other transformations on data changes and load the transformed data into another data store. Additionally, Lambda can behave as the full backend service of a web, mobile, or IoT application, receiving client requests through Amazon API Gateway.
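A rough sketch of that DynamoDB pattern might look like the following; the attribute names and the downstream data store are assumptions made purely for illustration:

```javascript
// Illustrative sketch: a Lambda handler attached to a DynamoDB Stream.
// Each invocation receives a batch of change records (INSERT, MODIFY, REMOVE).
exports.handler = (event, context, callback) => {
  const transformed = event.Records
    .filter((record) => record.eventName === 'INSERT' || record.eventName === 'MODIFY')
    .map((record) => {
      const item = record.dynamodb.NewImage; // DynamoDB attribute-value map
      return {
        id: item.id.S,               // assumes an 'id' string attribute
        total: Number(item.total.N), // assumes a numeric 'total' attribute
      };
    });

  // In a real function you would validate `transformed` and load it
  // into another data store (S3, Redshift, Elasticsearch, etc.).
  console.log('Transformed records:', JSON.stringify(transformed));
  callback(null, `Processed ${event.Records.length} records`);
};
```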
Lambda lets you trigger your functions from a wide range of sources, including Amazon's Alexa, which can fire Lambda events to analyze voice-activated commands. Additionally, Amazon lets users define workflows for more complex triggers, including decision trees for alternative business use cases originating from the same function.
In terms of pricing for Lambda, you only pay for the compute time you consume, one of the main advantages of using serverless platforms generally. However, as Asaf Yigal pointed out in his recent comparison post for Logz.io, this presents a fundamental problem for applications developed using a serverless architecture in terms of maintaining the state of a function. As you only pay for function execution duration, "the natural behavior of the platform is to shut down functions as soon as they have completed their task… This presents a difficulty for developers, as functions cannot use other functions' knowledge without having a third-party tool to collect and manage it."
Amazon recently released a new module, AWS Step Functions, to address this issue head on. It logs the state of each function so that it can be used by other functions or for root-cause analysis.
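As a rough sketch of how that state is carried between functions, a Step Functions execution can be started from code with the AWS SDK for Node.js, passing an input document that each step in the workflow then receives and enriches (the state machine ARN and input fields below are placeholders):

```javascript
// Illustrative sketch using the AWS SDK for Node.js (aws-sdk v2).
const AWS = require('aws-sdk');
const stepFunctions = new AWS.StepFunctions();

// Kick off a workflow; the input becomes the state passed to the first function.
stepFunctions.startExecution({
  stateMachineArn: 'arn:aws:states:us-east-1:123456789012:stateMachine:OrderPipeline', // placeholder ARN
  input: JSON.stringify({ orderId: '42', status: 'received' }),
}, (err, data) => {
  if (err) {
    console.error('Failed to start execution', err);
    return;
  }
  // The execution ARN can later be used to inspect each step's state
  // for root-cause analysis.
  console.log('Started execution:', data.executionArn);
});
```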
Amazon offers sub-second metering, charging for function triggering and execution in 100 ms units. The first one million requests each month come free, and every 100,000 requests thereafter cost $0.02, with the total bill apportioned between request and compute (duration) charges.
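Using only the request pricing quoted above (and ignoring the separate compute charge), a back-of-the-envelope monthly estimate is straightforward:

```javascript
// Rough monthly request-cost estimate for Lambda, based on the figures above.
const FREE_REQUESTS = 1000000;   // first 1M requests per month are free
const PRICE_PER_100K = 0.02;     // $0.02 per 100,000 requests thereafter

function estimateRequestCost(monthlyRequests) {
  const billable = Math.max(0, monthlyRequests - FREE_REQUESTS);
  return (billable / 100000) * PRICE_PER_100K;
}

// e.g. 10 million requests a month: (10M - 1M) / 100k * $0.02 = $1.80
console.log(estimateRequestCost(10000000)); // 1.8
```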
Microsoft Azure Functions
Prior to the official release of Azure Functions, Microsoft's cloud platform already offered various serverless services. Its platform-as-a-service (PaaS) features took on much of the work for you and allowed you to scale up as needed. As a developer, you could size and pay according to the processing required, rather than the physical resources used.
Azure Functions was rolled out in March 2016. Microsoft is working to close the functionality gap with AWS, but Azure Functions offers a narrower scope in terms of overall functionality, largely because it lacks Amazon's broad cloud portfolio.
However, it does offer a range of practical functionality and powerful integrations. Through its Create Functions tool, Microsoft allows you to create functions in native languages such as C# and JavaScript inside the web functions editor, or to upload them from your favoured development tool, with additional support for languages such as PHP or Batch.
Azure has easy integrations with a range of external services, such as VS Team Services, Bitbucket and GitHub, allowing code to be deployed to the cloud.
Azure Functions supports an event-driven approach, meaning that whenever something interesting happens within the cloud environment, you can trigger a related function. CRON expressions enable timer-based events for scheduled tasks, while events in Microsoft's SaaS services, such as SharePoint or OneDrive, can be set up to trigger operations in Functions.
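For instance, a timer-triggered function in JavaScript is just an exported function that receives the context and the timer object; the CRON schedule itself lives in the function's function.json binding (the five-minute schedule below is an assumption for this sketch):

```javascript
// Illustrative sketch of a timer-triggered Azure Function (JavaScript).
// The accompanying function.json binds a timerTrigger with a CRON expression,
// e.g. "0 */5 * * * *" to run every five minutes (assumed for this example).
module.exports = function (context, myTimer) {
  if (myTimer.isPastDue) {
    context.log('Timer is running late');
  }

  context.log('Scheduled task executed at', new Date().toISOString());
  context.done(); // signal completion to the Functions runtime
};
```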
Microsoft is now attempting to serve less technical users by making serverless simpler and less daunting for non-coders. Microsoft calls its Logic Apps a "workflow orchestration engine in the cloud," which allows you to set up and manage data-processing tasks and assign workflow paths.
Azure's cloud function usage is billed in the same way as Amazon's, calculating the total cost from both the number of triggers and execution time. The same pricing structure also applies: the first one million requests are free; beyond that limit, it costs only $0.02 for every 100,000 executions and another $0.02 per 100,000 GB-s (gigabyte-seconds) of compute.
Microsoft acquired Cloudyn, a producer of cloud management and cost optimization software, in June 2017 to add to its cloud platform. In October, Microsoft launched Azure Cost Management, which allows you to track cloud usage and expenditure for Azure resources and other cloud providers, including AWS and Google, enabling users to make informed decisions on utilization and cloud efficiency.
Google Cloud Functions
Google was the last of the big three to add a serverless option, first emerging in Alpha in February 2016 and not hitting Beta until March 2017. Google is still building out its serverless platform, and the current iteration remains very limited compared to the other two, allowing functions to be written only in JavaScript and executed in a standard Node.js runtime environment. Events can only be triggered on Google's internal event bus, Cloud Pub/Sub topics. Webhooks via HTTP triggers are also supported, in addition to mobile events from Firebase, Google's mobile platform for app developers. Python programmer David Mytton, writing for A Cloud Guru, noted that "the focus of Google's serverless ambitions seem to be Firebase, not Cloud Functions".
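A minimal sketch of both trigger types might look like the following (the function names and payload handling are illustrative, not taken from the article): an HTTP-triggered function receives Express-style request and response objects, while a background function receives the Pub/Sub event.

```javascript
// Illustrative sketches of Google Cloud Functions in Node.js.

// HTTP trigger (webhook), e.g. deployed with --trigger-http.
exports.helloHttp = (req, res) => {
  res.status(200).send(`Hello, ${req.query.name || 'world'}`);
};

// Background function triggered by a Cloud Pub/Sub topic.
// The message payload arrives base64-encoded in event.data.data.
exports.helloPubSub = (event, callback) => {
  const message = event.data && event.data.data
    ? Buffer.from(event.data.data, 'base64').toString()
    : 'no payload';
  console.log('Pub/Sub message:', message);
  callback();
};
```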
Logs are automatically emitted from your Cloud Functions and written to the Stackdriver logging tool, which performs at scale and is capable of ingesting application and system log data from thousands of VMs; however, some developers say it does not supply all the information and data that serverless users require to make the most informed decisions on utilization and efficiency.
Google is also missing integrations with storage and other key cloud services that assist with business-related triggers, but the most significant problem is that Google currently restricts projects to having fewer than 20 triggers.
Finally, Google charges the most. After one million free requests, its prices are double those of AWS and Microsoft: $0.04 per 100,000 invocations, plus $0.04 per 100,000 ms of compute time.
In Conclusion
Popular opinion still favors AWS because of the sheer size of its product portfolio. When it comes to serverless, what matters most is the availability, and consumption, of other services within the cloud provider's ecosystem, and it is currently hard for Microsoft and Google to match AWS' extensive offerings in this arena. Being able to execute functions in response to events is only as useful as the range of events and services those functions can connect to. AWS also continues to rapidly develop new features, such as its voice platforms.
For all three, challenges remain around security, monitoring, deployment and management of upgrades.
Before you make a major decision, however, remember that all three services are continually evolving.
Another tool that can help with the decision is the Serverless Cost Calculator developed by Peter Sbarski and the A Cloud Guru team, which estimates eventual costs based on the predicted number of executions and average execution time.
If execution time exceeds goals, request numbers spike, and triggers are left unmonitored, costs can quickly multiply and reduce the value of serverless altogether.
Monitoring tools to process and analyse logs and metrics are also important. Some of the most common include the ELK Stack, Datadog and New Relic.