Digital Matters: Serverless is the New Sexy

02 October 2018

Digital Isle of Man board member, Phil Adcock, discusses datacentres and the future of serverless computing.

Whilst it may be common knowledge that the Isle of Man is fortunate enough to boast six Tier 3 datacentres (certainly a significant number for our size), it may be less well known that the Island can generate up to 200% of peak electricity demand, meaning that these crucial pieces of infrastructure can meet and exceed customers’ power requirements.

The Isle of Man hosts a significant amount of data storage for e-gaming and digital businesses, so it’s essential for the Island to continue developing its datacentre infrastructure for businesses around the world. The Island’s new executive agency, Digital Isle of Man, has a mandate to support product development across our significant and thriving digital sector.

If you work in IT, it is likely you have either visited a datacentre or managed servers hosted in one. Datacentres are designed to solve one particular problem: outsourcing the headache of delivering redundant power, data communications and environmental controls at an affordable cost. Their growth was initially fuelled by the commercialisation of the internet and more recently by an explosion in web applications, mobile apps and a huge increase in the data that we create and consume.

Our appetite for data looks set to continue. Research group IDC have predicted that by 2025, the world will be creating over 160 zettabytes of data a year, which is rather a lot. A zettabyte is a trillion gigabytes by the way, in case you were reaching for Google. From an app perspective, the fact that mobile apps are relatively easy to create and cheap to buy has translated into spectacular growth. It is estimated that there are nearly six million apps housed across Google’s and Apple’s stores. As a case in point, the Google Play store gained 100,000 apps in the first quarter of 2018 alone.

In many cases, the architectures of modern web and mobile apps differ from traditional server-based applications. They are frequently event driven, tend to consume third party services, have rapid release cycles and often have to scale quickly. Datacentres have evolved to accommodate these changes with a range of subscription services that better suit the demands of these applications.

An obvious example is cloud computing. It has been with us for a number of years now, and the idea of outsourcing hardware management to a datacentre and running virtual machines is part of everyday life for an IT engineer. Cloud computing brings many benefits. Buying and maintaining hardware becomes someone else’s problem, and we don’t have to worry about the capital expense of new servers when we need more computing power. Cloud computing also makes much better use of server resources: a non-virtualised physical server typically uses less than 10 percent of its capacity, whereas in a virtualised environment you can bundle many virtual servers onto one piece of hardware and put far more of it to work. It is also easy to start small and align your costs with growth, which keeps the CFO happy.

Virtualisation doesn’t solve every problem, of course. A decision still has to be made about the resources a virtual machine needs, and capacity planning remains an issue, even though it is much easier to scale when you hit resource limits. There is also the problem of server sprawl: many administrators fall into the trap of creating a server for everything because it is so easy, and soon find that instead of ten servers they are managing thirty. Each virtual machine also has an operating system to manage, which requires in-house expertise.

This is where the concept of containerisation comes in. A container provides a lightweight, self-contained environment for a piece of software to run in. Think of a container as a packaging mechanism which bundles an application with its libraries, dependencies, configuration files and binaries into one portable capsule. From the inside, a container behaves much like a virtual machine: you can log in, configure IP addresses and network interfaces, mount file systems and so on.

The fundamental difference between containers and virtual machines is that containers abstract the operating system rather than the hardware. Put another way, multiple containers run on a shared operating system, which makes them lightweight and efficient; a virtual machine, by contrast, has to bundle a complete operating system with its applications. Think megabytes rather than gigabytes. This means that a server can host significantly more containers than virtual machines, and containers can easily be moved across computing environments. Since there is no operating system to boot, a container can be created in seconds rather than minutes, which allows much quicker reactions to an increased workload. Containers are also a natural fit for the microservices architectural style common in web applications, where software is split into smaller modules communicating via an API. Almost every company that deploys web services on a large scale is using some form of containerisation.

The most popular container environment by far is Docker, primarily because of its ease of use, wide range of tools and access to Docker Hub, which you can think of as an app store for containers. Other container solutions exist, two of the more popular being rkt (pronounced “rocket”) and LXD (pronounced “lex-dee”).

Containers come with their own set of challenges, of course. Since they run on a shared operating system kernel, they are less isolated than virtual machines and potentially less secure. Managing a large number of containers can become complicated, although this is mitigated to some extent by orchestration tools such as Docker Swarm, Apache Mesos and Kubernetes, which originated at Google. Networking can also be tricky, and storage isn’t persistent by default: if you want a database that outlives its container, you will have to mount some external storage.

More recently, the concept of serverless computing has appeared, putting additional distance between the software and the infrastructure that runs it. The idea of serverless is to let developers focus on their code and let the cloud provider worry about where to run it and how to resource it. Often described as “function as a service”, serverless lets you upload a small piece of code (for example a Python function) to a cloud service provider, which executes it on demand. Serverless is event driven and execution is usually based on a trigger, such as a file being uploaded to a particular location or a web request arriving.
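To make that a little more concrete, here is roughly what such a function might look like, sketched in the style of an AWS Lambda Python handler. It is a minimal illustration rather than a drop-in implementation: the fields read from the event are assumptions, and the exact schema varies between providers and trigger types.

```python
# A minimal "function as a service" handler, sketched in the style of an
# AWS Lambda Python handler. The fields read from `event` are illustrative
# assumptions; each provider defines its own event schema per trigger type.
import json

def handler(event, context):
    # The platform calls this function on demand, passing details of the
    # triggering event (an HTTP request, a file upload, etc.) in `event`.
    name = event.get("name", "world")

    # Return a simple HTTP-style response for the platform to pass back.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

There are no servers to provision or patch here: you upload the function and the provider runs it whenever the trigger fires.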

It’s a bizarre paradigm to get your head around at first, but if you think about it, serverless is an incredibly powerful approach to developing web apps. First of all, serverless encourages you to embrace third party services, which means that applications can be developed very quickly. Third party services cover a wide range of needs, including authentication (Auth0), file storage (Amazon S3), databases (MongoDB Atlas, Amazon RDS) and payments (PayPal, Stripe), to name but a few. If there is already a well-trodden path, why not outsource those functions to other providers and focus on your core product?

Where serverless gets particularly interesting is that it lets you create a chain of triggers, functions and services which are executed automatically based on an event. As a simple example, consider a web application which reformats images uploaded by a user. The trigger event would be the image being uploaded into a particular directory on a cloud storage service. The trigger runs a function which reformats the image and places it in an output directory, which in turn triggers a second function that emails the user with a download link. Triggers can include changes in system state, database updates, log file changes and device telemetry, which makes serverless a perfect fit for the Internet of Things, where sensor feedback can be used to trigger the execution of code. Companies such as Coca-Cola, Thomson Reuters, Netflix and Expedia have already developed complex web apps based on a serverless architecture.
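As a rough sketch of the first step in that chain, the reformatting function might look something like the code below. It loosely follows the shape of a Python function triggered by an upload to Amazon S3, using boto3 and Pillow; the bucket name and event fields are illustrative assumptions rather than a ready-made implementation.

```python
# Sketch of the image-reformatting step described above: triggered when an
# image lands in an input bucket, it resizes the file and writes it to an
# output bucket, which would in turn fire the second (email) function.
# Bucket names and event fields are illustrative assumptions.
import os
import boto3               # AWS SDK for Python
from PIL import Image      # Pillow, which does the actual image work

s3 = boto3.client("s3")
OUTPUT_BUCKET = "resized-images-example"   # hypothetical destination bucket

def reformat_image(event, context):
    # Pull the uploaded object's location out of the trigger event.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Download, shrink to a web-friendly size and re-encode as JPEG.
    local_path = os.path.join("/tmp", os.path.basename(key))
    s3.download_file(bucket, key, local_path)
    img = Image.open(local_path).convert("RGB")
    img.thumbnail((1024, 1024))
    img.save(local_path, format="JPEG")

    # Writing the result to the output bucket is the event that triggers
    # the second function, which emails the user a download link.
    s3.upload_file(local_path, OUTPUT_BUCKET, key)
    return {"status": "resized", "key": key}
```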

Serverless also enables continuous scaling, supporting highly parallelised pipelines of triggers and functions. This allows your web app to respond effortlessly at peak times. The point is that the platform scales for you without requiring any intervention.

With serverless you only pay for the compute time you use. Providers typically support sub-second metering, meaning that you are billed for fractions of a second rather than by the hour, and you don’t pay for servers sitting idle, waiting for requests to come in. Well-written code can drive substantial cost savings and potentially allow the creation of disruptive pricing models when competing against a web app written using more traditional methods.
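As a back-of-the-envelope illustration of why that matters, the figures below are made-up round numbers rather than any vendor’s actual price list, but they show how paying per invocation can compare with keeping a modest server running around the clock.

```python
# Illustrative cost comparison only: the per-request and per-GB-second rates
# and the always-on server price are assumed round numbers, not real tariffs.
requests_per_month = 2_000_000
avg_duration_s = 0.2               # each invocation runs for about 200 ms
memory_gb = 0.5                    # 512 MB allocated to the function

price_per_million_requests = 0.20  # assumed rate, USD
price_per_gb_second = 0.0000167    # assumed rate, USD

gb_seconds = requests_per_month * avg_duration_s * memory_gb
serverless_cost = (requests_per_month / 1_000_000) * price_per_million_requests \
    + gb_seconds * price_per_gb_second

always_on_cost = 50.0              # assumed monthly cost of an idling virtual server

print(f"Serverless: ${serverless_cost:.2f}/month vs always-on: ${always_on_cost:.2f}/month")
# With these assumptions the serverless bill comes to a few dollars a month.
```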

AWS Lambda is probably the most mature serverless platform at this time, although there are many others available, including Microsoft Azure Functions, IBM’s OpenWhisk and Google Cloud Functions.

Serverless isn’t a magic bullet for every application. It runs in a public cloud, which means it is likely to be deemed inappropriate for some mission-critical or sensitive applications. You are also tied to the languages a particular vendor supports (Python and Node.js are common) and are at the mercy of vendor updates to libraries: if the platform is upgraded to support a newer version of your chosen development language, there is a risk that this will introduce bugs or other incompatibilities that break your code. Building an application in a distributed fashion can also introduce problems due to increased network latency between the various components. There is also the consideration of vendor lock-in, particularly if you become dependent on a particular set of third party services.

That said, serverless computing is definitely worth a look. If you’re a developer, I’d encourage you to create an account, build a function and have a play. You never know, you might find that a local datacentre will support serverless soon.
