
It’s electric – Will serverless be the next big thing in 2019?

It is now the end of 2018, and it finally looks like the world got the message regarding the cloud. We are happy to have fully embraced the cloud from both perspectives: as a customer that adopted a cloud-first strategy years ago, and as a consulting company delivering solutions to customers who often choose cloud-based services.

KEY POINTS:

  • What’s next for app architecture?
  • How does serverless architecture work?
  • What are its key services in Azure?

Let’s ask ourselves: what’s ahead for us in 2019 and beyond? 

Let’s face it, there is no single solution that fits all. Organizations are at different maturity levels, and are fighting different levels of technical debt and inertia. This affects how we adopt new technology: how we view it and what works best for us.

There are some common elements and trends that often surface and affect how we think about solutions. They also affect what our work environment will look like and how applications will end up being built. One such trend is that visible, on-prem server infrastructure and its support models are becoming rare. They are all but disappearing from the picture.

What do you mean? Will my servers be gone?  

You might disagree with this approach. You might still be maintaining most of your solutions on servers in a co-location or at your office. Trust me, don’t hesitate: move your applications to Azure IaaS or another cloud provider as soon as you can. You will still continue to work with servers, but with new, future-proof processes and methodologies.

Don’t worry, old-fashioned servers won’t disappear overnight. They will likely still be around for a long time, as there are many solutions, apps and requirements that need to run on legacy infrastructure.

Remember, the server is invisible to the end user and consumer of our solutions. They have no idea, nor do they care, what type of system or infrastructure they are accessing.

How does it work?

Remember a decade ago, when you were setting up your email client, you had to enter a server name, IP, port, and a bunch of other information just to get email to work? Today, all you have to do is select an email provider (Google, Exchange, or iCloud) and enter your credentials.

The same goes for your business applications and solutions. In the past, an end-user may have even asked you for the IP of a server to connect to. That’s because back then, using the solution meant understanding the architecture and how it was deployed.

Today, deploying and using your solution is closer to plugging an appliance into the power grid. You plug it in and don’t think about how it works, as long as it works.

Tip #1: The servers and infrastructure that you use for your application give your organization little, if any, advantage and are invisible to end users. Don’t spend too much time thinking about or investing in them, unless you are in a niche market or have unique requirements for control, privacy or separation of services!

Did we kill servers with containers? 

Containers are our industry’s new darlings, and for good reason.

For those who don’t know, containers allow you to deploy your application together with key dependencies like libraries, additional software, and even configuration. Like a launch pad, they provide isolated areas for running your applications.

They offer many benefits:

  • better utilization of hardware
  • support for automation in development and operations
  • isolation of running processes
  • streamlined application deployment
  • abstraction of complexity away from developers, separating their accountability from that of whoever configures the container

…and much more.

You can imagine how much easier it is to work with cloud services when using containers. This really is the preferred use of cloud services for development and successful adoption. I wrote about it in one of my previous posts.

Sounds great, so what’s the catch?

Containers make deployment easier and speed up development and testing. But you still need to pay close attention to the infrastructure underpinning your application. 

Containers make it easier to scale. You can easily deploy, delete and rebuild your solution again. It is easy to script it, automate deployments and plug into automation pipelines.  

You still need to think about planning it, maintaining it, upgrading versions and patching it.  

I was painfully reminded of this by the latest bug in Kubernetes (the container orchestration platform), when all admins had to scramble to get their platforms up to date rather than work on end-user features and business value.

Tip #2: However detached the servers, or however automated the containers and their orchestration platform, they are still another manifestation of infrastructure, one that will become more and more invisible to users and developers.

We now have managed services offered by Azure and other providers. Azure Kubernetes Service makes it simpler to run and deploy your containers, but you are still working with the infrastructure to enable it.  

 Where is it all headed?  

Consumption-based, serverless apps!  

By now you must be thinking that I drank too much serverless Kool-aid!

If you look closely, the trend of consumption-based serverless apps is gaining traction. Many technology vendors and consumers are utilizing this model.

Serverless computing is a cloud-computing execution model in which the cloud provider acts as the server, dynamically managing the allocation of machine resources. Pricing is based on the actual amount of resources consumed by an application, rather than on pre-purchased units of capacity. It is a form of utility computing.

Source: Wikipedia 

There are still “servers” in a serverless offering. Your code needs to run somewhere, and all the components still need to be maintained.

In this model, servers and infrastructure disappear from your value chain and from your business application. They are hidden from your management layer, and it is no longer your responsibility or business to manage or build them.

The keyword here is utility. Suddenly, computing is becoming like electricity: you plug in and use.

Seems simple enough…

Do you know where your electricity comes from when you use it? Or how it is delivered?

Probably not!  

Then why should you care about computing resources running your code and providing business functions? Why should you worry about running them better than providers like Microsoft, Amazon or Google who have invested millions in the cloud infrastructure and built their businesses on it? 

Serverless service models based on consumption and on-demand code execution mark a major shift in business processes and applications. They provide the following benefits:

  1. Run on-demand business processes at a fraction of the cost of maintaining on-premises servers and processing power that must stay ready even though they are only needed occasionally.
  2. Introduce a very detailed per-process billing model: each business function or process can have a separate cost attached to it.  

Tip #3: The serverless model is bringing a major change in the way we build, operate and calculate the cost of running business processes and applications. It requires a new approach to software architecture and development. I recommend that you invest in the skillsets required to properly operate cloud-based, on-demand systems. This trend will only accelerate, and you want to be in the loop.

You might be thinking that serverless is only a fad. Remember, not more than a decade ago some said that no one would use the public cloud. Even earlier, some claimed that the iPhone and touch screens were just a trend too. History likes to repeat itself!

How to start? Where to look?  

At the moment, serverless is mostly connected to specific services like Azure Functions. After all, that’s where it started. This approach is more about architecture and patterns in software than a particular service or technology.  

Over time, new patterns and services will emerge. These will likely be built on top of what we now consider serverless.

Tip #4: Serverless is not a particular technology or service. It is an architectural and business pattern covering service design, architecture, and billing.

Here are five Azure services that form the crucial backbone of serverless applications.

Azure Functions  

Azure Functions provides a serverless compute environment. You can execute your code written in many languages, on-demand, and get billed only for the resources that it consumes. Start it up when needed and shut down when it’s finished.  

It can be triggered on demand, by events, or on a schedule. It will do the job and then shut down when done, without any provisioning of infrastructure or networking, or any other interaction from you. Just provide the code to execute.
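
To make this concrete, here is a minimal sketch of an HTTP-triggered function using the Python programming model. The function, query parameter and greeting are purely illustrative, not a recipe from this article.

```python
# A minimal sketch of an HTTP-triggered Azure Function (Python programming model).
# The "name" parameter and greeting are illustrative only.
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # This code runs only when the trigger fires, and you pay only
    # for the resources this single execution consumes.
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```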

Another thing that you may want to check out is the Durable Functions extension to Azure Functions. It allows you to write stateful functions (think of it as chaining functions and putting them in a defined order of execution). It does come in handy in some scenarios.
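
To give you a feel for that chaining, here is a rough sketch of an orchestrator written with the Durable Functions Python library; the activity names below are hypothetical.

```python
# A rough sketch of a chained, stateful workflow with Durable Functions.
# "FetchData", "TransformData" and "StoreData" are hypothetical activity functions.
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # Each step runs as a separate activity function, in a defined order.
    raw = yield context.call_activity("FetchData", None)
    clean = yield context.call_activity("TransformData", raw)
    result = yield context.call_activity("StoreData", clean)
    return result

main = df.Orchestrator.create(orchestrator_function)
```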

Azure Logic Apps  

Where Azure Functions provides serverless compute resources, Azure Logic Apps provide a serverless workflow environment where you can graphically create the entire process.  

The workflow can span multiple elements, execution of code, and connections to specific services and integration points.

One possible use of this feature is to call Azure Functions to process data. You can also run a container with your logic as part of an Azure Logic Apps workflow.
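
As an illustration, if your workflow starts with an HTTP request trigger, kicking it off from code is just a POST to the trigger’s callback URL. The URL and payload below are placeholders, not a real endpoint.

```python
# A minimal sketch of starting a Logic Apps workflow that begins with a
# "When an HTTP request is received" trigger. URL and payload are placeholders.
import requests

CALLBACK_URL = "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke"

payload = {"orderId": 1234, "status": "created"}  # example business event
response = requests.post(CALLBACK_URL, json=payload, timeout=30)
response.raise_for_status()
```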

This is a very powerful tool for building application flows quickly. If you want to see an example of it in action, you can check out our enterprise social network analytics solution, built and run in a serverless environment.

Azure Event Grid  

Where Azure Functions executes your code and Azure Logic Apps executes your entire workflow, you need something to bind them together with multiple signals and events. This is where Event Grid comes into the picture.

Event Grid combines multiple sources of events as Topics. It also allows you to catch these events and trigger actions through Event Subscriptions.

The receiver of these subscriptions might be your Azure Function or Logic App. It might also be another service, like a Storage queue, which can trigger further Azure Functions down the road.
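
To show the publishing side, here is a minimal sketch using the azure-eventgrid Python SDK; the topic endpoint, access key and event details are placeholders.

```python
# A minimal sketch of publishing a custom event to an Event Grid Topic.
# Endpoint, key and event fields are placeholders, not real values.
from azure.core.credentials import AzureKeyCredential
from azure.eventgrid import EventGridEvent, EventGridPublisherClient

client = EventGridPublisherClient(
    "https://<your-topic>.westeurope-1.eventgrid.azure.net/api/events",
    AzureKeyCredential("<topic-access-key>"),
)

event = EventGridEvent(
    subject="orders/1234",
    event_type="Contoso.Orders.Created",  # hypothetical event type
    data={"orderId": 1234},
    data_version="1.0",
)
client.send(event)  # subscribers (Functions, Logic Apps, queues) react to it
```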

Event Grid is a switchboard for events in your serverless architecture.  

Azure API Management  

When it comes to the cloud, there is always a question of security and access control. All the components mentioned above address this within the Azure platform security model.

In case you need an additional layer of publishing and access control, Azure API Management can help.

Because serverless execution models are gaining prominence, Microsoft has added a new consumption-based pricing model to Azure API Management. You don’t need to provision the service up front; it runs only when your serverless apps need it.
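
From the consumer’s point of view, calling an API published through API Management usually comes down to adding a subscription key header. Here is a minimal sketch; the gateway URL and key are placeholders.

```python
# A minimal sketch of calling an API exposed through Azure API Management
# with key-based access. The gateway URL and key are placeholders.
import requests

response = requests.get(
    "https://<your-apim-instance>.azure-api.net/orders/1234",
    headers={"Ocp-Apim-Subscription-Key": "<subscription-key>"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```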

Azure Cosmos DB  

There are a few different storage mechanisms you can use in your serverless architecture based on Azure.  

However, databases are still the way to go for many things. For now, Cosmos DB is not a serverless service itself, but it provides capabilities that can greatly improve the way you build serverless applications with an event-driven approach.
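
As a taste of how it fits in, here is a minimal sketch of writing an item with the azure-cosmos Python SDK; the account, database, container and item shape are placeholders. Writes like this can then surface on the change feed, which an Azure Function can pick up for event-driven processing.

```python
# A minimal sketch of writing an item to Cosmos DB (azure-cosmos SDK).
# Account URL, key, database, container and item shape are all placeholders.
from azure.cosmos import CosmosClient

client = CosmosClient(
    "https://<your-account>.documents.azure.com:443/",
    credential="<account-key>",
)
container = client.get_database_client("orders-db").get_container_client("orders")

# Each write can appear on the container's change feed, which an Azure Function
# can subscribe to and process as an event.
container.upsert_item({"id": "1234", "status": "created"})
```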

Tip #5: Serverless applications do not follow the standard app architecture you have used in the past. There are many elements you need to consider when designing them. Do not approach the process in the same way as before, and educate yourself in this area before going into it fully.

So, what do you think?  

So about that prediction for 2019 that I mentioned at the beginning:  

  • In 2019 we will see more focus from cloud vendors on releasing new services and adjusting existing ones for serverless architecture. It was evident at Amazon’s latest AWS re:Invent conference, and I expect the same at Microsoft’s Azure events next year.
  • We will also see more applications built from scratch in a serverless way. Today, many are still experimenting, but patterns are emerging.

Does that mean servers will go away?  

I don’t think so.  

We will still run IaaS machines for a while yet, either because of legacy apps or because of cases where the required computing power and type of operations favor VMs or containers. But in many instances, serverless will be the architecture of choice.

Get ready for the shift by sharpening your skills and informing your organization. Remember, if you need help, we will be happy to assist you on your serverless journey.

More than 20 years of experience in the IT market have taught me that very rarely is someone completely right or wrong about something. We make good bets based on our biases, experience and observations.

The serverless approach is emerging and gaining traction. It is visible in the media, at the conferences and in vendor announcements. 

OK, let’s make a deal!

For you – please bookmark this article and put a task on your to-do list to check it a year from now.  
For me – a year from now I will come back to this topic and provide an update on the status of serverless and how it developed through 2019.  

See you next year!

KEY TAKEAWAYS:

  1. Architecture is headed in the serverless direction in the near future.
  2. Because all operations are performed in the cloud, serverless simplifies on-site architecture, costs and security aspects of running and maintaining your apps.
  3. Azure provides 5 services that can help you build and manage your app easily in the cloud: Functions, Logic Apps, Event Grid, API Management and Cosmos DB.
  4. Preparation is everything. Before adopting serverless, gain the required skills and prepare your organization for the coming change. Feel free to contact us for help!
