
How to build next-gen applications with serverless infrastructure

Get answers to some of the most frequently asked questions about serverless architecture.

The cloud has changed the game for building applications. Engineering teams no longer have to settle for monolithic apps or hard-to-manage infrastructure. And with the right cloud service provider, jumping into the modern era of application building is straightforward.

[ Read: Managed services vs. hosted services vs. cloud services: What's the difference? ]

In this post, I'll explain why legacy applications fall short and how teams can upgrade their services. I'll also answer some of the most common questions to give you the confidence to choose the right serverless solution for your organization.

Stateless vs. stateful architecture designs

Monolithic applications traditionally depend on stateful architectures, in which user session information is stored on a server rather than in external persistent database storage. This server-based session storage gets in the way of modern application design.

The problem with stateful session storage is that session information can't be shared across multiple servers. Once a user establishes a connection to a particular server, they must stick with it until the transaction completes. IT teams also can't use "round-robin" load balancing to distribute traffic across multiple servers, and applications can't scale horizontally.

Fortunately, there is a better model that leverages stateless architectures. If companies put user session information in a persistent database layer rather than on local servers, state can be shared between microservices. This unlocks a wide range of performance benefits and the potential to leverage service-oriented architecture (SOA).
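To make the contrast concrete, here is a minimal sketch of externalized session state, assuming Redis as the shared store; the client library, hostname, key format, and TTL are illustrative assumptions, and any persistent database layer works the same way.

import json

import redis  # assumed client for the external session store; any shared database works

# Stateful (legacy): the session lives in this process, so every request
# must come back to the same server and the app cannot scale horizontally.
local_sessions = {}

# Stateless: the session lives in a shared, persistent store, so any
# server or function instance can handle any request.
store = redis.Redis(host="sessions.example.internal", port=6379)  # placeholder host

SESSION_TTL_SECONDS = 1800  # example 30-minute session lifetime

def save_session(session_id: str, data: dict) -> None:
    """Persist session state externally instead of in server memory."""
    store.setex(f"session:{session_id}", SESSION_TTL_SECONDS, json.dumps(data))

def load_session(session_id: str) -> dict:
    """Any instance can rehydrate the same session from the shared store."""
    raw = store.get(f"session:{session_id}")
    return json.loads(raw) if raw else {}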

[ Learn more about stateful vs stateless. ]

Service-oriented architecture (SOA)

Service-oriented architectures (SOAs) use components called "services" to create business applications. Each service has a specific business purpose and can communicate with other services across various platforms and languages. The goal of an SOA model is to implement loose coupling, enabling teams to optimize, enhance, and scale services individually. But the challenge with SOA is that it requires more nuanced governance.

Simple Object Access Protocol (SOAP) and Representational State Transfer (REST) are the two primary service architectures used to implement SOA with appropriate governance. SOAP is older, heavier, and relies on XML for data exchange. REST is newer, more efficient, and allows JSON, plain text, HTML, and XML data exchanges. The good news is that going the REST route is straightforward if you use the API management services many cloud providers offer.

API management tools allow engineers to put RESTful endpoints in front of serverless functions and expose them as microservices. Going stateless with REST sets the stage for game-changing microservices-based applications.
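
As an example, a serverless function exposed through a RESTful endpoint might look like the sketch below. The event and response shapes follow the common API-gateway-to-function pattern (path parameters in, statusCode and body out), but the exact field names depend on your provider, so treat them as assumptions.

import json

def handler(event, context):
    """Hypothetical GET /orders/{order_id} microservice behind an API management layer."""
    order_id = (event.get("pathParameters") or {}).get("order_id")
    if not order_id:
        return {"statusCode": 400, "body": json.dumps({"error": "order_id is required"})}

    # A real service would look this up in its own data store; the record
    # below is a stand-in so the sketch is self-contained.
    order = {"order_id": order_id, "status": "shipped"}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(order),
    }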

Microservices best practices

The clear advantage of microservices is that discrete business functions can scale and evolve independently with dedicated IT infrastructure. Engineers can also maintain a much smaller surface area of code and make better choices when supporting individual services.

Best practices for designing microservices architecture include:

  • Creating a separate data store so that individual teams can choose the ideal database for their needs
  • Keeping servers stateless to enable scalability
  • Creating a separate pipeline for each microservice so that teams can make changes and improvements without impacting other services that are running
  • Deploying apps in containers to ensure standardization across services
  • Using blue-green deployments to route a small percentage of traffic to new service versions and confirm they work as expected (see the sketch after this list)
  • Monitoring environments continuously to be proactive in preventing outages
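
To make the traffic-splitting idea in the deployment bullet concrete, here is a minimal sketch of weighted routing between a stable and a new version of a service. In practice your load balancer or API management layer handles this; the service URLs and the 95/5 split are assumptions for illustration.

import random

# Example split: send 5% of requests to the new ("green") version and the
# rest to the stable ("blue") version. The URLs are placeholders.
TARGETS = [
    ("https://orders-blue.example.internal", 0.95),
    ("https://orders-green.example.internal", 0.05),
]

def pick_target() -> str:
    """Choose a deployment target according to the configured traffic split."""
    roll = random.random()
    cumulative = 0.0
    for url, weight in TARGETS:
        cumulative += weight
        if roll < cumulative:
            return url
    return TARGETS[0][0]  # fall back to the stable version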

When deploying this in a serverless fashion, engineers have much less infrastructure to manage themselves. They can focus more on application code and less on operating systems, server specifications, autoscaling details, and overall maintenance. Serverless microservices with stateless session storage using REST are the core ingredients of next-gen applications.

3 frequently asked questions about serverless architecture answered

Some aspects of this strategy can confuse new adopters, such as the need for containers and how to deploy event-driven applications. I'll touch on these concerns next.

1. Why do you need containers if you have instances?

When explaining the benefits of serverless infrastructure and containers, I'm often asked why you need containers at all. Don't instances already provide isolation from underlying hardware? Yes, but containers provide other important benefits.

Containers allow users to fully utilize virtual machine (VM) resources by hosting multiple applications (on distinct ports) on the same instance. As a result, engineering teams get portable runtime application environments with platform-independent capabilities. This allows engineers to build an application once and then deploy it anywhere, regardless of the underlying operating system.

Deployment cycles are also faster with containers, and leveraging infrastructure automation is easier. It's possible to run different application versions simultaneously and package dependencies and applications in a single artifact.
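
As a small illustration of "build once, deploy anywhere," the sketch below takes its port and service name from environment variables, so the same container image can run on any host, and several copies can share one instance on distinct ports. The variable names and defaults are assumptions.

import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

# Configuration comes from the environment, not from the image, so one image
# runs anywhere; setting a different PORT lets multiple apps share an instance.
PORT = int(os.environ.get("PORT", "8080"))
SERVICE_NAME = os.environ.get("SERVICE_NAME", "orders")

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"service": SERVICE_NAME, "status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", PORT), HealthHandler).serve_forever()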

[ Learn more about cloud-native development in the eBook Kubernetes Patterns: Reusable elements for designing cloud-native applications. ]

2. What happens if a particular service goes down?

Good question. I recommend using message queuing services designed specifically for serverless applications.

A queue-based architecture prevents data loss during outages by adding message queues between services (REST endpoints). These queues hold messages on behalf of a downstream service until it recovers and can process them.
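
Here is a minimal sketch of the pattern, assuming Amazon SQS through boto3 as the queue service; any managed queue works the same way, and the queue URL below is a placeholder.

import json

import boto3  # assumed AWS SDK; substitute your provider's queue client

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/orders-queue"  # placeholder

def publish_order(order: dict) -> None:
    """The upstream service writes to the queue instead of calling the downstream
    REST endpoint directly, so nothing is lost while that service is down."""
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(order))

def drain_queue() -> None:
    """Once healthy again, the downstream service consumes messages at its own pace."""
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=5
    )
    for message in response.get("Messages", []):
        process_order(json.loads(message["Body"]))
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])

def process_order(order: dict) -> None:
    print(f"processing order {order.get('order_id')}")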

3. What about event-driven architecture?

Event-driven architecture (EDA) works well with serverless infrastructure through either a publisher/subscriber (pub/sub) model or an event-stream model. With the pub/sub model, notifications go out to all subscribers when events are published. Each subscriber can respond according to whatever data processing requirements are in place.

On the event-stream model side, engineers set up consumers to read from a continuous flow of events from a producer. For example, this could mean capturing a continuous clickstream log and sending alerts if any anomalies are detected.
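
Both models can be sketched without tying them to a particular cloud service. The topics, handlers, and the 1,000-clicks-per-minute threshold below are hypothetical stand-ins for whatever your platform provides.

from collections import defaultdict
from typing import Callable, Dict, Iterable, List

# Pub/sub model: every subscriber to a topic is notified when an event is published.
subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

def subscribe(topic: str, handler: Callable[[dict], None]) -> None:
    subscribers[topic].append(handler)

def publish(topic: str, event: dict) -> None:
    for handler in subscribers[topic]:
        handler(event)  # each subscriber reacts according to its own processing needs

# Event-stream model: a consumer reads a continuous flow of events from a producer.
def watch_clickstream(stream: Iterable[dict]) -> None:
    """Scan a clickstream and raise an alert when an anomaly shows up."""
    for event in stream:
        if event.get("clicks_per_minute", 0) > 1000:  # example anomaly rule
            publish("alerts", {"reason": "click spike", "event": event})

# Example usage:
subscribe("alerts", lambda alert: print("ALERT:", alert))
watch_clickstream([{"clicks_per_minute": 250}, {"clicks_per_minute": 1500}])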

Wrap up

Stateful legacy applications have brought application development a long way, but the wave of the future is certainly serverless applications deployed in a service-oriented architecture. Key components include microservices and containers. As enterprises continue to transition from legacy applications, architects must understand and prepare for serverless infrastructure to build next-gen apps.

[ Want Red Hat OpenShift Service on AWS (ROSA) explained? Watch this series of videos to get the most out of ROSA. ]

Anthony Loss

Anthony Loss is a Lead Solutions Architect at ClearScale. He has nearly a decade's experience helping clients reduce IT expenditures, increase business agility, accelerate innovation, and mitigate risk.
