Traditional application architectures have long relied on standard application server deployment processes. However, use of these servers is declining because those processes are archaic, complex and inefficient. These servers cannot support modern DevOps methods, such as continuous delivery, which requires multiple application deployments a day.
As more organizations move to a cloud-native architecture, many are exploring containerization, a development process that offers beneficial underlying architecture especially suitable for hybrid cloud and multicloud scenarios.
Pros and cons of containerization
Containerization involves encapsulating software code and all its dependencies so it can run smoothly on any infrastructure, whether a mix of public clouds or an on-premises data center, sharing access to a host operating system rather than bundling in a full copy of an OS. It also allows developers to create, deploy and troubleshoot applications faster.
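As a concrete illustration, a container image for a small, hypothetical Node.js service can be described in a few lines; the base image, port and entry file below are assumptions, not a prescription:

```dockerfile
# Hypothetical Dockerfile for a small Node.js service. The image bundles
# the code and its dependencies, but shares the host's kernel rather than
# carrying a full guest operating system.
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

The same image runs unchanged on a developer laptop, a public cloud or an on-premises cluster, which is what makes containers attractive for hybrid and multicloud scenarios.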
However, while containerization is based on the promise of reusability and availability, oftentimes organizations struggle to understand and implement containers at scale.
The cost of internal resources is massive, and containers are often built by product teams with large communication gaps. These teams span various departments and areas of focus, including infrastructure, security, development, quality assurance and even legal. Because many organizations have not updated their IT operations processes to support continuous delivery (CD) of containers, it is common for stakeholders to have different practices and priorities.
More challenges arise when the underlying infrastructure is not modernized to support containers. Containers do best with technology optimized for the cloud, in which compute, storage and networking are each elastically scalable. If an organization's infrastructure lags behind, it will never achieve a harmonious relationship with containers.
Many organizations also underestimate the efforts required to move from monolithic applications to containers, and overlook the need for Day-2 operations of containers: monitoring, orchestration, security and other tasks.
Build applications using serverless technology
Is it possible to bypass containerization (from the R&D/development perspective) and achieve faster development and similar benefits without compromising on flexibility?
Enter serverless technology and infrastructure as code (IaC).
Building applications using serverless technology allows developers to abstract away the complexities of infrastructure and scripting while focusing on building highly scalable applications. Because applications that scale push state out to the database and to the browser, the middleware and integration layers are freed up to simply handle and process requests, without consuming memory to remember user session information. This, in turn, forces developers to build applications that are truly stateless.
As organizations continue to modernize their applications, some applications can benefit from a serverless deployment. This will, in turn, reduce the complexities of IT operations by abstracting away core infrastructure. Let’s look at the standard progression of application modernization.
Build and modernize your applications to scale
The traditional, simplified web application architecture stores the application session at the application server level. This means the application must either remember which server holds the user's session information or replicate the session across application servers. This presents scalability challenges as you add more application servers to your environment to handle increasing user load.
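The scaling problem can be sketched in a few lines of Node.js; the servers here are simulated as plain in-memory maps, and all names are hypothetical:

```javascript
// Minimal sketch (names hypothetical): session state held in one server's
// in-memory map. A second, identical server has no access to it, so the
// load balancer must pin each user to the server holding their session.
function login(serverSessions, userId) {
  const sessionId = `sess-${userId}`;
  serverSessions.set(sessionId, { userId, loggedInAt: Date.now() });
  return sessionId;
}

function handleRequest(serverSessions, sessionId) {
  // A request landing on a server without the session "forgets" the user.
  return serverSessions.has(sessionId) ? 'ok' : 'please log in again';
}

const serverA = new Map(); // the server that handled the login
const serverB = new Map(); // an identical server with its own empty memory
const sid = login(serverA, 'user-1');
```

Sticky load balancing works around this by pinning users to servers, but at the cost of uneven load and lost sessions whenever a server goes down.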
Standard Application Architecture
Next, some organizations move to containerization in order to modernize their legacy applications.
Some choose to do a complete “monolithic” migration where they package the monolithic application into a container and use Kubernetes to handle the orchestration of said containers. However, this does not solve the session management issues because the load balancer must “remember” to send traffic to the corresponding application server or the user’s session is “forgotten” and the user must log back into the application.
Again, scaling the application may be easier, but the application consumes memory even when the user is not using it.
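In Kubernetes, the "remembering" described above is typically configured as session affinity; a sketch (all names illustrative) of a Service pinning each client IP to one pod might look like:

```yaml
# Hypothetical Kubernetes Service using session affinity so that requests
# from one client keep reaching the pod that holds their in-memory session.
apiVersion: v1
kind: Service
metadata:
  name: monolith-app
spec:
  selector:
    app: monolith-app
  ports:
    - port: 80
      targetPort: 8080
  sessionAffinity: ClientIP
  sessionAffinityConfig:
    clientIP:
      timeoutSeconds: 10800
```

If a pod is rescheduled, its pinned sessions are lost and those users must log in again, which is exactly the limitation of containerizing the monolith as-is.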
Next, we find application modernization takes another turn.
If application teams embrace containerization completely, they will build out microservices using containers. This process abstracts away the old monolithic approach to development and deployment and embraces more sophisticated deployment principles, like canary and blue-green deployments.
The road to microservices is not an easy one. You'll typically find an API gateway, complex CI/CD processes, sophisticated deployment patterns and siloed organizational teams whose communication gaps delay innovation within your organization.
If the true purpose of agile is to fail fast, this architecture is closer to that goal, but it takes time to perfect the subtle nuances of this type of implementation.
Containerized Microservices Architecture
Is it possible to build highly scalable applications without containerization?
Yes. If you take a traditional monolithic application and refactor it into stateless, serverless functions, you can build such systems. In doing so, you can reduce IT operational costs by limiting the continuous memory consumption of your applications.
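A stateless function might look like the following sketch, which assumes an AWS Lambda-style async handler signature; the header name and the profile lookup are stubbed placeholders:

```javascript
// Sketch of a stateless serverless function (AWS Lambda-style async handler
// signature, used here as an assumption). No user state survives between
// invocations: identity arrives with the request, and data comes from storage.
const handler = async (event) => {
  const userId = event.headers && event.headers['x-user-id']; // hypothetical header
  if (!userId) {
    return { statusCode: 401, body: JSON.stringify({ error: 'unauthenticated' }) };
  }
  // In a real system this would be a per-request database read (e.g., DynamoDB);
  // stubbed here so the sketch stays self-contained.
  const profile = { userId, plan: 'standard' };
  return { statusCode: 200, body: JSON.stringify(profile) };
};

module.exports = { handler };
```

Because the function holds nothing between invocations, the platform can scale it to zero when idle, which is where the memory savings come from.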
You also have the ability to reuse this architecture across cloud vendors. Take the following architecture:
Complete Serverless Architecture
With this architecture, you can achieve near “infinite” scalability for your application, remove HTTP servers, remove containers, remove application server deployments and configurations, and perform continuous versioning of your functions.
Let’s assume you have an existing application written in Node.js with Express.js. You can wrap it for serverless execution and call a handler directly without spinning up an Express HTTP(S) endpoint. You maintain the ability to version your deployments and have reusable functions (as microservices) to stitch together other functions needed to build compelling new functionality for ever-changing business needs.
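Libraries such as serverless-http perform this wrapping for Express apps; the dependency-free sketch below shows the underlying idea: dispatch the provider's event straight to plain handler functions, with no HTTP listener. All routes and field names are illustrative:

```javascript
// Dependency-free sketch of what wrappers like serverless-http do for an
// Express app: translate the provider's event into a route dispatch and
// call the handler directly, with no HTTP server listening on a port.
const routes = {
  // Plain functions play the role of Express route handlers (illustrative).
  'GET /hello': async () => ({ statusCode: 200, body: JSON.stringify({ msg: 'hello' }) }),
};

// Hypothetical serverless entry point: no app.listen(), just dispatch.
const handler = async (event) => {
  const route = routes[`${event.method} ${event.path}`];
  if (!route) {
    return { statusCode: 404, body: JSON.stringify({ error: 'not found' }) };
  }
  return route(event);
};

module.exports = { handler };
```

The route table can keep growing, and each deployed version of the function bundle is immutable, which is what makes continuous versioning of functions straightforward.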
Safeguard and accelerate serverless application modernization efforts
It’s critical to implement security correctly and to understand the full implications of building stateless applications before proceeding down this path. The other critical part of handling high scalability is using a single-table architecture with partition and sort keys. This limits table scans and provides quick access to information. Key overloading is one such technique, but there are more patterns and architectural decisions that should be made first.
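A minimal sketch of key overloading, assuming DynamoDB-style `pk`/`sk` attribute names (the prefixes and entities are illustrative): two entity types share one partition so a single query, rather than a table scan, retrieves a user and their orders:

```javascript
// Sketch of key overloading in a single-table design. Different entity
// types share one table; prefixes embedded in the partition key (pk) and
// sort key (sk) keep them queryable without scanning the table.
const userItem = (userId) => ({
  pk: `USER#${userId}`,
  sk: `PROFILE#${userId}`,
});

const orderItem = (userId, orderId, placedAt) => ({
  pk: `USER#${userId}`,               // same partition as the user...
  sk: `ORDER#${placedAt}#${orderId}`, // ...with orders sorted by date
});

// One query on pk = "USER#42" returns the profile plus all orders, and a
// begins_with(sk, "ORDER#") condition narrows the result to orders only.
```

Which entities to co-locate in a partition, and how to overload keys for every access pattern, are decisions best made up front, since they are hard to change once data is in the table.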
Serverless technology is supported by all major cloud providers, who handle the routine work of provisioning, maintaining, and scaling the server infrastructure. There are also resources like Red Hat OpenShift that aid in the development of serverless applications.
If you are interested in utilizing serverless in your application modernization efforts, we recommend choosing a technology partner to help you accelerate the process and ensure you have the correct techniques and safeguards in place to reduce security risks and challenges during deployment.