Over the last decade, DevOps has become an important part of software engineering culture, driven by the wide adoption of microservices, containers, and cloud computing. A recent step in the evolution of cloud-based and microservice architectures is serverless computing, a code execution model in which the cloud provider takes full responsibility for operating system and hardware management. The purpose of serverless computing is to simplify the operations side of DevOps and to provide a scalable execution architecture with a predictable pricing model, where platform users pay only for the compute they use and nothing for idle resources.
Let’s see how Serverless can be useful in DevOps practices.
This concept is called Serverless Compute or Function as a Service (FaaS). It is defined as a cloud computing execution model in which logic runs in stateless, event-triggered containers that are fully managed by a third-party platform. Serverless does not mean that there are no servers or no server-side logic; it means that developers can leave most operational tasks related to server maintenance, such as operating system updates, fault tolerance, scalability, and monitoring, to the cloud provider.
From the software architecture perspective, DevOps has an impact on the development cycle including build, test, deployment, and monitoring phases. In addition, DevOps literature usually includes source control practices because they play a crucial role in code sharing and team collaboration. Most of the DevOps automation practices can be grouped into CI, CD and Monitoring categories.
A common pattern is that cloud-based architecture encourages DevOps through the decomposition of a system into smaller, more manageable components, which leads to smaller teams and simpler decision making than in large monolithic systems. In practice, serverless computing still requires a CI/CD pipeline and maintenance operations.
Serverless architecture brings advantages to the DevOps process. From the very beginning it encourages operability, because the serverless approach already combines Dev and Ops and even blurs the line between Dev and Ops specialists.
The DevOps challenges for serverless applications include complicated local debugging, because the applications are executed in the cloud and are usually tightly coupled with other cloud services. Serverless platforms also provide limited access to the execution infrastructure: developers no longer have access to the servers they would need to monitor application behavior at the operating system level.
It is possible to implement an entire build, test, and deploy pipeline by writing glue code on top of serverless services, without using any hosted solutions. The resulting infrastructure is easy to modify because serverless computing implies that configuration and business logic are stored together in the same repository. This close connection between business logic and infrastructure, together with the atomic nature of serverless functions, makes deployments and rollbacks simple with the help of Infrastructure as Code (IaC) tools, which provision cloud platforms and data centers from a declarative description of the services used and their interconnections.
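As an illustration of configuration and business logic living in one repository, here is a hedged `serverless.yml` fragment in the Serverless Framework style; the service name, bucket, and runtime are assumptions, not taken from the case project.

```yaml
# Illustrative serverless.yml fragment: the function and the
# infrastructure it needs (trigger, runtime, region) are declared
# together, so deploy and rollback become single commands.
service: image-pipeline        # hypothetical service name
provider:
  name: aws
  runtime: nodejs18.x
  region: us-east-1
functions:
  process:
    handler: handler.process
    events:
      - s3:
          bucket: my-upload-bucket   # hypothetical bucket
          event: s3:ObjectCreated:*
```

With a description like this checked into the same repository as the handler code, `serverless deploy` provisions everything declaratively, and `serverless rollback` reverts to a previous deployment.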
As you know, S3 only allows a single replication destination. To replicate the same data to multiple regions, you can use a Lambda function with the S3 bucket as an event source.
To access resources in your VPC behind your firewall, you can use a Cognito-protected API Gateway endpoint with a Lambda function that proxies requests.
Developers often have to perform a handful of tasks when an EC2 instance is terminated, such as removing a Route53 entry. By hooking a Lambda function up to CloudWatch Events, you can ensure that this gets done automatically.
A DevOps pipeline guarantees high code quality through build automation, QA, deployment automation, source control, and monitoring. The pipeline components are shown in the image below.
When engineers push code to the GitLab server, it triggers the GitLab Runner. The runner executes CI and CD pipelines that consist of jobs; a job is the GitLab term for a single activity in a CI/CD pipeline. Jobs run in Docker containers that provide the required build environment, including npm, Node.js, and the Serverless Framework. The Docker images are stored in the GitLab Container Registry, which is also part of the GitLab suite.
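A pipeline of this shape could be described with a `.gitlab-ci.yml` along the following lines; the registry address, image name, and branch name are assumptions, not details from the case project.

```yaml
# Illustrative .gitlab-ci.yml fragment: each job runs in a Docker
# container pulled from the GitLab Container Registry.
image: registry.example.com/ci/node-serverless:latest  # hypothetical image

stages: [build, test, deploy]

build:
  stage: build
  script:
    - npm ci

test:
  stage: test
  script:
    - npm test

deploy:
  stage: deploy
  script:
    - npx serverless deploy --stage production
  only:
    - main
```

Each push triggers the pipeline; the `deploy` job runs only on the main branch, using the Serverless Framework inside the container to push the stack to AWS.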
The figure shows that the GitLab suite, including the Runner instances, is deployed on the company's premises. The case project, however, having a cloud-oriented architecture, is deployed on AWS infrastructure.
This pipeline can be scaled up to implement new services. The approach of atomic service deployments assumes that every service can follow its own release roadmap and be deployed independently.
Serverless DevOps goes beyond helping IT organizations achieve greater business agility. It is geared toward the rapid delivery of business value and continuous improvement and learning: organizations can deliver new products and features faster and more cheaply, changing the culture in the process.
If you would like to be a guest contributor to the Stackify blog please reach out to [email protected]