Mastering DevOps: A Step-by-Step Introduction

ByteBusterX
Nov 4, 2023


What is DevOps?

Defining DevOps is not a trivial task, but the term itself consists of two parts: Dev and Ops. Dev refers to the development of software and Ops to operations. A simple definition of DevOps would be that the release, configuration, and monitoring of software is in the hands of the very people who develop it.

A more formal definition: “DevOps is a development methodology aimed at bridging the gap between Development and Operations, emphasizing communication and collaboration, continuous integration, quality assurance and delivery with automated deployment utilizing a set of development practices”.

Sometimes DevOps is regarded as a role that one person or a team can fill. Here’s some external motivation to learn DevOps skills: the Salary by Developer Type section of the Stack Overflow survey.

In that survey, DevOps specialist ranks as the third highest-paying job globally.

Reading this alone will not make you a DevOps specialist, but it will give you an overview to help you navigate an increasingly in-demand profession.

DevOps Engineer Roadmap:

https://roadmap.sh/devops

1. Foundation Skills:

  • Learn a Programming Language (e.g., Python, Ruby, Bash).
  • Gain proficiency in Git for version control.
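To make this concrete, here is what a first session with Git might look like. The project, file, and branch names are just placeholders, and the last command assumes a remote named origin has already been set up:

    git init my-service                 # create a new repository
    cd my-service
    echo "print('hello')" > app.py      # a trivial first file
    git add app.py
    git commit -m "Add initial app"
    git checkout -b feature/logging     # do new work on an isolated branch
    git push -u origin feature/logging  # assumes a remote named 'origin' exists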

2. Infrastructure as Code (IaC) and Configuration Management:

  • Learn Terraform for IaC.
  • Familiarize yourself with Ansible for configuration management.
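Terraform configurations are written in HCL and Ansible playbooks in YAML, but both tools are driven from the command line. A rough sketch of the day-to-day loop (the inventory and playbook file names here are hypothetical):

    terraform init    # download providers for the configuration in this directory
    terraform plan    # preview what would change
    terraform apply   # create or update the infrastructure

    ansible-playbook -i inventory.ini site.yml   # apply a playbook to the hosts in the inventory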

3. Containerization and Orchestration:

  • Master Docker for containerization.
  • Learn Kubernetes for container orchestration.
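A minimal sketch of the two workflows, assuming a Dockerfile in the current directory and a cluster your kubectl is already configured to talk to (myapp is a placeholder name, and the cluster must be able to pull the image):

    docker build -t myapp:1.0 .               # package the app into an image
    docker run -d -p 8080:8080 myapp:1.0      # run it, publishing port 8080

    kubectl create deployment myapp --image=myapp:1.0   # run the same image on a cluster
    kubectl get pods                                     # check that the pod is up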

4. Cloud Platforms:

  • Gain proficiency in AWS, as it’s widely used in the industry.
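For example, a first session with the AWS CLI might look like this:

    aws configure               # one-time setup: store credentials and a default region
    aws s3 ls                   # list your S3 buckets to verify access
    aws ec2 describe-instances  # inspect EC2 instances in the configured region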

5. CI/CD Pipelines:

  • Learn Jenkins for setting up continuous integration and continuous deployment pipelines.
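Jenkins pipelines are usually defined in a Jenkinsfile, but each stage ultimately runs shell steps. A sketch of what a typical build stage might execute for a Python project (the registry URL is hypothetical; BUILD_NUMBER is an environment variable Jenkins provides):

    set -euo pipefail                  # fail the build on the first error

    pip install -r requirements.txt   # install dependencies
    pytest                            # run tests; a non-zero exit fails the pipeline
    docker build -t registry.example.com/myapp:"${BUILD_NUMBER:-dev}" .
    docker push registry.example.com/myapp:"${BUILD_NUMBER:-dev}"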

6. Monitoring and Logging:

  • Familiarize yourself with tools like Prometheus, Grafana, ELK Stack, or others for monitoring and logging.
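As a small taste, Prometheus exposes an HTTP API you can query directly, and most exporters serve their raw metrics on a /metrics endpoint. The ports below are the defaults for a local setup:

    curl 'http://localhost:9090/api/v1/query?query=up'   # ask Prometheus which scrape targets are up
    curl 'http://localhost:9100/metrics'                 # raw metrics from node_exporter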

7. Optional Advanced Skills:

  • Depending on specific requirements, explore areas like database management, automated testing, advanced networking concepts, etc.

8. DevOps Mindset:

  • Develop a mindset focused on collaboration, automation, and continuous improvement.

9. Gain Experience:

  • Work on DevOps projects, either personal projects or contributing to open-source projects.
  • Seek internships or junior positions that allow you to apply your skills in a real-world setting.

10. Certifications:

  • Consider obtaining relevant certifications from AWS, Docker, Kubernetes, or other cloud/containerization providers.

11. Resume and Portfolio:

  • Highlight your skills and experiences in your resume. Showcase any projects you’ve worked on, even if they’re personal projects.

12. Job Search and Networking:

  • Apply for DevOps Engineer positions, and leverage your network to discover opportunities.
  • Attend meetups, conferences, or online communities related to DevOps to network with professionals in the field.

13. Continuous Learning:

  • Stay updated with industry trends, new tools, and best practices. DevOps is a rapidly evolving field.

14. Interview Preparation:

  • Brush up on common DevOps interview questions and be prepared to demonstrate your technical skills and problem-solving abilities.

Remember, the roadmap is a guideline and not a strict path. Tailor it to your own learning pace and interests. Additionally, gaining hands-on experience is crucial, so try to apply what you learn in practical projects whenever possible.

Bonus GitHub repo for learning DevOps: https://github.com/MichaelCade/90DaysOfDevOps

Bonus tip: if you are learning all this DevOps material, learn it in the cloud and do all your practice there. I personally use AWS to practice my skills, and most importantly, it’s almost free; you only need to pay a little if you use it for high-performance work.

The most important skill you need to learn is containers, and Docker in particular, so here is a simple overview of Docker:

What is Docker?

“Docker is a set of platform as a service (PaaS) products that use OS-level virtualization to deliver software in packages called containers.” — from Wikipedia.

So stripping the jargon we get two definitions:

  1. Docker is a set of tools to deliver software in containers.
  2. Containers are packages of software.

Containers package the application together with its dependencies. They are isolated so that they don’t interfere with each other or with the software running outside of the containers. In case you need to interact with them or enable interactions between them, Docker offers tools to do so.

Benefits from containers

Containers package applications. Sounds simple, right? To illustrate the potential benefits let’s talk about different scenarios.

Scenario 1: Works on my machine

Let’s first take a closer look at what happens in web development without containers, starting from the planning stage.

First you plan an application. Then your team of 1-n developers creates the software. It works on your computer. It may even go through a testing pipeline and pass perfectly. You send it to the server and…

…it does not work.

This is known as the “works on my machine” problem. The only way to solve this is by finding out what in tarnation the developer had installed on their machine that made the application work.

Containers solve this problem by allowing the developer to personally run the application inside a container, which then includes all of the dependencies required for the app to work.

  • You may still occasionally hear about “works in my container” issues — these are often just usage errors.
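As a minimal sketch of why this works, here is a container image that pins the exact runtime an app needs (the file contents and names are placeholders):

    echo "print('it works everywhere')" > app.py

    cat > Dockerfile <<'EOF'
    # Pin the exact interpreter the app needs
    FROM python:3.12-slim
    WORKDIR /app
    COPY app.py .
    CMD ["python", "app.py"]
    EOF

    docker build -t myapp .   # package the app and its runtime into one image
    docker run myapp          # behaves the same on any machine with Docker

Whatever was installed on the developer’s machine no longer matters; the image carries the interpreter and the code together.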

Scenario 2: Isolated environments

You have 5 different Python applications. You need to deploy them to a server that already has an application requiring Python 2.7, and of course none of your applications run on 2.7. What do you do?

Since containers package the software with all of its dependencies, you package the existing app and all 5 new ones with their respective Python versions, and that’s it.

I can only imagine the disaster that would result from trying to run them side by side on the same machine without isolating the environments; it sounds like a time bomb. And different parts of a system change over time, possibly leading to the application no longer working. These changes can be anything from an operating system update to changes in dependencies.
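A quick sketch of what that isolation looks like in practice: both interpreters run side by side on the same host, each inside its own container:

    docker run --rm python:2.7 python --version    # Python 2.7.x, for the legacy app
    docker run --rm python:3.12 python --version   # Python 3.12.x, for the new apps

Neither installation touches the host or the other container.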

Scenario 3: Development

You are brought into a dev team. They run a web app that uses other services when running: a Postgres database, MongoDB, Redis and a number of others. Simple enough, you install whatever is required to run the application and all of the applications that it depends on…

What a headache to start installing and then managing the development databases on your own machine.

Thankfully, by the time you are told to do that, you are already a Docker expert. With one command you get an isolated application, like Postgres or Mongo, running on your machine.
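For instance, each development service can be started (and thrown away) with a single command. The container names and the password below are placeholders:

    docker run -d --name dev-postgres -p 5432:5432 -e POSTGRES_PASSWORD=secret postgres:16
    docker run -d --name dev-mongo -p 27017:27017 mongo:7
    docker run -d --name dev-redis -p 6379:6379 redis:7

    docker rm -f dev-postgres dev-mongo dev-redis   # tear everything down just as easily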

Scenario 4: Scaling

Starting and stopping a Docker container has little overhead. But when you run your own Netflix or Facebook, you want to meet changing demand. With some advanced tooling that we will cover in upcoming blogs, we can spin up multiple containers instantly and load balance traffic between them.

Container orchestration will be discussed in those upcoming blogs, but here is the simplest example: what happens when one application instance dies? The orchestration system notices it, splits traffic between the working replicas, and spins up a new container to replace the dead one.
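With Kubernetes, for example, scaling boils down to declaring how many replicas you want. A sketch, assuming a deployment named myapp already exists:

    kubectl scale deployment myapp --replicas=5   # meet a spike in demand
    kubectl get pods --watch                      # delete a pod and watch Kubernetes replace it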

Virtual machines

Isn’t there already a solution for this? Virtual machines are not the same as containers; they solve different problems, and we will not be looking into virtual machines here. Still, a rough comparison gives you an idea of the difference.

Picture Application A being moved to an incompatible system, “Operating System B”: a virtual machine carries a whole guest operating system along with it, while a container carries only the application and its dependencies. That is why running software in containers is almost as efficient as running it “natively” outside containers, at least when compared to virtual machines.

So containers have direct access to your own operating system’s kernel and resources. The resource usage overhead of containers is minimal, as the applications behave as if there were no extra layers. Because Docker uses the Linux kernel, Mac and Windows can’t run it without jumping through a few hoops, and each has its own solution for how to run Docker.

We will cover all aspects of Docker in upcoming blogs.

Keep the bytes, BBX signing off..
