[Illustration: Docker whale and Kubernetes logos – Containerization with Docker and Kubernetes, Part 1]

Containerization with Docker and Kubernetes – I


Part 1

What is Containerization?

Containerization is a technology that allows developers to package an application and its necessary dependencies into a single, lightweight unit called a container. Think of a container as a box where you can put all your code so that it runs reliably on any machine, whether a production server or the cloud. With the help of Docker, you can package your code and its dependencies into a container, and upon deployment the container ensures that the application runs consistently across different environments, just as it does on your local machine. Containers also play a crucial role in CI/CD pipelines. Overall, containerization transforms how you build, ship, and run applications, making the development process more agile and reliable.

Why is it Important?

Imagine telling your co-worker, “It works on my machine!” but it crashes on theirs. Containers solve this by making sure the app runs the same everywhere. This consistency helps developers create, test, and release apps faster.

WITHOUT CONTAINERS

[Diagram: application deployment without containers]

WITH CONTAINERS

[Diagram: application deployment with containers]

What is Docker?

Docker is an open-source platform designed to help developers build, ship, and run applications inside containers. Its main purpose is to make sure your application runs efficiently and consistently in any environment.

Key features of Docker:

  • Lightweight and Portable Containers: Docker containers are small and lightweight compared to virtual machines, which makes them easy to move across different systems.
  • Version Control: Docker lets you track and manage different versions of your container images, making it easy to revert to previous versions if needed.
  • Layered File System: Docker’s layered file system speeds up building and updating containers by only rebuilding the layers that need updating (see the example commands just after this list).
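
As a rough sketch of what versioning and layers look like in practice (the my-node-app image name matches the one built later in this post; the 1.0 tag is illustrative):

# Tag an existing image with an explicit version so you can revert to it later
docker tag my-node-app my-node-app:1.0
# List local images and their tags
docker image ls my-node-app
# Show the layers that make up an image
docker history my-node-app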

Docker Architecture:

  • Docker Client: The primary interface for Docker is the Docker client. Users interact with Docker through the client, using commands such as docker build, docker pull, and docker run (a few example commands follow this list).
  • Docker Daemon (Docker Engine): The Docker daemon runs on the host machine and manages Docker objects such as images, containers, networks, and volumes. The daemon listens for Docker API requests and processes them.
  • Docker Images and Containers: Docker images are snapshots of your application and its dependencies. Running an image creates a container, which is the live instance of the image.
  • Dockerfile: A Dockerfile is a set of instructions that tells Docker how to build an image. It’s like a recipe for creating containers.
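
To make the client–daemon relationship concrete, here is a minimal sketch of a typical workflow, using the image names that appear later in this post:

# The client asks the daemon to download an image from a registry
docker pull node:14
# The client asks the daemon to build an image from the Dockerfile in the current directory
docker build -t my-node-app .
# The client asks the daemon to start a container from that image
docker run -p 3000:3000 my-node-app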

Getting Started with Docker on Windows

  1. Download Docker Desktop for Windows.
  2. Install Docker Desktop; once it starts, you’ll see a whale icon in your system tray.
  3. Verify the installation: Open a Command Prompt or PowerShell window and run docker --version to ensure Docker is installed correctly (see the commands below).
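
For example (the hello-world test image is a standard Docker sample, included here as an extra check rather than part of the original steps):

# Print the installed Docker version
docker --version
# Pull and run a tiny test image to confirm the Docker daemon is working
docker run hello-world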

Getting Started with Docker on Windows: Creating and Dockerizing a Simple Node.js Application

Step 1: Create a Simple Node.js Application

  1. Install Node.js and Express.js:
    Visit the official Node.js website, then download and install Node.js.
    Express.js is installed with npm install express; run this inside the project directory after initializing it in step 3.
  2. Set up your project:
    Open a Command Prompt or PowerShell window, create a new directory for your project, and navigate into it:
mkdir my-docker-app 
cd my-docker-app 

3. Initialize a New Node.js Project:

  • Run the following command to create a package.json file, then install Express.js in the project:

npm init -y
npm install express

4. Create your application code:

  • Create a new file named `app.js` in the project directory and add the following code:
const express = require('express');

const app = express();
const port = 3000;

app.get('/', (req, res) => {
  res.send('Hello, Docker!');
});

app.listen(port, () => {
  console.log(`App running at http://localhost:${port}`);
});
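
Optionally (an extra check, not in the original steps), you can run the app locally before Dockerizing it:

# Start the app with the locally installed Node.js
node app.js
# Then open http://localhost:3000 in a browser; it should respond with "Hello, Docker!"

Stop the local server with Ctrl+C before moving on.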

Step 2: Dockerize the Node.js Application

Create a Dockerfile:
In the my-docker-app directory, create a file named Dockerfile (without any extension) and add the following content:

# Use an official Node.js runtime as the base image 
FROM node:14 
# Set the working directory in the container 
WORKDIR /app 
# Copy package.json and package-lock.json to the container 
COPY package*.json ./ 
# Install app dependencies 
RUN npm install 
# Copy the rest of the application code to the container 
COPY . . 
# Expose the application port 
EXPOSE 3000 
# Command to run the application 
CMD ["node", "app.js"] 
  1. Build the Docker Image:
    Open a Command Prompt or PowerShell window in the my-docker-app directory.
    Run the following command to build the Docker image:
    docker build -t my-node-app .
    This command tells Docker to build an image named my-node-app using the current directory (.) as the context.
  2. Run the Docker Container:
    Once the image is built, you can run it as a container:
    docker run -p 3000:3000 my-node-app
    This command tells Docker to run the container and map port 3000 on your host to port 3000 in the container.
  3. Access the Application:
    Open your web browser and go to http://localhost:3000 to see the application running inside the Docker container. You should see the message “Hello, Docker!”. (A couple of cleanup commands follow this list.)
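
When you’re finished (a small addition to the original steps), stop the container with Ctrl+C in its terminal, or from another terminal:

# List running containers and note the CONTAINER ID of my-node-app
docker ps
# Stop the container (replace <container-id> with the ID shown by docker ps)
docker stop <container-id>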

Benefits of Docker

  • Portability: Docker makes it possible to create small, lightweight containers that can run on any computer, independent of the operating system.
  • Isolation: Containers provide a high level of isolation by letting applications run independently of each other, so one container cannot affect another.
  • Reproducibility: With Docker, developers can easily bundle their applications and dependencies into reusable images, which makes builds uniform and reproducible across development, testing, and production environments.
  • DevOps Integration: Docker encourages cooperation and automation throughout the software development life cycle, helping teams manage growing workloads.

Click here for Part 2: Taking Your Docker Skills to the Next Level! 🚀