Report writing for Fundamentals of Computing (Week 8)

Topic DO1: Container Technology

In week 8 we learnt about container technology, which covered subtopics such as containers, Linux containers, containers vs. virtualization, the benefits of containers, and Docker containers.

Containers:

- A container is a self-contained software package that includes not only the application code but also all the necessary components it relies on. This ensures consistent and efficient execution of the application across various computing environments.

- Containers encapsulate the code, runtime, system tools, libraries, and configuration into a single, portable image, simplifying deployment and maintaining a consistent environment.
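As a minimal illustration of that portability (a sketch, assuming Docker, which is introduced later in this report, is already installed), running the same public image gives the same result on any machine:

  # The Python runtime ships inside the image, so the output is identical
  # on a laptop, a server, or a CI machine.
  docker run --rm python:3.12-slim python --version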


Linux Containers:

- A Linux container is a set of processes isolated from the rest of the system, supplied with all the files they need from a separate image. Because the image travels with the application, containers stay consistent as they move from development to production, and they are faster to spin up than traditional test environments.


Containers vs. Virtualization:

In virtualization, the unit that is packaged and shipped is a virtual machine (VM), comprising the application together with a full guest operating system. Each VM is self-contained and runs on a hypervisor.


In contrast, containers let multiple applications run on a single server while sharing a common operating system kernel. Shared components are read-only, and each container has its own space to write data, making containers much lighter and more resource-efficient than virtual machines.
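A quick way to see this in practice (a hedged sketch, assuming Docker on a Linux host): a container reports the same kernel as the host because there is only one kernel, whereas a VM boots its own.

  # Kernel version on the host
  uname -r
  # Kernel version inside a container based on the small alpine image
  docker run --rm alpine uname -r
  # Both commands print the same version: the container shares the host
  # kernel instead of booting its own the way a virtual machine does.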



Benefits of Containers:

1. Portability: Containers package an application and its dependencies, ensuring consistency across different environments. This portability simplifies deployment and minimizes compatibility issues.

2. Resource Efficiency: Containers share the host's operating system kernel, reducing resource overhead compared to virtual machines. This results in faster startup times and better utilization of hardware.

3. Isolation: Containers provide process-level isolation, keeping applications separate while sharing the same OS kernel. This isolation enhances security and makes it easier to manage dependencies.

4. Scalability: Containers are designed for scalability. They can be quickly replicated and orchestrated to meet changing workloads, making them ideal for microservices architectures.

5. Rapid Deployment: Containers start and stop quickly, enabling rapid application deployment and updates. This agility is valuable for continuous integration and continuous deployment (CI/CD) pipelines.

6. Version Control: Container images can be versioned, allowing for easy rollbacks and maintaining a history of application configurations.

7. Ecosystem Support: Containers have a rich ecosystem of tools and orchestration platforms like Docker, Kubernetes, and others, making it easier to manage and automate containerized applications.

8. Resource Isolation: While containers share resources, they can still be configured to limit and control resource usage, preventing one container from impacting others (see the example commands after this list).

9. Developer Productivity: Containers promote consistency between development and production environments, improving collaboration and reducing "it works on my machine" issues.

10. Cost-Effectiveness: Because of their efficiency and scalability, containers can reduce infrastructure and operational costs for both cloud and on-premises deployments.
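As a small illustration of points 6 and 8 above (a sketch only; the image name myapp is a hypothetical example):

  # Point 6 - versioning: tag an image so it can be rolled back to later
  docker tag myapp:latest myapp:1.0
  # Point 8 - resource isolation: cap how much memory and CPU a container may use
  docker run --memory=256m --cpus=1 myapp:1.0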


Docker:

Docker is open-source software that streamlines application development by providing isolated environments for building, deploying, and testing applications. While it is relatively straightforward to use, there are some Docker-specific terms to grasp:


1. Dockerfiles: Text files that define an application's configuration, dependencies, and steps for building a Docker image.

2. Images: Snapshots of application environments created from Dockerfiles. They contain everything needed to run an application, such as code, libraries, and configurations.

3. Containers: Runnable instances of Docker images. Containers are isolated, lightweight, and can be easily started, stopped, and managed.

4. Volumes: Data storage elements that allow data to persist across container lifecycles. Volumes are used for storing application data independently of containers.

5. Learning Curve: While Docker is user-friendly, understanding Dockerfiles, images, containers, and volumes is essential for using it efficiently. Mastery of these terms accelerates the learning process (a short end-to-end example follows this list).
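A minimal sketch of how these four pieces fit together (the names myapp and mydata are hypothetical, and it assumes a Dockerfile exists in the current directory):

  # Dockerfile -> image: build an image from the Dockerfile in this directory
  docker build -t myapp:1.0 .
  # Image -> container: start a container, mounting a named volume for data
  docker volume create mydata
  docker run -d --name myapp-test -v mydata:/data myapp:1.0
  # List and stop running containers
  docker ps
  docker stop myapp-test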


Terminology:

A Dockerfile is a textual guide that outlines steps for constructing a Docker image. It defines the base image, software installation, configurations, and execution commands, enabling the creation of a customized container image.
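A minimal example Dockerfile, assuming a simple Python application whose code sits in app.py with its dependencies listed in requirements.txt (both file names are illustrative):

  # Base image
  FROM python:3.12-slim
  # Working directory inside the image
  WORKDIR /app
  # Install dependencies, then copy in the application code
  COPY requirements.txt .
  RUN pip install -r requirements.txt
  COPY . .
  # Command executed when a container starts from this image
  CMD ["python", "app.py"]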

Docker Hub is a cloud-based registry service for Docker images. It serves as a platform for users to store, share, and deploy Docker images. As the largest container registry globally, it boasts over 18 million images and more than 20 billion monthly downloads, fostering collaboration and accessibility in the Docker community.
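Typical Docker Hub interactions look like this (a sketch; <username> stands for a Docker Hub account and myapp is the same hypothetical image as above):

  # Download a public image from Docker Hub
  docker pull nginx:latest
  # Log in, tag a local image under your account, and share it
  docker login
  docker tag myapp:1.0 <username>/myapp:1.0
  docker push <username>/myapp:1.0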

A Docker image is a read-only file that contains everything needed to execute code in a Docker container.

A Docker container is a standardized, executable unit that combines application source code with the OS libraries and dependencies required to run that code in any environment.

A Docker volume is a persistent storage location that exists outside of the container, so data survives even after the container is removed.
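For example (a sketch reusing the hypothetical mydata volume), data written into a volume outlives the container that wrote it:

  docker volume create mydata
  # Write a file into the volume from a short-lived container
  docker run --rm -v mydata:/data alpine sh -c "echo hello > /data/note.txt"
  # A brand-new container mounting the same volume still sees the file
  docker run --rm -v mydata:/data alpine cat /data/note.txt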

