
Deploy Machine Learning Model on Docker

Dipaditya Das · Published in Geek Culture · 6 min read · May 27, 2021


In this article, we are going to build a Machine Learning model in Python and deploy it inside a Docker container.

Applications are getting more complex. Demand to develop faster is ever-increasing. This puts stress on your infrastructure, IT teams, and processes. Linux® containers help you alleviate issues and iterate faster — across multiple environments.

What are Linux containers?

Linux containers are technologies that allow you to package and isolate applications with their entire runtime environment — all of the files necessary to run. This makes it easy to move the contained application between environments (dev, test, production, etc.) while retaining full functionality. Containers are also an important part of IT security. By building security into the container pipeline and defending your infrastructure, you can make sure your containers are reliable, scalable, and trusted.

Why use Linux containers?

Linux containers help reduce conflicts between your development and operations teams by separating areas of responsibility. Developers can focus on their apps and operations teams can focus on the infrastructure. And, because Linux containers are based on open source technology, you get the latest and greatest advancements as soon as they’re available. Container technologies — including CRI-O, Kubernetes, and Docker — help your team simplify, speed up, and orchestrate application development and deployment.

What is Docker?

Docker is a tool designed to make it easier to create, deploy, and run applications by using containers.

Containers allow a developer to package up an application with all of the parts it needs, so inside a container we can install every dependency required for machine learning.

In this article, we are simply going to make a Linear Regression Model on the Salary Dataset using Python.

Image by Dan White 1000 on Shutterstock

We are going to do all our practicals on Red Hat Enterprise Linux 8 (RHEL 8).

Step — 1: Configuring Yum Repository for Docker

Here we create a docker-ce.repo file and provide the name, baseurl (the official repository link), and gpgcheck. We disable GPG signature checking since we are pointing directly at the official Docker Community Edition repository.
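The repository file itself is not shown above; on RHEL 8 it might look like the following (Docker publishes a CentOS repository that is commonly reused on RHEL 8 — the exact section name here is an assumption):

```ini
# /etc/yum.repos.d/docker-ce.repo
[docker-ce-stable]
name=Docker CE Stable - $basearch
baseurl=https://download.docker.com/linux/centos/$releasever/$basearch/stable
enabled=1
gpgcheck=0
```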

Step — 2: Installing Docker Community Edition

In order to install Docker Community Edition, we will write the following command.

yum install docker-ce --nobest -y

In RHEL 8 and Fedora, yum is the package manager. We also use the --nobest option, which lets yum fall back to an older, installable version of Docker Engine when the newest one has dependency conflicts on RHEL 8.

Step — 3: Enabling the Docker Service in RHEL 8

systemctl is-active docker

This command helps us to check whether the docker service is running or not.

systemctl enable docker --now

This command starts the docker service immediately (--now) and enables it, so the service comes back automatically after a reboot.

systemctl status docker

This command helps us to check the status of the docker service in detail.

Step — 4: Enable Firewall for Docker Engine

Before launching the containers on top of Docker Engine we have to make sure that the ingress and egress traffic is enabled for the Containers.

In order to do that, we need to enable masquerading. While focusing on the firewall, I noticed that disabling firewalld seemed to do the trick, but I would prefer not to do that. Inspecting the network rules with iptables, I realized that RHEL 8's switch to nftables means iptables is now an abstraction layer that shows only a small part of the nftables rules, so most, if not all, of the firewalld configuration is applied outside the scope of iptables.

Long story short: for this to work, masquerading has to be enabled explicitly. dockerd already appeared to do this through iptables, but it must also be enabled on the firewalld zone for masquerading to actually work.
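The firewall commands themselves are not shown above; enabling masquerading on the default zone would look something like this (the public zone is an assumption — check yours with firewall-cmd --get-default-zone):

```shell
# Permanently enable masquerading (NAT) on the public zone
firewall-cmd --zone=public --add-masquerade --permanent

# Reload firewalld so the permanent rule takes effect
firewall-cmd --reload
```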

Step — 5: Pulling the Latest CentOS Docker Image

docker images

This command helps us to check the images present in our local system.

docker pull centos:8

CentOS 8 is the latest CentOS image available on Docker Hub. We will pull the image from the public registry.

Step — 6: Create a Docker Container

Now, the docker ps command shows us which containers are in the running state.

docker create -it --name MLOps centos:8 /bin/bash

The above command creates a container named MLOps from the centos:8 image, with an interactive bash terminal (-it). After it is created, we need to start our container, and for that we will write the following command.

docker start MLOps

After starting the MLOps Container, we need to use the interactive terminal and for that, we need to use the following command.

docker attach MLOps

Now we are inside the container, and we can verify that the OS is ‘centos’ and the version is ‘8’.
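The exact check used is not shown above; one common way to confirm the OS from inside the container is to read /etc/os-release:

```shell
# Print the OS identification file; on a centos:8 container this
# reports ID="centos" and VERSION_ID="8"
cat /etc/os-release
```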

Step — 7: Installing Python and Git

yum install python38 git

This will install Python 3.8 and the latest Git in our container.

Step — 8: Git clone the Python Code

We will clone the GitHub Repository where our Python code is available.

git clone <Repository_URL>

Step — 9: Creating our LinearRegression Model

We have created a Python program named Model.py.


First, we install all the libraries required for our program to run without errors.

Now we are going to execute our program with the Python 3 interpreter.

After execution, we will see that a pickle file named “SalaryModel.pkl” has been created.
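The code of Model.py is not reproduced here; a minimal sketch of the same idea — using NumPy's least-squares fit in place of a full ML library, and a small inline sample standing in for the Salary dataset (both are assumptions, since the original script and dataset are not shown) — might look like this:

```python
import pickle

import numpy as np

# Small inline sample standing in for the Salary dataset
# (columns: YearsExperience, Salary)
years = np.array([1.1, 2.0, 3.2, 4.5, 5.9, 7.1, 8.7, 10.3])
salary = np.array([39343, 43525, 54445, 61111, 81363, 98273, 109431, 122391])

# Fit a simple linear regression: salary ≈ slope * years + intercept
slope, intercept = np.polyfit(years, salary, 1)

# Persist the fitted parameters, as the article does with SalaryModel.pkl
with open("SalaryModel.pkl", "wb") as f:
    pickle.dump({"slope": slope, "intercept": intercept}, f)

print(f"Model saved: salary = {slope:.2f} * years + {intercept:.2f}")
```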

In “predict_model.py”, we load the pickle file, take the user’s input, and print the result in a readable format. After execution, it looks like this.
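The predict_model.py script is likewise not shown; a self-contained sketch of the idea (the pickle contents and the use of a plain variable in place of input() are assumptions, made so the example runs on its own) might be:

```python
import os
import pickle

import numpy as np

# For a self-contained demo, train and save a model if none exists yet
if not os.path.exists("SalaryModel.pkl"):
    years = np.array([1.1, 2.0, 3.2, 4.5, 5.9, 7.1, 8.7, 10.3])
    salary = np.array([39343, 43525, 54445, 61111, 81363, 98273, 109431, 122391])
    slope, intercept = np.polyfit(years, salary, 1)
    with open("SalaryModel.pkl", "wb") as f:
        pickle.dump({"slope": slope, "intercept": intercept}, f)

# Load the trained model from the pickle file
with open("SalaryModel.pkl", "rb") as f:
    model = pickle.load(f)

# In the article this comes from user input; hard-coded here for the demo
years_of_experience = 5.0

predicted = model["slope"] * years_of_experience + model["intercept"]
print(f"Predicted salary for {years_of_experience} years of experience: {predicted:.2f}")
```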

So we have finally deployed our model and run it.

Bonus

We can stop the container using the following command.

docker stop MLOps

We can delete the container using the following command.

docker rm <container_id_MLOps>

I would like to thank Vimal Daga Sir for giving me such an important topic to research and practice, and for helping me figure out the importance of this architecture.

“Reading… a vacation for the mind… ” — Dave Barry

