parthanaboina praveen

Deploy a Machine Learning (ML) model in a Docker container

First, let us understand what machine learning (ML) and Docker are.

Machine learning: Machine learning is a subset of AI that allows machines to learn from data without being explicitly programmed.
Docker: Docker is a tool that provides a platform to create, run and deploy applications using containers. Docker is a bit like a virtual machine, but rather than creating a whole virtual operating system, Docker lets applications use the same Linux kernel as the system they are running on and only requires them to be shipped with things not already present on the host. This gives a significant performance boost and reduces the size of the application.

Here are the complete steps:
Step 1:
Install Docker
#yum install docker-ce --nobest
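If the yum install command above fails because docker-ce cannot be found, the Docker CE repository likely needs to be added first. One common way to do that (an assumption here, since the repository setup is not shown) is:

#yum install -y yum-utils
#yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo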
If Docker is already installed, you can verify it with
#rpm -q docker-ce or #docker --version

Check the status of Docker:
#systemctl status docker
If the service is not running, start it:
#systemctl start docker
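To make Docker start automatically after a reboot as well (optional, but convenient for a VM you keep reusing):
#systemctl enable docker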
Now that we have the Docker service running on our RedHat 8 system, we need to pull the CentOS image from Docker Hub.
To pull any image:
#docker pull image_name:version
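For example, to pull the latest CentOS image used here:
#docker pull centos:latest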

Now the latest CentOS image has been downloaded and we can create a container from it.
To create a container, use the command below:
#docker run -it --name os2 centos:latest

The "-it" option means interactive terminal. It gives us a shell (terminal) so we can interact with the OS inside the container.

Now we have a new container named os2, created from the latest CentOS image. You can check whether the command succeeded or failed with echo $? (where 0 = success and anything other than 0 = failed).
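Note that typing exit inside the container stops it. These are standard Docker CLI commands for getting back in later:

#docker start os2
#docker attach os2

or, to open an additional shell in an already running container:

#docker exec -it os2 bash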

Now, we need to install Python 3 inside the container so that we can install the various Machine Learning libraries.
To do this, run the command

#yum install python3-pip
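To confirm the installation (an optional check):

#python3 --version
#pip3 --version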

Now, install the pandas library so that we can load the dataset. Run the command #pip3 install pandas

Installing pandas also pulls in numpy as a dependency. Next, we need the scikit-learn library, which provides functions to create ML models.
Now install all the required packages:

#pip3 install numpy
#pip3 install pandas
#pip3 install scikit-learn
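As a quick, optional sanity check that the libraries import correctly:

#python3 -c "import numpy, pandas, sklearn; print(numpy.__version__, pandas.__version__, sklearn.__version__)"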

Now our base environment is ready and we need to get the dataset inside the Docker container.
There are several ways to do this; one of the easiest is described below.
Here, I have my dataset on my Windows machine, so first of all we have to transfer the dataset file from Windows to the RHEL8 VM. To do this, install the WinSCP software.

Provide the hostname as the IP address of the RHEL8 VM (check it using the ifconfig command), then provide the username and password. Now you can drag and drop.

Just drag and drop the file; the left side is your Windows machine and the right side is the RHEL8 VM.
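If you prefer the command line, recent Windows 10/11 builds include an OpenSSH client, so the same transfer can also be done with scp from PowerShell (the file path and IP address below are placeholders):

scp C:\path\to\salary.csv root@<RHEL8_IP>:/root/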
Now we have the file on the Linux system at the /root/ path. To copy it into the container, use docker cp:
docker cp <SOURCEFILE_PATH> <CONTAINER_NAME>:<DESTINATION_PATH>
SOURCEFILE_PATH: Path to the file on your base OS, i.e. RHEL8 here.
CONTAINER_NAME: Name of the container into which you want to transfer the file.
Note: The container should be running.
DESTINATION_PATH: Path inside the docker container where you want the file copied from the base OS.

Now, our SOURCEFILE_PATH is /root/salary.csv
First, let us create a workspace inside our docker container (run this in the container's terminal):
#mkdir /root/salary-predict-App

Go back to your base OS, open a new terminal window and run the command below.
#docker cp /root/salary.csv os2:/root/salary-predict-App/
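To verify that the file landed inside the container, you can list the workspace from the base OS with the standard docker exec command:

#docker exec os2 ls /root/salary-predict-App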

Now it is time to create a Python script which trains our model and saves it in our workspace.
Create the file using vi or vim. Run the command #vi salaryml.py
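Here is a minimal sketch of what salaryml.py could look like, assuming salary.csv has the columns YearsExperience and Salary and that a simple LinearRegression model saved with pickle is enough; adjust the column names and model to match your own dataset.

# salaryml.py - train a simple regression model on salary.csv
# Assumption: salary.csv has the columns "YearsExperience" and "Salary"
import pickle

import pandas as pd
from sklearn.linear_model import LinearRegression

# load the dataset copied into the workspace
dataset = pd.read_csv("/root/salary-predict-App/salary.csv")

X = dataset[["YearsExperience"]]  # feature: years of experience
y = dataset["Salary"]             # target: salary

# train the model
model = LinearRegression()
model.fit(X, y)

# save the trained model in the workspace
with open("/root/salary-predict-App/salary_model.pkl", "wb") as f:
    pickle.dump(model, f)

print("Model trained and saved to /root/salary-predict-App/salary_model.pkl")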

Now, when we run the salaryml.py file, our model will be created and saved inside our workspace, i.e. /root/salary-predict-App.
Let us check whether it works. Run the file using the command #python3 salaryml.py

Now we have the model created and saved in this directory. If we want to predict the salary for a given number of years of experience, create a file predict.py in the same workspace.

Now we can run this script. It will first load the saved model and then predict the salary based on the years of experience we provide.
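Here is a matching minimal sketch of predict.py, assuming the model was saved as salary_model.pkl by the training sketch above and that the feature column is named YearsExperience.

# predict.py - load the saved model and predict a salary
import pickle

import pandas as pd

# load the model trained by salaryml.py
with open("/root/salary-predict-App/salary_model.pkl", "rb") as f:
    model = pickle.load(f)

# ask for the years of experience to predict for
years = float(input("Enter years of experience: "))

# build a one-row DataFrame with the same column name used during training
features = pd.DataFrame({"YearsExperience": [years]})
prediction = model.predict(features)

print("Predicted salary:", round(float(prediction[0]), 2))

Run it with #python3 predict.py, enter the years of experience when prompted, and the predicted salary is printed.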
