
โšœ ๐— ๐—ฎ๐—ฐ๐—ต๐—ถ๐—ป๐—ฒ ๐—Ÿ๐—ฒ๐—ฎ๐—ฟ๐—ป๐—ถ๐—ป๐—ด ๐˜‚๐˜€๐—ถ๐—ป๐—ด ๐—–๐—ผ๐—ป๐˜๐—ฎ๐—ถ๐—ป๐—ฒ๐—ฟ๐—ถ๐˜‡๐—ฎ๐˜๐—ถ๐—ผ๐—ป ๐—ง๐—ฒ๐—ฐ๐—ต๐—ป๐—ผ๐—น๐—ผ๐—ด๐˜†๐Ÿ’ป โšœ

Technology is not what it was years ago. It has changed and continues to evolve. 💻

In today's world, Machine Learning is arguably the most trending technology, and almost all big companies are trying to automate things using Artificial Intelligence.

But the challenge they face is that the ML models they create often never get deployed to production.

In large enterprises, the share of AI models that are created but never put into production has been estimated at 90% or more.

With massive investments in data science teams, platforms, and infrastructure, the number of AI projects is dramatically increasing โ€” along with the number of missed opportunities. Unfortunately, most projects are not showing the value that business leaders expect and are introducing new risks that need to be managed.

✨ The solution to the above problem statement is MLOps. ✨

Simply put, MLOps is DevOps applied to machine learning. MLOps can be seen as the collaboration between data scientists and the production or operations team of an organization. Collaborative in nature, MLOps aims to eliminate unnecessary work, automate processes as much as possible, and help produce consistent, richer business insights from machine learning.


๐Ÿ“ The first thing we need for MLOps is โ€” ๐—–๐—ผ๐—ป๐˜๐—ฎ๐—ถ๐—ป๐—ฒ๐—ฟ๐—ถ๐˜‡๐—ฎ๐˜๐—ถ๐—ผ๐—ป ๐˜๐—ฒ๐—ฐ๐—ต๐—ป๐—ผ๐—น๐—ผ๐—ด๐˜†.

Containerization is a form of operating-system-level virtualization in which applications run in isolated user spaces called containers, all sharing the same operating system (OS) kernel.

For containerization we have many tools, such as Docker and Podman.

Now let's get into some hands-on practice implementing the topic. 😁

Before that, certain prerequisites need to be taken care of. I will be using RedHat Enterprise Linux (RHEL) 8 for this practical.

โ— Your yum should be configured in the system. โ—

Now, to install Docker Community Edition, create a repo file in /etc/yum.repos.d/ with the name docker-ce.repo:

touch /etc/yum.repos.d/docker-ce.repo

๐Ÿ“ Note: The name of the repo file can be anything of your choice but the extension should be โ€œ.repoโ€ only.

Now open the repo file with any editor of your choice and add the Docker repository configuration.
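A typical docker-ce.repo for RHEL/CentOS-family systems, following Docker's official repository layout, looks like the following (verify the URLs against Docker's install documentation before using them):

```ini
[docker-ce-stable]
name=Docker CE Stable - $basearch
baseurl=https://download.docker.com/linux/centos/$releasever/stable/$basearch
enabled=1
gpgcheck=1
gpgkey=https://download.docker.com/linux/centos/gpg
```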

With that, the Docker repository is configured on your system. ✨

Run the following command to install Docker:

yum install docker-ce --nobest -y

It installs the Docker Community Edition package along with all the dependencies it requires.

Here I will use a Dockerfile to create the container image. The image will contain the machine learning code and the dataset.

The goal is that when we run a container from this image, the ML code trains the model and then starts the app for prediction.

โ— This point will be more clear when we go onto the practicals.

First create a directory containing all the required files.

My directory structure looks as follows:

[root@localhost ML]# tree
.
├── Dockerfile
├── Salary_Data.csv
├── salaryLR.py
└── salary_predictor.py
0 directories, 4 files
  • "Salary_Data.csv" is the dataset file.
  • "salaryLR.py" is the file that contains the Linear Regression code.
  • "salary_predictor.py" is the app that will be used by the user to predict the salary.
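Neither Python file is shown in the post, so here is a minimal stand-in for salaryLR.py, assuming the dataset has the usual YearsExperience and Salary columns. The real script presumably uses pandas, scikit-learn, and joblib (as the Dockerfile's pip3 install suggests); this sketch sticks to the standard library so the idea stays clear:

```python
# Hypothetical stand-in for salaryLR.py (the real file is not shown in
# the post). Trains a one-variable linear regression on Salary_Data.csv
# and saves the fitted model to disk for the predictor app.
import csv
import pickle

def fit_line(xs, ys):
    """Ordinary least squares for a single feature: (slope, intercept)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

def train(csv_path="Salary_Data.csv", model_path="salary_model.pkl"):
    # Column names are an assumption based on the common salary dataset.
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    xs = [float(r["YearsExperience"]) for r in rows]
    ys = [float(r["Salary"]) for r in rows]
    model = fit_line(xs, ys)
    with open(model_path, "wb") as f:
        pickle.dump(model, f)
    return model
```

The Dockerfile runs this script at image-build time, so the trained model gets baked into the image; the real file would call train() at the bottom.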

Now letโ€™s come onto the Dockerfile.

FROM centos:latest
MAINTAINER RAHUL SIL <rahul.official.150@gmail.com>
RUN dnf install python3 ncurses net-tools -y && \
    pip3 install numpy pandas joblib scikit-learn
WORKDIR /root/ML
COPY * /root/ML/
RUN python3 salaryLR.py

CMD ["python3", "salary_predictor.py"]
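salary_predictor.py is also not shown, so here is a sketch of what such an app might look like, loading the model saved at build time and predicting from user input. Again, the real app presumably uses joblib; pickle is used here only to keep the sketch self-contained, and every name except salary_predictor.py is hypothetical:

```python
# Hypothetical stand-in for salary_predictor.py (the real file is not
# shown in the post). Loads the model trained at image-build time and
# predicts a salary for user-supplied years of experience.
import pickle

def load_model(model_path="salary_model.pkl"):
    # Assumes the trainer pickled a (slope, intercept) pair.
    with open(model_path, "rb") as f:
        return pickle.load(f)

def predict_salary(model, years):
    slope, intercept = model
    return slope * years + intercept

def main():
    model = load_model()
    years = float(input("Years of experience: "))
    print(f"Predicted salary: {predict_salary(model, years):.2f}")
```

Because the Dockerfile's CMD starts this script, the container must be run with an interactive terminal so input() can prompt the user.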

Let's build the image from the Dockerfile:

docker build -t linear_regression:v1.0 .

After the build completes, you can confirm that a new linear_regression:v1.0 image has been created and stored in the local repository using the following command:

docker images

Now let's run a container from the image and use the ML model. The -it flags give an interactive terminal, which the prediction app needs for user input:

docker run -it linear_regression:v1.0


This way, we have successfully implemented the task of running a Machine Learning model inside a Docker container. ✨😁

I will soon come up with a blog explaining the basics of Docker and the Docker and Dockerfile commands that I have used here. Stay tuned for that!!

I hope you liked this article. 💖

I would love to hear your views and feedback so that I can improve on those points in future articles. 🙌 Comment your views below.

You can also check my LinkedIn profile and connect with me.

Follow me on Medium, as I will come up with articles on various technologies like Cloud Computing, DevOps, and Automation, and their integration.

That's all for now. Thank you!! 😊✌

Rahul Sil

I am a tech enthusiast. I love exploring new technologies and creating stuff with them!! ✌