How to Deploy AI models? Part 7- Deploying Web-application on Heroku via Docker

This part is a continuation of Deploying AI models Part-3, where we deployed an Iris classification model. You can skip the training section if you have already read Part-3 of this series. In this article, we will use Flask as the front end of our web application and deploy the trained model for classification on the Heroku platform with the help of Docker.

Note: If you have followed my Model Deployment series from the start, you can skip Section 1.

Article: Deploying AI models Part-3

1. Iris Model Web Application using Flask

1.1. Packages

The following packages were used to create the application.

1.1.1. Numpy

1.1.2. Flask, request, render_template from flask

1.1.3. Dataset, Ensemble, Model Selection from sklearn

1.2. Dataset

The dataset used to train the model is the Iris dataset, composed of 4 predictors and 3 target classes.

Predictors: Sepal Length, Sepal Width, Petal Length, Petal Width

Target: Setosa [0], Versicolor [1], Virginica [2]

Dataset Shape: 150 * 5

Figure 1. Dataset Overview
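
If you want to verify these numbers yourself, the short snippet below is a quick check (not part of the original application code) assuming scikit-learn is installed; it prints the shape and class names of the Iris dataset.

import sklearn.datasets

# quick sanity check of the dataset described above
data = sklearn.datasets.load_iris()
print(data.data.shape)     # (150, 4) -> 150 samples, 4 predictors
print(data.target_names)   # ['setosa' 'versicolor' 'virginica']
print(set(data.target))    # {0, 1, 2} -> encoded class labels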

1.3. Model

To train the model, we followed this procedure:

Data Split: 80:20, i.e. 80% for the training set and 20% for the test set

Model: Ensemble - RandomForestClassifier with n_estimators=500

Saving Model: Saved as a pickle file

Below is the code for training the model.

import sklearn
import sklearn.datasets
import sklearn.ensemble
import sklearn.model_selection
import pickle
import os

# load data
data = sklearn.datasets.load_iris()

# split the data into train and test sets
train_data, test_data, train_labels, test_labels = sklearn.model_selection.train_test_split(data.data, data.target, train_size=0.80)
print(train_data, train_labels)

# train a model using random forest
model = sklearn.ensemble.RandomForestClassifier(n_estimators=500)
model.fit(train_data, train_labels)

# test the model
result = model.score(test_data, test_labels)
print(result)

# save the model
filename = 'iris_model.pkl'
pickle.dump(model, open(filename, 'wb'))
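
As a quick sanity check (not part of the original script), you can reload the saved pickle file and classify a single sample before wiring it into the web application; the feature values below are just illustrative.

import pickle
import numpy as np

# reload the model that was just saved and try one prediction
model = pickle.load(open('iris_model.pkl', 'rb'))
sample = np.array([[5.1, 3.5, 1.4, 0.2]])  # example sepal/petal length and width
print(model.predict(sample))               # e.g. [0] -> setosa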

1.4. Frontend using Flask

To feed values into the trained model we need a user interface that accepts data from the user and passes it to the model for classification. As described in Section 1.2 Dataset, we have 4 predictors and 3 classes to classify.

File name: index.html, which should be placed inside the templates folder.

<h1>Sample IRIS Classifier Model Application</h1>
<form action="{{ url_for('predict') }}" method="post">
<input type="text" name="sepal_length" placeholder="Sepal Length" required="required" />
<input type="text" name="sepal_width" placeholder="Sepal Width" required="required" />
<br>
<br>
<input type="text" name="petal_length" placeholder="Petal Length" required="required" />
<input type="text" name="petal_width" placeholder="Petal Width" required="required" />
<br>
<br>

<br>
<button type="submit" class="btn btn-primary btn-block btn-large">Submit Data</button>
</form>
<br>
<br>
<b> {{ output }} </b>

The code above takes the input from the user and displays the result on the same page. The form posts the data to the predict() function through action="{{ url_for('predict') }}", and after prediction the output is rendered back into the same page via the {{ output }} placeholder.

File name: app.py

import numpy as np
from flask import Flask, request, jsonify, render_template
import pickle
import os

# app name
app = Flask(__name__)

# load the saved model
def load_model():
    return pickle.load(open('iris_model.pkl', 'rb'))

# home page
@app.route('/')
def home():
    return render_template('index.html')

@app.route('/predict', methods=['POST'])
def predict():
    ''' For rendering results on the HTML GUI '''
    labels = ['setosa', 'versicolor', 'virginica']
    features = [float(x) for x in request.form.values()]
    values = [np.array(features)]
    model = load_model()
    prediction = model.predict(values)
    result = labels[prediction[0]]
    return render_template('index.html', output='The Flower is {}'.format(result))

if __name__ == "__main__":
    port = int(os.environ.get('PORT', 5000))
    # bind to 0.0.0.0 so the app is reachable from outside the Docker container
    app.run(host='0.0.0.0', port=port, debug=True, use_reloader=False)

In the Python script, we render the index.html page in home() and load the pickle file in the load_model() function.

As mentioned above, we use the same index.html both for user input and for rendering the result. When the form is submitted via the POST method, the data is sent to predict() in app.py, the model loaded via load_model() makes a prediction, and the result is mapped to the corresponding class name.

To display the result we render the same template, i.e. index.html. Recall the {{ output }} placeholder we used in the index.html page; after prediction, its value is filled in by the following Flask call.

render_template('index.html', output='The Flower is {}'.format(result))

where index.html is the template name and output='The Flower is {}'.format(result) is the value rendered after prediction.
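
If you want to exercise this flow without opening a browser, a minimal sketch like the one below uses Flask's built-in test client; it assumes the Flask object in app.py is named app, as in the code above, and that iris_model.pkl is in the working directory.

from app import app  # assumes app.py defines the Flask object `app` as shown above

# post one set of measurements to /predict and check the rendered response
client = app.test_client()
form = {
    'sepal_length': '5.1',
    'sepal_width': '3.5',
    'petal_length': '1.4',
    'petal_width': '0.2',
}
response = client.post('/predict', data=form)
print(response.status_code)               # 200
print(b'The Flower is' in response.data)  # True if the template rendered the output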

1.5. Extracting Packages and their respective versions

We need to create a requirements.txt file that contains the names of the packages used in our application along with their respective versions. The process of extracting the requirements.txt file is explained in the Article: Deep Learning/Machine Learning Libraries — An overview.
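
If you don't have that article handy, the usual one-liner is pip freeze > requirements.txt; a rough Python alternative (assuming Python 3.8+ for importlib.metadata) is sketched below. Note that it lists everything installed in the environment, so trim the result down to the packages your application actually imports.

from importlib import metadata

# write name==version for every package installed in the current environment
lines = sorted(
    '{}=={}'.format(dist.metadata['Name'], dist.version)
    for dist in metadata.distributions()
    if dist.metadata['Name']  # skip entries with broken metadata
)
with open('requirements.txt', 'w') as f:
    f.write('\n'.join(lines) + '\n')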

For this application, below is the content of requirements.txt.

Flask==1.1.2 
joblib==1.0.0
numpy==1.19.3
scikit-learn==0.23.2
scipy==1.5.4
sklearn==0.0

2. Docker

Docker is a tool designed to make it easier to build, deploy, and run applications using containers. Containers allow a developer to package an application with everything it needs, such as libraries and other dependencies, and ship it as one unit. Because of the container, the developer can be confident that the application will run on any other Linux machine regardless of customized settings that might differ from the machine used for writing and testing the code.

Docker behaves somewhat like a virtual machine. However, unlike a virtual machine, rather than creating a whole virtual operating system, Docker allows applications to use the same Linux kernel as the host they are running on and only requires that applications be shipped with things not already present on the host. This gives a significant performance boost and reduces the size of the application. Importantly, Docker is open source, so anyone can contribute to it and extend it to meet their own needs if they require features that are not available out of the box.

Here, we will write the Dockerfile, which builds the Docker image that we can ship anywhere irrespective of the user's environment. The Dockerfile is shown below, and I will explain it line by line.

FROM ubuntu:latest #line 1
RUN apt-get update -y #line 2
RUN apt-get install -y python-pip python-dev build-essential #line 3
COPY . /app #line 4
WORKDIR /app #line 5
RUN pip install -r requirements.txt #line 6
ENTRYPOINT ["python"] #line 7
CMD ["app.py"] #line 8

Above is the Dockerfile, where:

Line 1: States that we will use the latest version of Ubuntu as our base image. Since Docker was initially built on Linux, I have used a Linux base image. This is optional; you could install Python directly and build the container, but it is recommended to start from a base image.

Line 2: Updates the package lists of the Ubuntu system.

Line 3: Installs Python, pip, and the build tools.

Line 4: Copies the folder containing our scripts and supporting files for the application into the Docker environment.

Line 5: Sets /app, where all the scripts live inside Docker, as the working directory.

Line 6: Installs all the Python packages required for the web application. Check the article below to see how to extract the installed packages in Python.

Article: Deep Learning/Machine Learning Libraries — An overview

Line 7 and Line 8: Specify that the container runs app.py with the Python interpreter when it starts.

Building Docker image:

command: docker build -t streamlit-heroku:latest .

The above command builds the container image as described in the Dockerfile.

Running the Container:

The command below runs the container. Docker has its own IP address and ports, just like any other OS, so we need to map the container port to a port on the host system in order to access the web application. This is done with -p 5000:5000 (the Flask app here listens on port 5000 by default); alternatively, you can add EXPOSE 5000 to the Dockerfile and run the container with -P to publish the exposed port automatically.

command: docker run -d -p 5000:5000 streamlit-heroku:latest

3. Heroku

Heroku is a PaaS (Platform as a Service) that supports many programming languages. When it launched in 2007 it supported only Ruby, but it now supports languages such as Java, Node.js, Scala, Clojure, Python, PHP, and Go. It is also known as a polyglot platform, since it lets developers build, run, and scale applications in a similar manner across most of these languages. It was acquired by Salesforce.com in 2010.

Applications that run on Heroku typically have a unique domain used to route HTTP requests to the correct application container, or dyno. The dynos are spread across a "dyno grid" consisting of several servers. Heroku's Git server handles application repository pushes from permitted users. All Heroku services are hosted on Amazon's EC2 cloud-computing platform.

You can register on this link and host up to five applications with a student account.

If you have gone through Part 5 of the Model Deployment series this will be easy for you, but it is always recommended to check out the link below.

Article: How to Deploy AI model? Part-5 Heroku CLI

Commands to deploy the container on Heroku using the CLI:

heroku container:login
heroku create dockerapp
heroku container:push web --app dockerapp

You will get messages in response confirming that the Docker image has been built, tagged, and successfully pushed. We are almost done with the deployment: the push uploads the container image to Heroku's registry, but it is not live until it is released. The command below releases the container.

heroku container:release web --app dockerapp

Once it is released, you will get a "done" message. Now it is time to check out our app running on Heroku.

Congratulations! We have completed our Model Deployment series.

Special Thanks:

As we say, "a car is useless if it doesn't have a good engine"; similarly, a student cannot go far without proper guidance and motivation. From the bottom of my heart, I would like to thank my Guru as well as my idol, Dr. P. Supraja, who guided me throughout this journey. As a Guru, she has lighted the best available path for me and motivated me whenever I encountered failure or a roadblock; without her support and motivation this would have been an impossible task for me.

Reference:

Extract installed packages and version: Article Link

Deployed Application: Link

GitHub Documentation: Link

Extract installed packages and version (notebook): Notebook Link

YouTube: Link

If you have any queries, feel free to contact me via any of the options below:

Website: www.rstiwari.com

Github page: Link

Medium: https://tiwari11-rst.medium.com

Google Form: https://forms.gle/mhDYQKQJKtAKP78V7


