Deploy Machine Learning Models in a Production environment as APIs (Python Flask + Visual Studio)

Building intelligent applications largely consists of integrating machine learning based predictive components into apps and systems. In most teams, data scientists or AI engineers are responsible for building these machine learning models.

When it comes to integration and deployment in a production environment, the problem is platform dependency. Most data scientists and AI engineers are comfortable with Python or R and develop their models in those languages, while the rest of the system may be a .NET or Java based application.

One of the cleanest ways to connect these components is to deploy the ML predictive module as a web API and call that API from the application. Any programmer can work with an API once they have its definition.

Flask is a small and powerful web framework for Python. It’s easy to learn and simple to use, letting you build a web app in a short amount of time. Visual Studio provides an easy way to create Python Flask web applications through its templates. Here are the steps I’ve gone through to deploy the ML experiment as a REST API.

01. Create the machine learning model, train, tune and evaluate it.

What I’ve done here is a simple linear regression that predicts the monthly salary from the years of experience. The scikit-learn Python library has been used for the regression. The dataset used for the experiment is from SuperDataScience.

The code is available in the GitHub repository.

02. Creating the pickle

When you deploy the predictive model to a production environment, there is no need to retrain it from code again and again. Python has a built-in way of persisting objects called pickle: the pickle module serializes objects into a file that can be saved and loaded later, so the API only needs the binaries to generate predictions. scikit-learn ships its own model persistence helper, joblib, which is what we’ll use here; it is more efficient for scikit-learn models because it is better at handling the large NumPy arrays they may contain.
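To make the idea concrete, here is a minimal sketch of the persistence step. The numbers are made-up stand-ins for the SuperDataScience salary dataset, and the file name model.pkl is my choice, not anything fixed (on newer scikit-learn versions joblib is a standalone package: import joblib).

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.externals import joblib  # newer scikit-learn: `import joblib`

# Tiny stand-in data: years of experience vs. monthly salary
X = np.array([[1.0], [2.0], [3.0], [5.0], [10.0]])
y = np.array([30000, 35000, 40000, 52000, 80000])

regressor = LinearRegression()
regressor.fit(X, y)

# Serialize the fitted model; the API only needs this file, not the training code
joblib.dump(regressor, 'model.pkl')

# Later, e.g. inside the Flask app, the model is restored without retraining
model = joblib.load('model.pkl')
print(model.predict([[4.0]]))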

03. Create a Python Flask web application.

Open Visual Studio (I’m using VS2017, which includes Python support by default) and create a new Python web project. The step-by-step guide is here. I’d recommend going with option 2 mentioned in the blog, because it reduces a lot of unnecessary overhead.

To be on the safe side, use Python virtual environments; they avoid many of the hassles that occur with library dependencies. I’ve used an Anaconda environment as the base of the virtual environment.


04. Create the API.

Create a new Python file in your project and set it as the startup file (in my case MLService.py is the startup file that contains the API code). The pickle file containing the model binaries is the only dependency the API needs when it is deployed.

Here the API operates through POST methods that accept the input as JSON, roughly as sketched below.
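In case the screenshot isn’t readable, this is a minimal sketch of what MLService.py could look like. The route name /predict, the JSON field yearsOfExperience and the model.pkl file name are my assumptions for illustration, not the exact code from the repo.

from flask import Flask, request, jsonify
from sklearn.externals import joblib  # newer scikit-learn: `import joblib`

app = Flask(__name__)

# Load the persisted model once at startup; no retraining needed
model = joblib.load('model.pkl')

@app.route('/predict', methods=['POST'])
def predict():
    # Expecting a JSON body such as {"yearsOfExperience": 4.5}
    data = request.get_json(force=True)
    years = float(data['yearsOfExperience'])
    prediction = model.predict([[years]])
    return jsonify({'predictedSalary': float(prediction[0])})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)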

05. Run & Test

You can run the API and test it by sending POST requests to the URL with a JSON body. Here I’ve used Postman to send a POST request, and it returns the predicted salary for the entered years of experience.
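Any HTTP client works; from Python the call could look like this, following the URL, port and field name assumed in the sketch above.

import requests

# Endpoint and field name match the hypothetical MLService.py sketch above
response = requests.post('http://localhost:5000/predict',
                         json={'yearsOfExperience': 4.5})
print(response.json())  # something like {"predictedSalary": ...}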


You can access the whole code of the project through my GitHub repo here.


Do comment if you have any suggestions to change the API structure.


Handling Big snakes on Visual Studio

In the last post we discussed setting up a Windows rig for deep learning. If you still haven’t set up your machine, go do that first :D

After getting the so-called big snakes, Python and Anaconda, onto the machine, we need a proper IDE for coding.

There are many good IDEs you can use on Windows to code in Python; PyCharm and Spyder are some popular tools.

If you are familiar with Visual Studio, the so-called father of all IDEs, Python works smoothly with VS. There are a few configurations that need to be done.

There is no need to purchase Visual Studio Enterprise or Ultimate; the freely available Visual Studio Community edition works fine. In the 2017 version, Python comes alongside the default installation options. For earlier versions you need to install Python Tools for Visual Studio (PTVS) separately.

https://docs.microsoft.com/en-us/visualstudio/python/python-in-visual-studio

Refer to this guide for more details.

The Python environments configured on the machine can be seen in the ‘Python Environments’ pane of Visual Studio. (If it is not there, go to Tools -> Python -> Python Environments.)


By default, your Anaconda environment and the default Python environment should be there. First refresh those environments so IntelliSense can pick up the installed libraries for its completion database.

For our deep learning experiments, we configured a separate Python environment earlier. To add that environment to Visual Studio, follow these steps.

01. Click Custom on ‘Python environments’

02. Go to your Anaconda environments and activate the pre-configured environment for deep learning (mine is tensorflow-gpu)


03. Copy the interpreter path of the environment

04. Paste it as the interpreter path and click ‘Auto Detect’. Visual Studio will detect the rest


05. Click Apply

It may take a few minutes to refresh the packages as well as IntelliSense. Make the configured environment your default and open the Interactive window. You are good to go 😊
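To confirm Visual Studio picked up the right interpreter, you can run something like this in the Interactive window (the environment name tensorflow-gpu is mine, from the step above):

import sys
print(sys.executable)   # should point inside the tensorflow-gpu Anaconda environment

import tensorflow as tf
print(tf.__version__)   # confirms the environment's packages are visible to VS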

Configuring a Windows Running Deep Learning Rig

When it comes to deep learning, the first thing that comes to mind is computation power. The thousands of matrix operations you are going to perform when training deep neural networks would take ages if you only used the CPU.

The solution is Graphics Processing Units (GPUs).

There are a few ways to get that kind of computation power for deep learning.

No offence, but in my experience the Linux operating system (I use the Ubuntu flavor) is handier for deep learning work in Python, because the terminal, bash commands, open source editing tools and GPU hackability come a bit more easily there.

But recent Windows and Visual Studio updates make it possible to do deep learning on a Windows rig too.

Here are the steps I’ve followed to configure my laptop to perform some DL based computations with Tensorflow and Keras.

The laptop I’m using is an Asus UX310UA with a 7th Gen Core i7 processor, 16 GB RAM and an Nvidia GeForce 940MX 2 GB GPU.

I’m running Windows 10 Enterprise 1703 build on my laptop.

Please note that the following steps may vary depending on your setup.

1. Check the compute capability of your GPU

If you wish to use your GPU for parallel processing, first check whether your GPU device supports CUDA. The more CUDA cores you have, the more computation you get: for example, the Nvidia Tesla K80 has 4992 CUDA cores while the GeForce 940MX is equipped with 384. The GPU compute capability should be 3.0 or higher.

Check whether your GPU appears in the list here:

https://www.geforce.com/hardware/technology/cuda

 

2. Install CUDA Toolkit

Installing CUDA on Windows depends on having a C++ compiler. The CUDA version I’ve installed on my laptop is CUDA 8.0, and along with it I’ve installed the Visual C++ 15.0 compiler. Refer to the following guide to install the CUDA Toolkit on your computer.

 http://docs.nvidia.com/cuda/cuda-installation-guide-microsoft-windows/index.html

 

3. Install CuDNN Tools

For faster computations, you need to install the CUDA Deep Neural Network library (cuDNN). Depending on the CUDA version you’ve installed, you should select the appropriate cuDNN version. In my case, with CUDA 8.0, both cuDNN 7.0 and cuDNN 6.0 work. When it came to package installations, cuDNN 7.0 threw me some errors, so I went with cuDNN 6.0 and it’s working fine on my machine 😊

Note that you need to copy some files manually in this step.

http://docs.nvidia.com/deeplearning/sdk/cudnn-install/index.html#install-windows

To be on the safe side, restart the machine now! It’ll then prompt you about any additional dependencies the GPU asks you to install.

 

4. Install Anaconda

Now it’s time for the big snake! Anaconda is the leading Python data science platform. It comes with many essential libraries and configurations pre-installed that you’ll need regularly. Go with the Python 3 version since it is the latest.

https://www.anaconda.com/download/

 

5. Create a Python environment for your experiments

Python comes with a hell of a lot of libraries that your program might need. The best thing is to create a separate environment for deep learning and use it; this keeps you from tangling up dependencies among libraries.

Open the Anaconda prompt (find it in the Start menu; it’s advisable to open it as administrator) and run the command below. We are using Python 3.5 at the moment, and ‘tensorflow-gpu’ is the environment name.

conda create -n tensorflow-gpu python=3.5 anaconda

Activate the environment

activate tensorflow-gpu


6. Install Theano

Theano is a Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently. We need it! Make sure you are installing all of these inside your environment.

conda install theano

 

7. Install mingw and libpython

Even though Python is an interpreted language, you may need to install Windows C++ compilers in some cases. For Python 3.5/3.6 you can use the Visual C++ 14.0 compiler.

conda install mingw libpython

 

8. Install TensorFlow

TensorFlow is an open source library for numerical computation. If you don’t have a GPU in your machine, install the CPU-only package (tensorflow) instead of tensorflow-gpu.

pip install tensorflow-gpu

 

9. Install Keras

Keras is a high-level neural network API that can run on top of TensorFlow, CNTK or Theano. For ease of coding, we’ll install Keras too.

conda install keras

 

10. Update all the packages

conda update --all

All set! 😊 Now you are ready to start coding. Start with your favorite IDE; I prefer Spyder and sometimes Visual Studio. You can launch Spyder directly from the Anaconda prompt or Anaconda Navigator.
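As a quick sanity check that TensorFlow actually sees the GPU, you can run a small snippet like this (TensorFlow 1.x style API, matching the versions used above):

import tensorflow as tf
from tensorflow.python.client import device_lib

# Lists CPU and GPU devices; a working setup shows a '/device:GPU:0' entry
print(device_lib.list_local_devices())

# A tiny computation placed explicitly on the GPU
with tf.device('/gpu:0'):
    a = tf.constant([1.0, 2.0, 3.0])
    b = tf.constant([4.0, 5.0, 6.0])
    c = a + b

# log_device_placement prints which device each op actually ran on
with tf.Session(config=tf.ConfigProto(log_device_placement=True)) as sess:
    print(sess.run(c))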

We’ll discuss working with Python on Visual Studio in the next article.

Simple Linear Regression with Azure ML + Python

Simple linear regression is a statistical method that allows us to summarize and study relationships between two continuous (quantitative) variables: one variable, denoted x, is regarded as the predictor, explanatory, or independent variable; the other, denoted y, is regarded as the response, outcome, or dependent variable.

Typically, when doing regression analysis, we consider the correlation coefficients of the input variables. Correlation analysis measures the extent to which two variables vary together, including the strength and direction of their relationship.

The linear correlation coefficient (also called the Pearson product-moment correlation coefficient) is a measure of the strength and direction of a linear association between two random variables.

I used the Istanbul Stock Exchange dataset to demonstrate the steps of a simple linear regression prediction. An Azure Machine Learning experiment has been built (get the experiment from here) for the regression model, using the built-in Bayesian Linear Regression algorithm.

The most interesting part comes with Python! 🙂

I’ve used a Jupyter Notebook and fetched the data into that workspace to visualize the dataset and to calculate the correlation coefficients between the variables. The pearsonr method in the SciPy library has been used for that, roughly as sketched below.
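A minimal sketch of the coefficient calculation, assuming the data has already been loaded into a pandas DataFrame; the column names and values below are illustrative placeholders, not the exact ones from the Istanbul Stock Exchange dataset.

import pandas as pd
from scipy.stats import pearsonr

# Hypothetical frame; in the notebook the data is fetched from the AzureML workspace
df = pd.DataFrame({'ISE': [0.010, -0.020, 0.005, 0.030, -0.010],
                   'SP':  [0.012, -0.015, 0.002, 0.025, -0.008]})

r, p_value = pearsonr(df['ISE'], df['SP'])
print('correlation coefficient:', r)
print('p-value:', p_value)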

Refer to the IPython notebook on Azure Notebooks for the complete Python script and the visualizations.

https://notebooks.azure.com/library/Python%20Visualizations/html/Istanbul%20Stock%20Python%203%20notebook.ipynb

Do run the code on your own. You’ll get it for sure!

 

Natural Language Processing with Python + Visual Studio

Human language is one of the most complicated phenomena for machines to interpret. Compared to artificial languages like programming languages and mathematical notation, natural languages are hard to capture with explicit rules. Natural Language Processing, AKA computational linguistics, enables computers to derive meaning from human or natural language input.

When it comes to natural language processing, text analysis plays a major role. One of the big problems we face when processing natural language is computation power: working with large corpora and chunking the textual data into n-grams needs serious processing power. The almighty cloud, the ultimate savior, comes in handy in this application too.

Let’s peek at some of the cool tools you can use in your development. In most cases you don’t want the hassle of developing from scratch; there are plenty of APIs and libraries you can integrate directly with your system.

If you want to go from scratch and make your own enhancements, there’s space for you too. 😊

Text Analytics APIs

The Microsoft Text Analytics APIs are a set of web services built with Azure Machine Learning. Many major natural language processing tasks are exposed as web services through them: the API can be used to analyze unstructured text for tasks such as sentiment analysis, key phrase extraction, language detection and topic detection. No hard-coded rules or training workloads; just call the API from your C# or Python code (a quick Python sketch follows the link below). Refer to the link below for more info.

https://docs.microsoft.com/en-us/azure/machine-learning/machine-learning-apps-text-analytics
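For a feel of what a call looks like, here’s a rough Python sketch of hitting the sentiment endpoint with the requests library. The region, API version and key below are placeholders; check the exact endpoint and request shape against the documentation linked above.

import requests

# Placeholders: use your own region, API version and subscription key
endpoint = 'https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment'
headers = {'Ocp-Apim-Subscription-Key': '<your-key>',
           'Content-Type': 'application/json'}
body = {'documents': [{'id': '1', 'language': 'en',
                       'text': 'The new Visual Studio tooling for Python is great!'}]}

response = requests.post(endpoint, headers=headers, json=body)
print(response.json())  # sentiment scores per document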

Process natural language from scratch!

Python! Yeah, that’s it! Among the many programming languages out there, Python comes in handy with many pre-built packages specifically designed for natural language processing.

Obviously, Python works well on Unix systems. But now the best IDE in town, Visual Studio, comes with a toolset for Python that enables you to edit, debug and run Python scripts from your existing IDE. You should have Visual Studio 2015 (Community, Professional or Enterprise edition) to install the Python tools. (https://www.visualstudio.com/vs/python/)

Here I’ve used NLTK (the Natural Language Toolkit) for the task. One of the main advantages of NLTK is that it comes with dozens of built-in corpora and trained models.

The language processing tasks and the corresponding NLTK modules, with examples of the functionality that comes with each, are listed in the NLTK book.


Source – http://www.nltk.org/book/ch00.html

When running NLTK for the first time, you may need to download nltk_data. Go to the Python interactive console and install the required data from the NLTK downloader that pops up (use nltk.download() for this task).
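If you prefer to skip the downloader UI, you can fetch just the data packages the demo below uses directly from code:

import nltk

# Tokenizer, POS tagger, NE chunker and stopword list used by the snippet below
for pkg in ['punkt', 'averaged_perceptron_tagger',
            'maxent_ne_chunker', 'words', 'stopwords']:
    nltk.download(pkg)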


Here’s a little demonstration of natural language processing tasks done using NLTK. The code snippets are commented for easy reading. 😊

import nltk
from nltk.corpus import stopwords

#Sample sentence used for processing
sentence = """John came to office at eight o'clock on Monday morning & left the office with Arthur bit early."""

#Tokenizing the sentence into words
word_tokens = nltk.word_tokenize(sentence)
#Tagging words with their parts of speech
tagged_words = nltk.pos_tag(word_tokens)
#Identifying named entities
named_entities = nltk.chunk.ne_chunk(tagged_words)

#Removing the stopwords from the text - predefined English stopwords have been used
stop_words = set(stopwords.words('english'))
filtered_sentence = [w for w in word_tokens if w not in stop_words]

print('Sentence - ' + sentence)
print('Word tokens - ')
print(word_tokens)
print('Tagged words - ')
print(tagged_words)
print('Named entities - ')
print(named_entities)
print('Filtered sentence - ')
print(filtered_sentence)

 

Executing the script prints the word tokens, the tagged words, the named entities and the filtered sentence. You can improve on these basics to build a named entity recognizer and much more…

Try processing the language you read and speak… 😉

Jupyter Notebook on AzureML

If you are fond of playing with data to dig out the relationships in it and to plot interesting visualizations, Python is the language you should speak.

Over the years, with strong community support, Python has gained dedicated libraries for data analysis and predictive modeling such as scikit-learn, TensorFlow and Theano. Even the ultimate IDE in town, Visual Studio, has started supporting Python! So, no hesitation: Python is a great choice.

You can use many IDEs or even a simple text editor to write your Python files. But Python also has a handy web application, the Jupyter notebook, that you can use to write your code and even run it!

Jupyter was born in 2014 as a spin-off of IPython, a command shell for interactive computing in multiple programming languages, originally developed for Python.

Why Jupyter?

The Jupyter notebook is a very popular tool among data scientists: a web application that allows you to create and share documents containing live code, equations, visualizations and explanatory text. “Jupyter” is a loose acronym for Julia, Python and R. One of its most prominent benefits is the ability to share your data transformation and visualization steps with your peers.

If you want to run the Jupyter notebook on your local machine, refer to the link below. With a few easy steps, you can have it up and running.

http://jupyter.readthedocs.io/en/latest/install.html

One of the easiest ways to use Jupyter is to run the notebook on Azure. There’s no need to have Python or its dependencies installed on your local machine: you can create, edit and share Jupyter notebooks using Azure Machine Learning Studio, and all the execution happens in the cloud.

Let’s get started!

Access your notebook from the “Notebooks” tab of AzureML Studio. When creating a new notebook, you can select which language and version you want; Python 2, Python 3 and R are the supported languages right now.

Just like a Jupyter notebook running on your local machine, you get the same IPython interface in your browser.

On the notebook menu bar, you’ll find the ‘Help’ menu, which contains a brief user interface tour as well as a list of keyboard shortcuts you can use to drive the notebook.

Here’s a little data mashup I’ve done using the famous Iris dataset included in Python’s sklearn. The .ipynb file is available on my GitHub repo; feel free to download it and play around. A static HTML page created from the notebook output is also included in the repo.
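If you just want to try something similar quickly, a small sketch along those lines (not the exact notebook code) could be:

import matplotlib.pyplot as plt
from sklearn.datasets import load_iris

iris = load_iris()

# Scatter plot of the first two features, coloured by species
plt.scatter(iris.data[:, 0], iris.data[:, 1], c=iris.target, cmap='viridis')
plt.xlabel(iris.feature_names[0])
plt.ylabel(iris.feature_names[1])
plt.title('Iris dataset: sepal length vs. sepal width')
plt.show()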

Azure is coming up with the Azure Notebooks preview feature. Here’s the Iris visualization hosted on Azure Notebooks:

https://notebooks.azure.com/library/Python%20Visualizations/html/Iris+Data+Visualization.ipynb

No machine learning algorithms or complex code snippets here, just data visualization and data transformation. 🙂