TensorFlow meets C# Azure Function

Meet

TensorFlow meets C# Azure Function and … . In this post I would like to show how to deploy a TensorFlow model with a C# Azure Function. I will use TensorFlowSharp, the .NET bindings to the TensorFlow library. The Inception model will be used to create an HTTP endpoint which recognizes images.

Code

I will start by creating a .NET Core class library and adding the TensorFlowSharp package:

dotnet new classlib
dotnet add package TensorFlowSharp -v 1.9.0

Then I create the file TensorflowImageClassification.cs:

Here I have defined the HTTP entry point for the Azure Function (the Run method). The q query parameter is taken from the URL and used as the address of the image which will be recognized.

The solution analyzes the image using a convolutional neural network based on the Inception architecture.

The function will automatically download the trained Inception model, so the first run will take a little bit longer. The model will be saved to D:\home\site\wwwroot\.

The convolutional neural network graph is kept in memory (graphCache), so the function does not have to read the model on every request. On the other hand, the input image tensor has to be prepared and preprocessed for every request (ConstructGraphToNormalizeImage).
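The full listing is not reproduced here, so below is a minimal sketch of how such a function can be put together with TensorFlowSharp. It is not the original code: the Run signature follows the precompiled HTTP-trigger style, DownloadModelIfMissing and NormalizeImage are hypothetical placeholders standing in for the download and ConstructGraphToNormalizeImage logic described above, and the input/output node names are assumptions.

using System.Collections.Concurrent;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using TensorFlow;

public static class TensorflowImageClassification
{
    // The imported Inception graph is cached between invocations,
    // so the model is read from disk only once per host instance.
    private static readonly ConcurrentDictionary<string, TFGraph> graphCache =
        new ConcurrentDictionary<string, TFGraph>();

    public static async Task<HttpResponseMessage> Run(HttpRequestMessage req)
    {
        // The q query parameter carries the URL of the image to classify.
        var imageUrl = req.GetQueryNameValuePairs()
                          .FirstOrDefault(p => p.Key == "q").Value;

        byte[] imageBytes;
        using (var http = new HttpClient())
        {
            imageBytes = await http.GetByteArrayAsync(imageUrl);
        }

        // Download the trained model on the first run and cache the imported graph.
        var graph = graphCache.GetOrAdd("inception", _ =>
        {
            var modelFile = DownloadModelIfMissing(@"D:\home\site\wwwroot\");
            var g = new TFGraph();
            g.Import(File.ReadAllBytes(modelFile));
            return g;
        });

        using (var session = new TFSession(graph))
        {
            // The input tensor is rebuilt and normalized for every request.
            var input = NormalizeImage(imageBytes);

            var output = session.GetRunner()
                                .AddInput(graph["input"][0], input)
                                .Fetch(graph["output"][0])
                                .Run();

            var probabilities = ((float[][])output[0].GetValue(jagged: true))[0];
            var best = probabilities.Max();

            return req.CreateResponse(HttpStatusCode.OK,
                $"best label index: {probabilities.ToList().IndexOf(best)}, probability: {best}");
        }
    }

    // Placeholder: fetch the model file to the given directory if it is not present yet.
    private static string DownloadModelIfMissing(string dir) =>
        Path.Combine(dir, "tensorflow_inception_graph.pb");

    // Placeholder: the real implementation decodes, resizes and normalizes the image
    // (the ConstructGraphToNormalizeImage step described above).
    private static TFTensor NormalizeImage(byte[] image) =>
        TFTensor.CreateString(image);
}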

Finally I can run the command:

dotnet publish

which will create the package for the function deployment.

Azure function

To deploy the code I will create an Azure Function (Consumption plan) with an HTTP trigger. Additionally I have to set the function entry point; the function.json will be defined as:
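The exact file is not reproduced here; for a precompiled HTTP-triggered function it might look roughly like this (the assembly and namespace names are placeholders):

{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "authLevel": "anonymous",
      "methods": [ "get" ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ],
  "scriptFile": "..\\bin\\<assembly>.dll",
  "entryPoint": "<Namespace>.TensorflowImageClassification.Run"
}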

Kudu will be used to deploy the already prepared package. Additionally I have to copy libtensorflow.dll from /runtimes/win7-x64/native into the bin directory, next to the function assemblies (otherwise the Azure Function won't load it).

Finally I can test the Azure Function.

The function recognizes the image and returns the label with the highest probability.

Another brick in the … recommendation system – Databricks in action

Brick

Today I'd like to investigate Databricks. I will show how it works and how to prepare a simple recommendation system using a collaborative filtering algorithm, which can help match products to the expectations and preferences of the user. Collaborative filtering is extremely useful when we know the relations (e.g. ratings) between products and users but it is difficult to point out the most significant features.

Databricks

First of all, I have to set up the Databricks service. I can use Microsoft Azure Databricks or Databricks on AWS, but the best way to start is the Community edition.

Data

In this example I use the MovieLens small dataset to create movie recommendations. After unzipping the package I use the ratings.csv file.
On the main page of Databricks I click Upload Data and upload the file.

The file will be located on DBFS (the Databricks File System) under the path /FileStore/tables/ratings.csv. Now I can start model training.

Notebook

The data is ready, so in the next step I can create a Databricks notebook (the new notebook option on the Databricks main page), which is similar to a Jupyter notebook.

Using Databricks I can prepare the recommendations in a few steps:

First of all I read and parse the data; because the data file contains a header row, I additionally have to cut it off. In the next step I split the data into a training part, which will be used to train the model, and a testing part for model evaluation.
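The notebook cells themselves are not included here; a minimal PySpark sketch of these steps, assuming the /FileStore/tables/ratings.csv path above and the Spark MLlib ALS implementation, could look like this:

from pyspark.mllib.recommendation import ALS, Rating

# Read the uploaded file from DBFS and cut off the CSV header row.
raw = sc.textFile("/FileStore/tables/ratings.csv")
header = raw.first()
lines = raw.filter(lambda line: line != header)

# ratings.csv columns: userId, movieId, rating, timestamp
ratings = lines.map(lambda line: line.split(",")) \
               .map(lambda p: Rating(int(p[0]), int(p[1]), float(p[2])))

# Split the data into a training part and a testing part for evaluation.
training, test = ratings.randomSplit([0.8, 0.2], seed=42)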

I can not only predict the rating for each user/product pair but also export the user and product (in this case movie) features. The features are in general meaningless latent factors, but deeper analysis and intuition can give them meaning, e.g. a movie genre. The number of features is defined by the rank parameter of the training method.
The user/product rating is defined as the scalar product of the user and product feature vectors.
This gives us the ability to use them outside Databricks, e.g. in a relational database: prefilter the movies using defined business rules and then order them using the user/product features.
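A sketch of the training call and of reconstructing a prediction from the exported feature vectors; the rank, iterations and regularization values below are only illustrative, not the ones used in the original notebook:

import numpy as np

# Train the collaborative filtering model; rank controls the number of latent features.
model = ALS.train(training, rank=10, iterations=10, lambda_=0.01)

# Predict ratings for the held-out user/product pairs.
predictions = model.predictAll(test.map(lambda r: (r.user, r.product)))

# The exported features: one vector per user and one per product (movie).
user_features = dict(model.userFeatures().collect())
product_features = dict(model.productFeatures().collect())

# A user/product rating is the scalar product of the two feature vectors.
user_id, movie_id = 1, 31
manual_prediction = float(np.dot(user_features[user_id], product_features[movie_id]))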

Finally I have shown how to save the user and product features as JSON and put them into an Azure blob.
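One possible way to do that, assuming the legacy azure-storage Python SDK and placeholder account, container and blob names:

import json
from azure.storage.blob import BlockBlobService

# Serialize the exported feature vectors as JSON.
features_json = json.dumps({
    "users": {str(uid): list(vec) for uid, vec in user_features.items()},
    "movies": {str(pid): list(vec) for pid, vec in product_features.items()},
})

# Upload the document to an Azure blob container.
blob_service = BlockBlobService(account_name="<account>", account_key="<key>")
blob_service.create_blob_from_text("features", "movie-features.json", features_json)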

Hello from a serverless Messenger chatbot

Chatbot

Messenger chatbots are becoming more and more popular. They can help us order pizza, ask about the weather or check the news.

In this article, I would like to show you how to build a simple Messenger chatbot in Python and host it on AWS Lambda. Additionally, I will use the wit.ai service to add natural language understanding functionality and make it more intelligent.

To build the Messenger chatbot I will need a Facebook app and a Facebook page.

Facebook page

The whole communication goes through a Facebook page, so I need to create one:
https://www.facebook.com/bookmarks/pages

I will need the page id, which you can find at the bottom of your page:
https://www.facebook.com/page_name/about

Facebook app

Then I create a Facebook app:
https://developers.facebook.com/quickstarts/?platform=web

Settings

I will copy the App ID and App Secret, which will be needed in the next steps.

Messenger product

Then I will add the Messenger product and set it up.
I need to select the page we already created and copy the generated access token.

Webhook

Finally I have to set up the webhook for the Messenger product.
To finish this step I first need to set up the chatbot on AWS Lambda.
I also have to provide the verify token which will be used to validate the endpoint.

AWS Lambda

Now I will prepare my chatbot endpoint and set it up on AWS Lambda.

Trigger

For my chatbot I need to configure an API Gateway trigger.
I have to set the security to open, otherwise I won't be able to call it from Messenger.

Code

I also need to provide the code which will handle the Messenger webhook and send the response.
I will simply put the code in the online editor.
Let's take a look at the code:
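The original listing is not included here; a minimal sketch of such a handler, assuming an API Gateway proxy integration and the environment variables described in the next section (the v2.6 Graph API endpoint and the echo reply are just examples), might look like this:

import json
import os
import urllib.parse
import urllib.request

VERIFY_TOKEN = os.environ["verify_token"]   # used to validate the webhook
ACCESS_TOKEN = os.environ["access_token"]   # page access token from the Messenger setup

SEND_API_URL = "https://graph.facebook.com/v2.6/me/messages"


def send_message(recipient_id, text):
    # Call the Messenger Send API to answer the user.
    payload = {"recipient": {"id": recipient_id}, "message": {"text": text}}
    url = SEND_API_URL + "?" + urllib.parse.urlencode({"access_token": ACCESS_TOKEN})
    request = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)


def lambda_handler(event, context):
    # GET: Facebook verifies the webhook by echoing back hub.challenge.
    params = event.get("queryStringParameters") or {}
    if params.get("hub.mode") == "subscribe":
        if params.get("hub.verify_token") == VERIFY_TOKEN:
            return {"statusCode": 200, "body": params.get("hub.challenge")}
        return {"statusCode": 403, "body": "wrong verify token"}

    # POST: incoming messages from the Messenger webhook.
    body = json.loads(event.get("body") or "{}")
    for entry in body.get("entry", []):
        for messaging in entry.get("messaging", []):
            if "message" in messaging:
                sender_id = messaging["sender"]["id"]
                text = messaging["message"].get("text", "")
                send_message(sender_id, "You said: " + text)

    return {"statusCode": 200, "body": "ok"}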

Configuration

Below I have to set up the environment variables:
verify_token – the verification token (I use KeePass to generate it) which will be used in the webhook setup
access_token – the page access token generated on the Messenger product setup page
app_secret – the Facebook app secret

Now I’m ready to finish the webhook configuration:

I use the API Gateway URL as the Callback URL and the verify_token I have just generated.

Natural language understanding

Messenger gives an easy way to add natural language understanding functionality. To add it, I simply configure it on the Messenger product setup page.

Here I can choose from already trained models, but I will go further and create a custom model.
Messenger will create a new wit.ai project for me.

On wit.ai I can simply add some intents (like: hungry) and additional information which can be retrieved from the phrase (like: I want some pizza).

The Messenger/wit.ai integration is very smooth. Let's analyze the webhook JSON I get when I type "I want to eat pizza":
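The exact payload is not reproduced here; a trimmed example of what such a message event with the added nlp section can look like (the ids and confidence values are illustrative):

{
  "sender": { "id": "<USER_ID>" },
  "recipient": { "id": "<PAGE_ID>" },
  "message": {
    "text": "I want to eat pizza",
    "nlp": {
      "entities": {
        "intent": [ { "confidence": 0.98, "value": "hungry" } ],
        "dish":   [ { "confidence": 0.95, "value": "pizza" } ]
      }
    }
  }
}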

After the wit.ai integration the nlp object is added to the message. Now I can get the recognized intent with its confidence (like: hungry) and additional entities (like: dish).

Finally I can talk with my chatbot 🙂