TensorFlow meets C# Azure Function

Meet

TensorFlow meets a C# Azure Function and … In this post I would like to show how to deploy a TensorFlow model with a C# Azure Function. I will use TensorFlowSharp, the .NET bindings to the TensorFlow library, and expose an HTTP endpoint that recognizes images with the Inception model.

Code

I will start by creating a .NET Core class library and adding the TensorFlowSharp package:

dotnet new classlib
dotnet add package TensorFlowSharp -v 1.9.0

Then I create the file TensorflowImageClassification.cs; a condensed sketch of it is shown further below.

In this class I define the HTTP entry point for the Azure Function (the Run method). The q query parameter is taken from the URL and used as the URL of the image to be recognized.

The solution analyzes the image using a convolutional neural network based on the Inception architecture.

The function automatically downloads the trained Inception model, so the first run will take a bit longer. The model is saved to D:\home\site\wwwroot\.

The convolutional neural network graph is kept in memory (graphCache), so the function does not have to read the model on every request. On the other hand, the input image tensor has to be prepared and preprocessed for every request (ConstructGraphToNormalizeImage).
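
The original class is not reproduced here, but the sketch below shows how it could look with TensorFlowSharp. The model and label file names, the tensor names ("input", "output") and the exact Run signature follow the TensorFlowSharp Inception example and are assumptions, not the code from this post; model downloading and error handling are omitted.

// Condensed sketch only; file and tensor names follow the TensorFlowSharp Inception example.
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using TensorFlow;

public static class TensorflowImageClassification
{
    // Graph cache: the Inception graph is imported once per host instance and reused.
    private static readonly ConcurrentDictionary<string, TFGraph> graphCache =
        new ConcurrentDictionary<string, TFGraph>();

    private static readonly HttpClient httpClient = new HttpClient();
    private static readonly string home = @"D:\home\site\wwwroot\";

    public static async Task<HttpResponseMessage> Run(HttpRequestMessage req)
    {
        // The "q" query parameter carries the URL of the image to classify.
        var imageUrl = req.RequestUri.Query.TrimStart('?').Split('&')
            .Select(p => p.Split('=')).Where(p => p.Length == 2 && p[0] == "q")
            .Select(p => Uri.UnescapeDataString(p[1])).FirstOrDefault();

        var imageBytes = await httpClient.GetByteArrayAsync(imageUrl);

        var graph = graphCache.GetOrAdd("inception", _ =>
        {
            // The model downloaded on the first run is stored next to the function.
            var g = new TFGraph();
            g.Import(File.ReadAllBytes(Path.Combine(home, "tensorflow_inception_graph.pb")));
            return g;
        });

        var labels = File.ReadAllLines(Path.Combine(home, "imagenet_comp_graph_label_strings.txt"));

        using (var session = new TFSession(graph))
        {
            // The image tensor is prepared and normalized for every request.
            var tensor = ConstructGraphToNormalizeImage(imageBytes);
            var output = session.GetRunner()
                .AddInput(graph["input"][0], tensor)
                .Fetch(graph["output"][0])
                .Run();

            var probabilities = ((float[][])output[0].GetValue(jagged: true))[0];
            var best = Array.IndexOf(probabilities, probabilities.Max());

            return new HttpResponseMessage(HttpStatusCode.OK)
            {
                Content = new StringContent(labels[best])
            };
        }
    }

    private static TFTensor ConstructGraphToNormalizeImage(byte[] contents)
    {
        // Decode, resize to 224x224 and normalize, as in the TensorFlowSharp Inception example.
        using (var g = new TFGraph())
        {
            var input = g.Placeholder(TFDataType.String);
            var output = g.Div(
                x: g.Sub(
                    x: g.ResizeBilinear(
                        images: g.ExpandDims(
                            input: g.Cast(g.DecodeJpeg(contents: input, channels: 3), DstT: TFDataType.Float),
                            dim: g.Const(0, "make_batch")),
                        size: g.Const(new int[] { 224, 224 }, "size")),
                    y: g.Const(117f, "mean")),
                y: g.Const(1f, "scale"));

            using (var session = new TFSession(g))
            {
                var normalized = session.Run(
                    inputs: new[] { input },
                    inputValues: new[] { TFTensor.CreateString(contents) },
                    outputs: new[] { output });
                return normalized[0];
            }
        }
    }
}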

Finally I can run the command:

dotnet publish

which will create the package for the function deployment.

Azure function

To deploy the code I will create an Azure Function (Consumption plan) with an HTTP trigger. Additionally I have to set the function entry point in function.json, which will look roughly like this:
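
A minimal sketch, assuming the compiled assembly is called TensorflowFunction.dll and the Run method lives in the TensorflowImageClassification class (both names are placeholders for the real project names):

{
  "scriptFile": "..\\bin\\TensorflowFunction.dll",
  "entryPoint": "TensorflowImageClassification.Run",
  "disabled": false,
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "authLevel": "function",
      "methods": [ "get" ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}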

Kudu will be used to deploy the already prepared package. Additionally I have to deploy libtensorflow.dll from /runtimes/win7-x64/native (otherwise the Azure Function won’t load it). The bin directory should then look roughly like this:
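
As an illustration only (the function assembly name depends on the project):

bin/
  libtensorflow.dll            <- copied from runtimes/win7-x64/native
  TensorFlowSharp.dll
  TensorflowFunction.dll       <- the compiled class library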

Finally I can test the Azure Function:
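
For example (the host name, function name and key are placeholders):

curl "https://<function-app>.azurewebsites.net/api/TensorflowImageClassification?code=<function-key>&q=https://example.com/cat.jpg"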

The function recognizes the image and returns the label with the highest probability.

Hello from a serverless Messenger chatbot

Chatbot

Messenger chatbots are becoming more and more popular. They can help us order pizzas, ask about the weather or check the news.

In this article, I would like to show you how to build a simple Messenger chatbot in Python and run it on AWS Lambda. Additionally, I will use the wit.ai service to add natural language understanding and make the bot more intelligent.

To build the Messenger chatbot I will need a Facebook app and a Facebook page.

Facebook page

The whole communication goes through a Facebook page, so I need to create one:
https://www.facebook.com/bookmarks/pages

I will need the page id, which you can find at the bottom of your page:
https://www.facebook.com/page_name/about

Facebook app

Then I create a Facebook app:
https://developers.facebook.com/quickstarts/?platform=web

Settings

From the app settings I copy the App ID and App Secret, which will be needed in the next steps.

Messenger product

Then I will add the Messenger product and set it up.
I need to select the page we already created and copy the generated access token.


Webhook

Finally I have to set up the webhook for the Messenger product.
To finish this step I need to set up the chatbot on AWS Lambda first.
I also have to provide a verify token, which will be used to validate the endpoint.

AWS Lambda

Now I will prepare my chatbot endpoint and set it up on AWS Lambda.

Trigger

For my chatbot I need to configure an API Gateway trigger.
I have to set the security to open, otherwise Messenger won’t be able to call it.


Code

I also need to provide the code which handles the Messenger webhook and sends responses.
I will simply put the code in the online editor.
Let’s take a look at the code:
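
The original handler is not reproduced here; below is a minimal sketch of such a Lambda function, assuming an API Gateway proxy integration. The helper name and the echo reply are illustrative, not the exact code from this post.

# Minimal sketch of the Messenger webhook handler (API Gateway proxy integration assumed).
import json
import os
import urllib.request

VERIFY_TOKEN = os.environ["verify_token"]   # token entered in the webhook setup
ACCESS_TOKEN = os.environ["access_token"]   # page access token from the Messenger product page
APP_SECRET = os.environ["app_secret"]       # app secret, used to validate request signatures (not shown)

FB_MESSAGES_URL = "https://graph.facebook.com/v2.6/me/messages"


def send_message(recipient_id, text):
    """Send a text message back to the user through the Send API."""
    payload = {
        "recipient": {"id": recipient_id},
        "message": {"text": text},
    }
    request = urllib.request.Request(
        f"{FB_MESSAGES_URL}?access_token={ACCESS_TOKEN}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)


def lambda_handler(event, context):
    # GET: Facebook verifies the webhook by echoing back hub.challenge.
    if event["httpMethod"] == "GET":
        params = event.get("queryStringParameters") or {}
        if params.get("hub.verify_token") == VERIFY_TOKEN:
            return {"statusCode": 200, "body": params.get("hub.challenge", "")}
        return {"statusCode": 403, "body": "verification failed"}

    # POST: incoming messages from the Messenger webhook.
    body = json.loads(event["body"])
    for entry in body.get("entry", []):
        for messaging in entry.get("messaging", []):
            message = messaging.get("message")
            if message and "text" in message:
                send_message(messaging["sender"]["id"], f"You said: {message['text']}")

    return {"statusCode": 200, "body": "ok"}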

Configuration

Below I have to set up the environment variables:
verify_token – the verification token (I use KeePass to generate it) which we will use in the webhook setup
access_token – the page access token generated on the Messenger product setup page
app_secret – the Facebook app secret

Now I’m ready to finish the webhook configuration.

I use the API Gateway URL as the Callback URL together with the verify_token I have just generated.

Natural language understanding

Messenger gives an easy way to add natural language understanding. To enable it I simply configure it on the Messenger product setup page.

Here I can choose from already trained models, but I will go further and create a custom model.
Messenger will create a new wit.ai project for me.

In wit.ai I can simply add some intents (like: hungry) and additional information which can be extracted from a phrase (like: I want some pizza).

The Messenger/wit.ai integration is very smooth. Let’s analyze the webhook JSON I get when I type “I want to eat pizza”.

After the wit.ai integration an nlp object is added to the message. Now I can get the recognized intent with its confidence (like: hungry) and additional entities (like: dish).
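
A small sketch of how the handler could read this object; the exact keys inside nlp.entities depend on how the wit.ai model is defined, so the names below are only examples:

def extract_nlp(message):
    """Pull the best candidate for every wit.ai entity out of the nlp object."""
    results = {}
    for name, candidates in message.get("nlp", {}).get("entities", {}).items():
        # wit.ai returns a list of candidates per entity, each with a confidence score.
        best = max(candidates, key=lambda c: c["confidence"])
        results[name] = (best.get("value"), best["confidence"])
    return results

# used inside the webhook loop, e.g.:
# entities = extract_nlp(messaging["message"])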

Finally I can talk with my chatbot 🙂