Category Archives: Azure Functions

Add HttpClientFactory to an Azure Function

Since I first wrote about dependency injection in Azure Functions, things have moved on a bit. These days, Azure Functions natively* support DI. In this post, I’ll cover probably the most common DI scenario: adding HttpClientFactory to your project.

I’ll assume that you have an Azure function, and that it looks something like this:

    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("C# HTTP trigger function processed a request.");

            string name = req.Query["name"];

            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            dynamic data = JsonConvert.DeserializeObject(requestBody);
            name = name ?? data?.name;

            return name != null
                ? (ActionResult)new OkObjectResult($"Hello, {name}")
                : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
        }
    }

I’ll assume that, because that’s the default HTTP trigger function you get when you create a new function.

NuGet Packages

You’ll need a few NuGet packages; first, you’ll need:

Install-Package Microsoft.Extensions.Http

Which will allow you to use the HttpClientFactory. You’ll also need some packages for the DI:

Install-Package Microsoft.Azure.Functions.Extensions
Install-Package Microsoft.NET.Sdk.Functions

Startup

If you create a new MVC project, you get a Startup class, which manages all your DI, etc. So we’re going to create one here. Create a Startup.cs class in the function app:

    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services.AddHttpClient();
        }
    }

The Configure method is a member of FunctionsStartup (ctrl-. to add the override). You’ll also need to add the following line outside of the namespace:

[assembly: FunctionsStartup(typeof(FullNamespace.Startup))]

Essentially, FullNamespace here refers to the fully qualified Startup class in your project. Without this line, nothing will be added to the IoC container.

The AddHttpClient call inside Configure adds HttpClientFactory to your IoC container.

Azure Function

If you have a look at the code above, you’ll notice the class is static. We can’t have constructor injection into a static class (because we can’t have an instance constructor); let’s change that to an instance class, add a field for the factory, and inject it through the constructor:

    public class Function1
    {
        private readonly IHttpClientFactory _httpClientFactory;

        public Function1(IHttpClientFactory httpClientFactory)
        {
            _httpClientFactory = httpClientFactory;
        }

You’ll also need to change the function itself from static to instance:

        public async Task<IActionResult> Run(

That’s it – you can now reference HttpClientFactory from inside your function.
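For illustration, using the factory inside the function body might look something like this (the URL here is purely a placeholder, not part of the original example):

    HttpClient client = _httpClientFactory.CreateClient();

    // Call whatever API you actually need – this endpoint is hypothetical.
    HttpResponseMessage response = await client.GetAsync("https://example.com/api/values");
    string content = await response.Content.ReadAsStringAsync();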

Notes

* Maybe not exactly ‘natively’

References

https://docs.microsoft.com/en-us/azure/azure-functions/functions-dotnet-dependency-injection

Calling an Azure SignalR Instance from an Azure Function

I’ve been playing around with the Azure SignalR Service. I’m particularly interested in how you can bind to this from an Azure function. Imagine the following scenario:

You’re sat there on your web application, and I press a button on my console application, and you suddenly get notified. It’s actually remarkably easy to set up (although there are definitely a few little things that can trip you up – many thanks to Anthony Chu for his help with some of those!)

If you want to see the code for this, it’s here.

Create an Azure Signalr Service

Let’s start by setting up an Azure SignalR Service:

You’ll need to configure a few things:

The pricing tier is your call but, obviously, free is less money than … well, not free! The region should be wherever you plan to deploy your function / app service to (although I won’t actually deploy either of those in this post), and the ServiceMode should be Serverless.

Once you’ve created that, make a note of the connection string (accessed from Keys).

Create a Web App

Follow this post to create a basic web application. You’ll need to change Startup.cs as follows:

        public void ConfigureServices(IServiceCollection services)
        {
            services.AddSignalR().AddAzureSignalR();
        }

        // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }

            //app.UseDefaultFiles();
            //app.UseStaticFiles();

            app.UseFileServer();
            app.UseRouting();
            app.UseAuthorization();

            app.UseEndpoints(routes =>
            {
                routes.MapHub<InfoRelay>("/InfoRelay");
            });
        }
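The InfoRelay class that MapHub references isn’t shown above; because the Azure function pushes messages through the SignalR service rather than through hub methods, a minimal hub is enough. A sketch (the empty body is deliberate):

    using Microsoft.AspNetCore.SignalR;

    // Minimal hub: the function publishes via the SignalR service binding,
    // so no hub methods are needed for this scenario.
    public class InfoRelay : Hub
    {
    }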

Next, we’ll need to change index.html:

<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8" />
    <title></title>
    
    <script src="lib/@microsoft/signalr/dist/browser/signalr.js"></script>
    <script src="getmessages.js" type="text/javascript"></script>
    <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.4.1/css/bootstrap.min.css">

</head>
<body>
    <div class="container">
        <div class="row">
            <div class="col-2">
                <h1><span class="label label-default">Message</span></h1>
            </div>
            <div class="col-4">
                <h1><span id="messageInput" class="label label-default"></span></h1>
            </div>
        </div>
        <div class="row">&nbsp;</div>
    </div>
    <div class="row">
        <div class="col-12">
            <hr />
        </div>
    </div>
</body>

</html>

The signalr package that’s referenced is an npm package:

npm install @microsoft/signalr

Next, we need the getmessages.js:

function bindConnectionMessage(connection) {
    var messageCallback = function (name, message) {
        if (!message) return;

        console.log("message received:" + message.Value);

        const msg = document.getElementById("messageInput");
        msg.textContent = message.Value;
    };
    // Create a function that the hub can call to broadcast messages.
    connection.on('broadcastMessage', messageCallback);
    connection.on('echo', messageCallback);
    connection.on('receive', messageCallback);
}

function onConnected(connection) {
    console.log("onConnected called");
}

var connection = new signalR.HubConnectionBuilder()
    .withUrl('/InfoRelay')
    .withAutomaticReconnect()
    .configureLogging(signalR.LogLevel.Debug)
    .build();

bindConnectionMessage(connection);
connection.start()
    .then(function () {
        onConnected(connection);
    })
    .catch(function (error) {
        console.error(error.message);
    });

The automatic reconnect and logging are optional (although, at least while you’re developing, I would strongly recommend the logging).

Functions App

Oddly, this is the simplest of all:

    public static class Function1
    {       
        [FunctionName("messages")]
        public static Task SendMessage(
            [HttpTrigger(AuthorizationLevel.Anonymous, "post")] object message,
            [SignalR(HubName = "InfoRelay")] IAsyncCollector<SignalRMessage> signalRMessages)
        {
            return signalRMessages.AddAsync(
                new SignalRMessage
                {
                    Target = "broadcastMessage",
                    Arguments = new[] { "test", message }
                });
        }
    }

The big thing here is the binding – the SignalRMessage binding allows the function to send the message to the hub (specified in HubName). Also, pay attention to the Target – this needs to match up with the event that the JS code is listening for (in this case: “broadcastMessage”).
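One more thing that can trip you up: the SignalR binding needs to know where your SignalR service is. By default, it reads a setting named AzureSignalRConnectionString, so the function app’s local.settings.json should contain something like this (the endpoint and key are placeholders – use the connection string you noted earlier):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "AzureSignalRConnectionString": "Endpoint=https://<your-service>.service.signalr.net;AccessKey=<your-key>;Version=1.0;"
  }
}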

Console App

Finally, we can send the initial message to set the whole chain off – the console app code looks like this:

        static async Task Main(string[] args)
        {
            Console.WriteLine("Press enter to send a message");
            Console.ReadLine();

            HttpClient client = new HttpClient();
            string url = "http://localhost:7071/api/messages";
            
            HttpContent content = new StringContent("{'Value': 'Hello'}", Encoding.UTF8, "application/json");

            HttpResponseMessage response = await client.PostAsync(url, content);
            string results = await response.Content.ReadAsStringAsync();

            Console.WriteLine($"results: {results}");
            Console.ReadLine();
        }

So, all we’re doing here is invoking the function.

Now when you run this (remember that you’ll need to run all three projects), press enter in the console app, and you should see the “Hello” message pop up on the web app.

References

https://docs.microsoft.com/en-us/aspnet/core/signalr/javascript-client?view=aspnetcore-3.1

https://docs.microsoft.com/en-us/aspnet/core/signalr/dotnet-client?view=aspnetcore-3.1&tabs=visual-studio

https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-signalr-service?tabs=csharp

Creating a Car Game in React – Part 6 – Adding High Scores

This is the sixth post of a series that starts here.

As with previous posts, if you wish to download the code, it’s here; and, as with previous posts, I won’t cover all the code changes here, so if you’re interested, then you should download the code.

In this post, we’re going to create a High Score table. We’ll create an Azure function as the server, and we’ll store the scores themselves in Azure Tables.

Let’s start with the table.

Create a new storage account in Azure, then add an Azure Table to it:

You’ll see a sign trying to persuade you to use Cosmos DB here. At the time of writing, using Cosmos was considerably more expensive than Table Storage. Obviously, you get increased throughput, distributed storage, etc with Cosmos. For this, we don’t need any of that.

Create a new table:

An Azure table is, in fact, a NoSQL offering: you have a key, and then an attribute – the attribute can be a JSON document, or whatever you choose. In our case, we’ll set the key as the user name, and the score as the attribute.

Once you’ve created your table storage, you may wish to use the Storage Explorer to create the tables, although that isn’t necessary.

Finally, you’ll need to add a CORS rule:

Obviously, this should actually point to the domain that you’re using, rather than a blanket ‘allow’, but it’ll do for testing.

Adding a username

Before we can store a high score, the user needs a username. Let’s add one first.

In game status, we’ll add a text box:

<div style={containerStyle}>
	<input type='text' value={props.Username}
	onChange={props.onChangeUsername} />
</div>

The state is raised to the main Game.jsx:

<GameStatus Lives={this.state.playerLives} 
	Message={this.state.message} 
	Score={this.state.score} 
	RemainingTime={this.state.remainingTime}
	Level={this.state.level}
	Username={this.state.username} 
	onChangeUsername={this.onChangeUsername.bind(this)} 
/>

And onChangeUsername is here:

onChangeUsername(e) {
	this.updateUserName(e.target.value);
}

updateUserName(newUserName) {
	this.setState({
		username: newUserName
	});
}

Update High Score

We’ll create an Azure Function to update the table. In Visual Studio, create a new Azure Functions App (you will need to install the Azure workload if you haven’t already):

You’ll be asked what the trigger should be for the function: we’ll go with HttpTrigger. This allows us to call our function whenever we please (rather than the function being, say, scheduled). Next, we’ll need to install a NuGet package into our project to let us use the Azure Storage client:

Install-Package WindowsAzure.Storage

We need some access details from Azure:

Creating the Functions

We’re actually going to need two functions: update and retrieve (we won’t be using the retrieve in this post, but we’ll create it anyway). Let’s start with a helper method:

    public static class StorageAccountHelper
    {
        public static CloudStorageAccount Connect()
        {
            string accountName = Environment.GetEnvironmentVariable("StorageAccountName");
            string accountKey = Environment.GetEnvironmentVariable("StorageAccountKey");

            var storageAccount = new CloudStorageAccount(
                new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(
                    accountName, accountKey), true);
            return storageAccount;
        }
    }

For testing purposes, add the account name and key into the local.settings.json:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "StorageAccountName": "pcmtest2",
    "StorageAccountKey": "C05h2SJNQOXE9xYRObGP5sMi2owfDy7EkaouClfeOSKRdijyTQPh1PIJgHS//kOJPK+Nl9v/9BlH4rleJ4UJ7A=="
  }
}

The values here are taken from above – where we copied the access keys from Azure (whilst these keys are genuine keys, they will have been changed by the time the post is published – so don’t get any ideas!).

First, let’s create a function to add a new high score:

        [FunctionName("AddHighScores")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("C# HTTP trigger function processed a request.");

            var newScore = new HighScore(req.Query["name"], int.Parse(req.Query["score"]));            

            var storageAccount = StorageAccountHelper.Connect();

            CloudTableClient client = storageAccount.CreateCloudTableClient();
            var table = client.GetTableReference("HighScore");

            await table.ExecuteAsync(TableOperation.InsertOrReplace(newScore));

            return new OkResult();
        }

If you’ve seen the default example of this function, it’s actually not that different: it’s a POST method; we take the name and score parameters from the query string, build up a record, and add the score. The function isn’t perfect: any conflicting names will result in an overwritten score, but this is a copy of a Spectrum game – so maybe that’s authentic!
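The HighScore entity itself isn’t shown in this post; given that the name acts as the key, a minimal sketch might look like this (the single-partition design is an assumption):

    public class HighScore : TableEntity
    {
        // Parameterless constructor – the storage client needs it for deserialisation.
        public HighScore() { }

        public HighScore(string name, int score)
        {
            PartitionKey = "HighScore"; // one partition is fine for a table this size
            RowKey = name;              // the username is the key, as described above
            Score = score;
        }

        public int Score { get; set; }
    }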

The second function is to read them:

        [FunctionName("GetHighScores")]
        public static async Task<IList<HighScore>> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("C# HTTP trigger function processed a request.");

            var storageAccount = StorageAccountHelper.Connect();

            CloudTableClient client = storageAccount.CreateCloudTableClient();
            var table = client.GetTableReference("HighScore");
            var tq = new TableQuery<HighScore>();
            var continuationToken = new TableContinuationToken();
            var result = await table.ExecuteQuerySegmentedAsync(tq, continuationToken);
            
            return result.Results;
        }

All we’re really doing here is reading whatever’s in the table. This might not scale hugely well but, again, for testing it’s fine. The one thing to note here is ExecuteQuerySegmentedAsync: there seems to be very little documentation around on it, and what there is seems to refer to ExecuteQueryAsync (which, as far as I can tell, doesn’t – or at least no longer – exists).
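Should the table ever grow beyond a single segment, you’d keep calling with the returned continuation token; a sketch of that (unnecessary for a small test table):

    var scores = new List<HighScore>();
    TableContinuationToken token = null;
    do
    {
        var segment = await table.ExecuteQuerySegmentedAsync(new TableQuery<HighScore>(), token);
        scores.AddRange(segment.Results);
        token = segment.ContinuationToken; // null once there are no more segments
    } while (token != null);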

Let’s run the Azure function locally and see what happens:

As you can see, Azure helpfully gives us some endpoints that we can use for testing. If you don’t have a copy already, then download Postman. Here you can create a request that calls the function.

I won’t go into the exact details of how Postman works, but the requests might look something like this:

http://localhost:7071/api/AddHighScores?name=test2&score=19
http://localhost:7071/api/GetHighScores?10

To prove to yourself that they are actually working, have a look in the table.

There is now an online Storage Explorer in the Azure Portal. Details of the desktop version can be found in this post.

Update High Score from the Application

Starting with adding the high score, let’s call the method to add the high score when the player dies (as that’s the only time we know what the final score is):

playerDies() { 
    this.setState({
        playerLives: this.state.playerLives - 1,
        gameLoopActive: false
    });

    if (this.state.playerLives <= 0) {
        this.updateHighScore();
        this.initiateNewGame();
    } else {
        this.startLevel(this.state.level);
    }

    this.repositionPlayer();
    this.setState({ 
        playerCrashed: false,
        gameLoopActive: true
    });
}

The updateHighScore function looks like this:

updateHighScore() {
	fetch('http://localhost:7071/api/AddHighScores?name=' + this.state.username + '&score=' + this.state.score, {
		method: 'POST'
	}); 
}

Note (obviously) that here I’m updating using my locally running instance of the Azure Function.

And that’s it – we now have a score updating when the player dies. Next we need to display the high scores – that’ll be the next post.

References

https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch

https://facebook.github.io/react-native/docs/network

What can you do with a logic app? Part Two – Use Excel to Manage an E-mail Notification System

In this post I started a series of posts covering different scenarios in which you might use an Azure Logic App, and how you might go about that. In this, the second post, we’re going to set up an Excel spreadsheet that allows you to simply add a row to an Excel table and have a logic app act on that row.

So, we’ll set up a basic spreadsheet with an e-mail address, subject, text, and a date we want it to send; then we’ll have the logic app send the next eligible mail in the list, and mark it as sent.

Spreadsheet

I’ll first state that I do not have an Office 365 subscription, and nothing that I do here will require one. We’ll create the spreadsheet in Office Online. Head over to OneDrive (if you don’t have a OneDrive account, then they are free) and create a new spreadsheet:

In the spreadsheet, create a new table – just enter some headers (like below) and then highlight the columns and “Insert Table”:

Remember to check “My Table Has Headers”.

Now enter some data:

Create the Logic App

In this post I showed how you can use Visual Studio to create and deploy a logic app; we’ll do that here:

Once we’ve created the logic app, we’ll need to select to create an action that will get the Excel file that we created; in this case “List rows present in a table”:

This also requires that we specify the table (if you’re using the free online version of Excel then you’ll have to live with the table name you’re given):

Loop

This retrieves a list of rows, and so the next step is to iterate through them one-by-one. We’ll use a For-Each:

Conditions

Okay, so we’re now looking at every row in the table, but we don’t want every row in the table, we only want the ones that have not already been sent, and the ones that are due to be sent (so the date is either today, or earlier). We can use a conditional statement for this:

But we have two problems:

  • Azure Logic Apps are very bad at handling dates – that is to say, they don’t
  • There is currently no way in an Azure Logic App to update an Excel spreadsheet row (you can add and delete only)

The former is easily solved, and the way I elected to solve the latter is to delete the row instead of updating it. It is possible to delete the current row and add it back with new values; however, we won’t bother with that here.

Back to the date problem; what we need here is an Azure function…

Creating an Azure Function

Here is the code for our function (see here for details of how to create one):

        [FunctionName("DatesCompare")]
        public static IActionResult Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequest req, TraceWriter log)
        {
            log.Info("C# HTTP trigger function processed a request.");

            string requestBody = new StreamReader(req.Body).ReadToEnd();
            return ParseDates(requestBody);

        }

        public static IActionResult ParseDates(string requestBody)
        {
            dynamic data = JsonConvert.DeserializeObject(requestBody);

            DateTime date1 = (DateTime)data.date1;
            DateTime date2 = DateTime.FromOADate((double)data.date2);

            int returnFlagIndicator = 0;
            if (date1 > date2)
            {
                returnFlagIndicator = 1;
            }
            else if (date1 < date2)
            {
                returnFlagIndicator = -1;
            }

            return (ActionResult)new OkObjectResult(new
            {
                returnFlag = returnFlagIndicator
            });
        }

There are a few points to note about this code:
1. The date coming from Excel extracts as a double, which is why we need to use FromOADate.
2. The reason for splitting the function up is so that the main logic can be more easily unit tested. If you ever need a reason for unit testing, then try to work out why an Azure function isn’t working inside a logic app! (A sample request and response are shown below.)
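To illustrate the shape of the data (the values here are made up), the request body that the logic app builds might look like this, with date2 carrying the raw OADate double from Excel:

{
  "date1": "2018-08-01T00:00:00",
  "date2": 43313.0
}

OADate 43313 is 1st August 2018, so the two dates are equal here, and the function would return:

{
  "returnFlag": 0
}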

The logic around this function looks like this:

We build up the request body with the information that we have, and then parse the output. Finally, we can check if the date is in the past and then send the e-mail:

Lastly, as we said earlier, we’ll delete the row to ensure that the e-mail is only sent once:

The eagle eyed and sane amongst you will notice that I’ve used the subject as a key. Don’t do this – it’s very bad practice!

References

https://github.com/Azure/azure-functions-host/wiki/Azure-Functions-runtime-2.0-known-issues

What can you do with a logic app? Part One – Send tweets at random intervals based on a defined data set

I thought I’d start another of my patented series. This one is about finding interesting things that can be done with Azure Logic Apps.

Let’s say, for example, that you have something that you want to say; for example, if you were Richard Dawkins or Ricky Gervais, you might want to repeatedly tell everyone that there is no God; or if you were Google, you might want to tell everyone how .Net runs on your platform; or if you were Microsoft, you might want to tell people how it’s a “Different Microsoft” these days.

The thing that I want to repeatedly tell everyone is that I’ve written some blog posts. For this purpose, I’m going to set up a logic app that, based on a random interval, sends a tweet from my account (https://twitter.com/paul_michaels), informing people of one of my posts. It will get this information from a simple Azure storage table; let’s start there: first, we’ll need a storage account:

Then a table:

We’ll enter some data using Storage Explorer:

I entered a few records (three in this case – the train journey would need to be across Russia or something for me to fill my entire back catalogue in manually; I might come back and see about automatically scraping this data from WordPress one day).

In order to create our logic app, we need a singular piece of custom logic. As you might expect, there’s no randomised action or result, so we’ll have to create that as a function:

For Logic App integration, a Generic WebHook seems to work best:

Here’s the code:

#r "Newtonsoft.Json"
using System;
using System.Net;
using Newtonsoft.Json;
static Random _rnd;

public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info($"Webhook was triggered!");
    if (_rnd == null) _rnd = new Random();
    string rangeStr = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "range", true) == 0)
        .Value;
    int range = int.Parse(rangeStr);
    int num = _rnd.Next(range) + 1; // random number from 1 to range inclusive
    var response = req.CreateResponse(HttpStatusCode.OK, new
    {
        number = num
    });
    return response;
}
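Once the function is in place, you can test it directly; a request to the function’s URL with, say, ?range=10 appended should return something like:

{ "number": 3 }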

Okay – back to the logic app. Let’s create one:

The logic app that we want will be (currently) a recurrence; let’s start with every hour (if you’re following along then you might need to adjust this while testing – be careful, though, as it will send a tweet every second if you tell it to):

Add the function:

Add the input variables (remember that the parameters read by the function above are passed in via the query):

One thing to realise about working with Azure functions from a logic app is that everything relies heavily on passing JSON around. For this purpose, you’ll use the JSON Parser action a lot. My advice would be to name them sensibly, and not “Parse JSON” and “Parse JSON 2” as I’ve done here:

The JSON Parser action requires a schema – that’s how it knows what your data looks like. You can get the schema by selecting the option to use a sample payload, and just grabbing the output from above (when you tested the function – if you didn’t test the function then you obviously trust me far more than you should and, as a reward, you can simply copy the output from below):

That will then generate a schema for you:

Note: if you get the schema wrong then the run report will error, but it will give you a dump of the JSON that it had – so another approach would be to enter anything and then take the actual JSON from the logs.

Next, we’ll add a condition based on the output. Now that we’ve parsed the JSON, “number” (or the output from the previous step) is available:

So, we’ll check if the number is 1 – meaning there’s a 1 in 10 chance that the condition will be true. We don’t care if it’s false, but if it’s true then we’ll send a tweet. Before we do that, though, we need to go to the data table and find out what to send. Inside the “true” branch, we’ll add an “Azure Table Storage – Get Entities” call:

This asks you for a storage connection (the name is just for you to name the connection to the storage account). Typically, after getting this data, you would use a for-each to run through the entries. Because there is currently no way to count the entries in the table, we’ll iterate through each entry, but we’ll do it slowly, and we’ll use our random function to ensure that not all of them are sent.

Let’s start with not sending all items:

All the subsequent logic is inside the true branch. The next thing is to work out how long to delay:

Now that we have a number between 1 and 60, we can wait for that length of time:

The next step is to send the tweet, but because we need specific data from the table, it’s back to our old friend: Parse JSON (it looks like every Workflow will contain around 50% of these parse tasks – although, obviously, you could bake this sort of thing into a function).

To get the data for the tweet, we’ll need to parse the JSON for the current item:

Once you’ve done this, you’ll have access to the parts of the record and can add the Tweet action:

And we have a successful run… and some tweets:

Reading Azure Service Bus Queue Names from the Config File

In this post, I wrote about how you might read a message from the service bus queue. However, with Azure Functions (and WebJobs) comes the ability to have Microsoft do some of this plumbing code for you.

I have a queue here (taken from the service bus explorer):

I can read this in an Azure function; let’s create a new Azure Functions App:

This time, we’ll create a Service Bus Queue Triggered function:

Out of the box, that will give you this:

public static class Function1
{
    [FunctionName("Function1")]
    public static void Run([ServiceBusTrigger("testqueue", AccessRights.Listen, Connection = "")]string myQueueItem, TraceWriter log)
    {
        log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    }
}

There are a few things that we’ll probably want to change here. The first is “Connection”. We can remove that parameter altogether, and then add a row to the local.settings.json file (which can be overridden later inside Azure). Out of the box, you get AzureWebJobsStorage and AzureWebJobsDashboard, which both accept a connection string to an Azure Storage Account. You can also add AzureWebJobsServiceBus, which accepts a connection string to the service bus:

"Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=teststorage1…",
    "AzureWebJobsDashboard": "DefaultEndpointsProtocol=https;AccountName=teststorage1…",
    "AzureWebJobsServiceBus": "Endpoint=sb://pcm-servicebustest.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=…"
  }

If you run the job, it will now pick up any outstanding entries in that queue. But what if you don’t know the queue name at compile time – for example, what if the queue name turns out to be different? To illustrate the point: here, I’m looking for “testqueue1”, but the queue name (as you saw earlier) is “testqueue”:

public static class Function1
{
    [FunctionName("Function1")]
    public static void Run([ServiceBusTrigger("testqueue1", AccessRights.Listen)]string myQueueItem, TraceWriter log)
    {
        log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    }
}

Obviously, if you’re looking for a queue that doesn’t exist, bad things happen:

To fix this, I have to change the code… which is, broadly speaking, a bad thing. What we can do is configure the queue name in the config file, like this:

"Values": {
    "AzureWebJobsStorage": " . . . ",
    . . .,
    "queue-name":  "testqueue"
  }

And we can have the function look in the config file by changing the queue name:

[FunctionName("Function1")]
public static void Run([ServiceBusTrigger("%queue-name%", AccessRights.Listen)]string myQueueItem, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
}

The pattern of supplying a variable name in the format “%variable-name%” seems to work across other triggers and bindings for Azure Functions.
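For example, a blob trigger can resolve its container name the same way (a sketch – the setting name container-name is an assumption):

[FunctionName("BlobFunction")]
public static void Run([BlobTrigger("%container-name%/{name}")]Stream myBlob, string name, TraceWriter log)
{
    log.Info($"Blob trigger function processed blob: {name}");
}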

Deployment

That’s now looking much better, but what happens when the function gets deployed? Let’s see:

We can now see that the function is deployed:

At the minute, it won’t do anything, because it’s looking for a queue name in a setting that only exists locally. Let’s fix that:

Remember to save the changes.

Looking at the logs confirms that this now runs correctly.

Creating a Scheduled Azure Function

I’ve previously written about creating Azure functions. I’ve also written about how to react to service bus queues. In this post, I wanted to cover creating a scheduled function. Basically, these allow you to create a scheduled task that executes at a given interval, or at a given time.

Timer Trigger

The first thing to do is create a function with a type of Timer Trigger:

Schedule / CRON format

The next thing is to understand the schedule, or CRON, format. The format is:

{second} {minute} {hour} {day} {month} {day-of-week}

Scheduled Intervals

The example you’ll see when you create this looks like this:

0 */5 * * * *

The notation */[number] means once every [number]; so */5 would mean once every 5… and you then look at the placeholder to work out 5 what – in this case, it means once every 5 minutes. So, for example:

*/10 * * * * *

Would be once every 10 seconds.

Scheduled Times

Specifying numbers means the schedule will execute at that time; so:

0 0 0 * * *

Would execute every time the hour, minute and second all hit zero – so once per day at midnight; and:

0 * * * * *

Would execute every time the second hits zero – so once per minute; and:

0 0 * * * 1

Would execute once per hour on a Monday (as the last placeholder is the day of the week).

Time constraints

These can be specified in any column in the format [lower bound]-[upper bound], and they restrict the timer to a range of values; for example:

0 */20 5-10 * * *

Means every 20 minutes between 5 and 10am (as you can see, the different types can be used in conjunction).

Asterisks (*)

You’ll notice above that there are asterisks in every placeholder where a value has not been specified. The asterisk signifies that the schedule will execute at every interval within that placeholder; for example:

* * * * * *

Means every second; and:

0 * * * * *

Means every minute.

Back to the function

Upon starting, the function will detail when the next several executions will take place:

But what if you don’t know what the schedule will be at compile time? As with many of the variables in an Azure Function, you can simply substitute the value for a placeholder:

[FunctionName("MyFunc")]
public static void Run([TimerTrigger("%schedule%")]TimerInfo myTimer, TraceWriter log)
{
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
}

This value can then be provided inside the local.settings.json:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProto . . .",
    "AzureWebJobsDashboard": "DefaultEndpointsProto . . .",
    "schedule": "0 * * * * *"
  }
}

References

https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer

http://cronexpressiondescriptor.azurewebsites.net/?expression=1+*+*+*+*+*&locale=en

Using Unity With Azure Functions

Azure Functions are a relatively new kid on the block when it comes to the Microsoft Azure stack. I’ve previously written about them, and their limitations. One such limitation seems to be that they don’t lend themselves very well to using dependency injection. However, it is certainly not impossible to make them do so.

In this particular post, we’ll have a look at how you might use an IoC container (Unity in this case) in order to leverage DI inside an Azure function.

New Azure Functions Project

I’ve covered this before in previous posts; in Visual Studio, you can now create a new Azure Functions project:

That done, you should have a project that looks something like this:

As you can see, the elephant in the room here is that there are no functions; let’s correct that:

Be sure to call your function something descriptive… like “Function1”. For the purposes of this post, it doesn’t matter what kind of function you create, but I’m going to create a “Generic Web Hook”.

Install Unity

The next step is to install Unity (this was the latest version at the time of writing):

Install-Package Unity -Version 5.5.6

Static Variables Inside Functions

It’s worth bearing in mind that a static variable works the way you would expect, were the function a locally hosted process. That is, if you write a function such as this:

// Static state, declared on the containing static class; it is shared across
// invocations for as long as the host stays alive.
private static int test = 0;

[FunctionName("Function1")]
public static object Run([HttpTrigger(WebHookType = "genericJson")]HttpRequestMessage req, TraceWriter log)
{
    log.Info($"Webhook was triggered!");

    System.Threading.Thread.Sleep(10000);
    log.Info($"Index is {test}");
    return req.CreateResponse(HttpStatusCode.OK, new
    {
        greeting = $"Hello {test++}!"
    });
}

And access it from a web browser, or Postman, or both at the same time, you’ll get incrementing numbers:

Whilst the values are shared across the instances, you can’t cause a conflict by updating something in one function while reading it in another (I tried pretty hard to cause this to break). What this means, then, is that we can store an IoC container that will maintain state across function calls. Obviously, this is not intended for persisting state, so you should assume your state could be lost at any time (as indeed it can).

Registering the Unity Container

One method of doing this is to use the Lazy object. This pretty much passed me by in .Net 4 (which is, apparently, when it came out). It basically provides a slightly neater way of doing this kind of thing:

private List<string> _myList;
public List<string> MyList
{
    get
    {
        if (_myList == null)
        {
            _myList = new List<string>();
        }
        return _myList;
    }
}

The “lazy” method would be:

public Lazy<List<string>> MyList = new Lazy<List<string>>(() =>
{
    List<string> newList = new List<string>();
    return newList;
});

With that in mind, we can do something like this:

public static class Function1
{
     private static Lazy<IUnityContainer> _container =
         new Lazy<IUnityContainer>(() =>
         {
             IUnityContainer container = InitialiseUnityContainer();
             return container;
         });

InitialiseUnityContainer needs to return a new instance of the container:

public static IUnityContainer InitialiseUnityContainer()
{
    UnityContainer container = new UnityContainer();
    container.RegisterType<IMyClass1, MyClass1>();
    container.RegisterType<IMyClass2, MyClass2>();
    return container;
}

After that, you’ll need to resolve the parent dependency, then you can use standard constructor injection; for example, if MyClass1 orchestrates your functionality, you could use:

_container.Value.Resolve<IMyClass1>().DoStuff();

In Practice

Let’s apply all of that to our Functions App. Here are two new classes, along with their interfaces:

public interface IMyClass1
{
    string GetOutput();
}
 
public interface IMyClass2
{
    void AddString(List<string> strings);
}
public class MyClass1 : IMyClass1
{
    private readonly IMyClass2 _myClass2;
 
    public MyClass1(IMyClass2 myClass2)
    {
        _myClass2 = myClass2;
    }
 
    public string GetOutput()
    {
        List<string> teststrings = new List<string>();
 
        for (int i = 0; i <= 10; i++)
        {
            _myClass2.AddString(teststrings);
        }
 
        return string.Join(",", teststrings);
    }
}
public class MyClass2 : IMyClass2
{
    public void AddString(List<string> strings)
    {
        Thread.Sleep(1000);
        strings.Add($"{DateTime.Now}");
    }
}

And the calling code looks like this:

[FunctionName("Function1")]
public static object Run([HttpTrigger(WebHookType = "genericJson")]HttpRequestMessage req, TraceWriter log)
{
    log.Info($"Webhook was triggered!");
 
    string output = _container.Value.Resolve<IMyClass1>().GetOutput();
    return req.CreateResponse(HttpStatusCode.OK, new
    {
        output
    });
}

Running it, we get an output that we might expect:

References

https://github.com/Azure/azure-webjobs-sdk/issues/1206

Working with Multiple Cloud Providers – Part 3 – Linking Azure and GCP

This is the third and final post in a short series on linking up Azure with GCP (for Christmas). In the first post, I set-up a basic Azure function that updated some data in table storage, and then in the second post, I configured the GCP link from PubSub into BigQuery.

In this post, we’ll square this off by adapting the Azure function to post a message directly to PubSub; then, we’ll call the Azure function with Santa’s data, and watch it appear in BigQuery. At least, that was my plan – but Microsoft had other ideas.

It turns out that Azure functions have a dependency on Newtonsoft Json 9.0.1, and the GCP client libraries require 10+. So instead of being a 10 minute job on Boxing day to link the two, it turned into a mammoth task. Obviously, I spent the first few hours searching for a way around this – surely other people have faced this, and there’s a redirect, setting, or way of banging the keyboard that makes it work? Turns out not.

The next idea was to experiment with contacting the Google server directly, as is described here. Unfortunately, you still need the Auth libraries.

Finally, I swapped out the function for a WebJob. WebJobs give you a little more flexibility, and have no hard dependencies. So, on with the show (albeit a little more involved than expected).

WebJob

In this post I described how to create a basic WebJob. Here, we’re going to do something similar. In our case, we’re going to listen for an Azure Service Bus Message, and then update the Azure Storage table (as described in the previous post), and call out to GCP to publish a message to PubSub.

Handling a Service Bus Message

We weren’t originally going to take this approach, but I found that WebJobs play much nicer with a Service Bus message than with trying to get them to fire on a specific endpoint. In terms of scalability, adding a queue in the middle can only be a good thing. We’ll square off the contactable endpoint at the end with a function that simply converts a call to the endpoint into a message on the queue. Here’s what the WebJob Program looks like:

public static void ProcessQueueMessage(
    [ServiceBusTrigger("localsantaqueue")] string message,
    TextWriter log,
    [Table("Delivery")] ICollector<TableItem> outputTable)
{
    Console.WriteLine("test");
 
    log.WriteLine(message);
 
    // parse query parameter
    TableItem item = Newtonsoft.Json.JsonConvert.DeserializeObject<TableItem>(message);
    if (string.IsNullOrWhiteSpace(item.PartitionKey)) item.PartitionKey = item.childName.First().ToString();
    if (string.IsNullOrWhiteSpace(item.RowKey)) item.RowKey = item.childName;
 
    outputTable.Add(item);
 
    GCPHelper.AddMessageToPubSub(item).GetAwaiter().GetResult();
    
    log.WriteLine("DeliveryComplete Finished");
 
}

Effectively, this is the same logic as the function (obviously, we now have the GCPHelper, and we’ll come to that in a minute). First, here’s the code for the TableItem model:

[JsonObject(MemberSerialization.OptIn)]
public class TableItem : TableEntity
{
    [JsonProperty]
    public string childName { get; set; }
 
    [JsonProperty]
    public string present { get; set; }
}

As you can see, we need to decorate the members with specific serialisation instructions, the reason being that this model is used by both GCP (which only needs what you see on the screen) and Azure (which needs the inherited properties).
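With MemberSerialization.OptIn, the JSON that gets published to PubSub contains only the two decorated members; for example (made-up values):

{ "childName": "Jane", "present": "Bike" }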

GCPHelper

As described here, you’ll need to install the client package for GCP into the WebJob project that we’ve just created:

Install-Package Google.Cloud.PubSub.V1 -Pre

Here’s the helper code that I mentioned:

public static class GCPHelper
{
    public static async Task AddMessageToPubSub(TableItem toSend)
    {
        string jsonMsg = Newtonsoft.Json.JsonConvert.SerializeObject(toSend);
        
        Environment.SetEnvironmentVariable(
            "GOOGLE_APPLICATION_CREDENTIALS",
            Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Test-Project-8d8d83hs4hd.json"));
        GrpcEnvironment.SetLogger(new ConsoleLogger());

        string projectId = "test-project-123456";
        TopicName topicName = new TopicName(projectId, "test");
        SimplePublisher simplePublisher = 
            await SimplePublisher.CreateAsync(topicName);
        string messageId = 
            await simplePublisher.PublishAsync(jsonMsg);
        await simplePublisher.ShutdownAsync(TimeSpan.FromSeconds(15));
    }
 
}

I detailed in this post how to create a credentials file; you’ll need to do that to allow the WebJob to be authorised. The JSON file referenced above was created using that process.

Azure Config

You’ll need to create an Azure message queue (I’ve called mine localsantaqueue):

I would also download the Service Bus Explorer (I’ll be using it later for testing).

GCP Config

We already have a DataFlow, a PubSub Topic and a BigQuery Database, so GCP should require no further configuration, except to ensure the permissions are correct.

The Service Account user (which I give more details of here) needs to have PubSub permissions. For now, we’ll make them an editor although, in this instance, they probably only need publish:

Test

We can do a quick test using the Service Bus Explorer and publish a message to the queue:

The ultimate test is that we can then see this in the BigQuery Table:

Lastly, the Function

This won’t be a completely function-free post. The last step is to create a function that adds a message to the queue:

[FunctionName("Function1")]
public static HttpResponseMessage Run(
    [HttpTrigger(AuthorizationLevel.Function, "post")]HttpRequestMessage req,             
    TraceWriter log,
    [ServiceBus("localsantaqueue")] ICollector<string> queue)
{
    log.Info("C# HTTP trigger function processed a request.");
    var parameters = req.GetQueryNameValuePairs();
    string childName = parameters.First(a => a.Key == "childName").Value;
    string present = parameters.First(a => a.Key == "present").Value;
    string json = $"{{ 'childName': '{childName}', 'present': '{present}' }}";
    queue.Add(json);
    

    return req.CreateResponse(HttpStatusCode.OK);
}

So now we have an endpoint for our imaginary Xamarin app to call into.
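For example, a request like this (the values are hypothetical) would drop a message onto the queue:

POST http://localhost:7071/api/Function1?childName=Jane&present=Bike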

Summary

Both GCP and Azure are relatively immature platforms for this kind of interaction. The GCP client libraries seem to be missing functionality (and GCP is still heavily weighted away from .Net). The Azure libraries (especially functions) seem to be in a pickle, too – with strange dependencies that make it very difficult to communicate outside of Azure. As a result, this task (which should have taken an hour or so) took a great deal of time, and it was completely unnecessary.

Having said that, it is clearly possible to link the two systems, if a little long-winded.

References

https://blog.falafel.com/rest-google-cloud-pubsub-with-oauth/

https://github.com/Azure/azure-functions-vs-build-sdk/issues/107

https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus

https://stackoverflow.com/questions/48092003/adding-to-a-queue-using-an-azure-function-in-c-sharp/48092276#48092276

Working with Multiple Cloud Providers – Part 1 – Azure Function

Regular readers (if there are such things for this blog) may have noticed that I’ve recently been writing a lot about two main cloud providers. I won’t link to all the articles but, if you’re interested, a quick search for either Azure or Google Cloud Platform will yield several results.

Since it’s Christmas, I thought I’d do something a bit different and try to combine them. This isn’t completely frivolous; both have advantages and disadvantages: GCP is very geared towards big data, whereas the Azure Service Fabric provides a lot of functionality that might fit well with a much smaller LOB app.

So, what if we had the following scenario:

Santa has to deliver presents to every child in the world in one night. Santa is only one man*, and Google tells me there are 1.9B children in the world, so he contracts out a series of delivery drivers. There need to be around 79M deliveries every hour; let’s assume that each delivery driver can work 24 hours**. If each driver can make, say, 100 deliveries per hour, that means we need around 790,000 drivers. Every delivery driver has an app that links to their depot, recording deliveries, schedules, etc.

That would be a good app to write in, say, Xamarin, and maybe have an Azure service running it; here’s the obligatory box diagram:

The service might talk to the service bus, might control stock, send e-mails – all kinds of LOB jobs. Now, I’m not saying for a second that Azure can’t cope with this, but what if we suddenly want all of these instances to feed metrics into a single data store? There are 190*** countries in the world; if each has a depot, then there are ~416K messages / hour going into each Azure service, but there are 79M / hour going into a single DB. Because it’s Christmas, let’s assume that Azure can’t cope with this; or let’s say that GCP is a little cheaper at this scale; or that we have some Hadoop jobs that we’d like to run on the data. In theory, we can link these systems, which might look something like this:

So, we have multiple instances of the Azure architecture, and they all feed into a single GCP service.

Disclaimer

At no point during this post will I attempt to publish 79M records / hour to GCP BigQuery. Neither will any Xamarin code be written or demonstrated – you have to use your imagination for that bit.

Proof of Concept

Given the disclaimer I’ve just made, calling this a proof of concept seems a little disingenuous; but let’s imagine that we know that the volumes aren’t a problem and concentrate on how to link these together.

Azure Service

Let’s start with the Azure Service. We’ll create an Azure function that accepts a HTTP message, updates a DB and then posts a message to Google PubSub.

Storage

For the purpose of this post, let’s store our individual instance data in Azure Table Storage. I might come back at a later date and work out how and whether it would make sense to use CosmosDB instead.

We’ll set-up a new table called Delivery:

Azure Function

Now we have somewhere to store the data, let’s create an Azure Function App that updates it. In this example, we’ll create a new Function App from VS:

In order to test this locally, change local.settings.json to point to your storage location described above.

And here’s the code to update the table:

    public static class DeliveryComplete
    {
        [FunctionName("DeliveryComplete")]
        public static HttpResponseMessage Run(
            [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)]HttpRequestMessage req, 
            TraceWriter log,            
            [Table("Delivery", Connection = "santa_azure_table_storage")] ICollector<TableItem> outputTable)
        {
            log.Info("C# HTTP trigger function processed a request.");
 
            // parse query parameter
            string childName = req.GetQueryNameValuePairs()
                .FirstOrDefault(q => string.Compare(q.Key, "childName", true) == 0)
                .Value;
 
            string present = req.GetQueryNameValuePairs()
                .FirstOrDefault(q => string.Compare(q.Key, "present", true) == 0)
                .Value;            
 
            var item = new TableItem()
            {
                childName = childName,
                present = present,                
                RowKey = childName,
                PartitionKey = childName.First().ToString()                
            };
 
            outputTable.Add(item);            
 
            return req.CreateResponse(HttpStatusCode.OK);
        }
 
        public class TableItem : TableEntity
        {
            public string childName { get; set; }
            public string present { get; set; }
        }
    }

Testing

There are two ways to test this; the first is to just press F5: that will launch the function as a local service, and you can use Postman or similar to test it; the alternative is to deploy to the cloud. If you choose the latter, then your local.settings.json will not come with you, so you’ll need to add an app setting:

Remember to save this setting, otherwise, you’ll get an error saying that it can’t find your setting, and you won’t be able to work out why – ask me how I know!

Now, if you run a test …
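(For example, a POST to http://localhost:7071/api/DeliveryComplete?childName=Jane&present=Bike – the values are hypothetical.)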

You should be able to see your table updated (shown here using Storage Explorer):

Summary

We now have a working Azure function that updates a storage table with some basic information. In the next post, we’ll create a GCP service that pipes all this information into BigQuery, and then link the two systems.

Footnotes

* Remember, all the guys in Santa suits are just helpers.
** That brandy you leave out really hits the spot!
*** I just Googled this – it seems a bit low to me, too.

References

https://docs.microsoft.com/en-us/azure/azure-functions/functions-how-to-use-azure-function-app-settings#manage-app-service-settings

https://anthonychu.ca/post/azure-functions-update-delete-table-storage/

https://stackoverflow.com/questions/44961482/how-to-specify-output-bindings-of-azure-function-from-visual-studio-2017-preview