
Transmitting an Image via Azure Service Bus

This post is a continuation of this earlier post around serialising an image. Once you're able to serialise an image, you can transmit it. In this post, we'll see how we could do that using Azure Service Bus.

A Recap on Image Serialisation

In the previous (linked) post, we saw how we can turn an image into a string of text, and back again. Essentially, what we did was use a BinaryWriter to write the file as a binary stream, and a BinaryReader to turn it back to an image.
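By way of a refresher, the round trip looks something like this (a minimal sketch, separate from the image-specific code later on):

```csharp
// Minimal BinaryWriter / BinaryReader round trip
using var ms = new MemoryStream();
using (var writer = new BinaryWriter(ms, Encoding.UTF8, leaveOpen: true))
{
    writer.Write("hello"); // writes a length-prefixed string
}

ms.Position = 0;
using var reader = new BinaryReader(ms);
string roundTripped = reader.ReadString(); // "hello"
```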

The plan here is to do exactly the same thing, but to simply write the stream to an Azure Service Bus Message, and read from it at the other side.

Size Matters

One thing that you're going to need to be aware of here is the size of the image. The basic and standard tiers of Service Bus limit you to 256 KB per message. Serialising an image to a stream can produce less than that, but unless you know otherwise, you should assume it will be bigger. Even after you sign up for the much more expensive premium tier, when you set up the topic or queue, you'll need to specify a maximum message size. Images can be very big!
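If you'd rather fail fast than let the send blow up, you can check the payload size before transmitting. This is just a sketch – the path, and the assumption that you're working to the 256 KB limit, are illustrative:

```csharp
const int maxMessageBytes = 256 * 1024; // illustrative: the basic/standard tier limit

string payload = SerialiseImageToBinary(@"c:\tmp\test.jpg"); // path is illustrative
int size = Encoding.UTF8.GetByteCount(payload);
if (size > maxMessageBytes)
    throw new InvalidOperationException(
        $"Payload is {size} bytes; the configured limit is {maxMessageBytes}.");
```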


To be clear: the purpose of this post is to demonstrate that you can transmit an image via Service Bus – not that you should. There are other ways to do this: for example, you could upload the image to blob storage, or an S3 bucket, and then just point to that in the message. The one advantage transmitting the image with the message does give you, is that the binary data lives and dies with the message itself – so depending on what you do with the message after receipt, that may be an advantage.
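For completeness, that "pointer in the message" (claim-check) approach might look something like the following sketch. The container name, blob name, and paths are all illustrative, and it assumes the Azure.Storage.Blobs package:

```csharp
// Claim-check sketch: store the image in blob storage, send only its URI.
async Task SendImagePointer(string storageConnectionString, string serviceBusConnectionString)
{
    var container = new BlobContainerClient(storageConnectionString, "images");
    await container.CreateIfNotExistsAsync();

    var blob = container.GetBlobClient("test.jpg");
    await blob.UploadAsync(@"c:\tmp\test.jpg", overwrite: true);

    // The message now carries a pointer, not the binary data itself
    await using var client = new ServiceBusClient(serviceBusConnectionString);
    var sender = client.CreateSender("image-message");
    await sender.SendMessageAsync(new ServiceBusMessage(blob.Uri.ToString()));
}
```

The trade-off, as noted above, is that the blob now has a lifetime independent of the message.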

Transmitting the Message

The following is a simple helper method that will send the message for us:

async Task SendMessage(string connectionString, string topicName, string messageText)
{
    // Count bytes with the same encoding used for the message body below
    int byteCount = Encoding.UTF8.GetByteCount(messageText);

    await using var serviceBusClient = new ServiceBusClient(connectionString);
    var sender = serviceBusClient.CreateSender(topicName);
    var message = new ServiceBusMessage(Encoding.UTF8.GetBytes(messageText));

    await sender.SendMessageAsync(message);
}

byteCount tells you how big the message is going to be. That's useful when you're writing a blog post about such things, but you may also find it useful for debugging.

The rest is pretty basic Azure Service Bus SDK stuff. I’ve written about this previously in more depth.

We can then combine this with the code from the last post:

        var serialisedToSend = SerialiseImageToBinary(@"c:\tmp\test.jpg"); // path is illustrative
        await SendMessage(connectionString, "image-message", serialisedToSend);

The next step is to receive the message at the other side.

Consuming the Message

As with the send, we have a helper method here:

async Task<string> ConsumeNextMessage(string topic)
{
    // connectionString is assumed to be in scope (e.g. a captured local or field)
    await using var serviceBusClient = new ServiceBusClient(connectionString);
    var receiver = serviceBusClient.CreateReceiver(topic, "sub-1");
    var message = await receiver.ReceiveMessageAsync();
    return message.Body.ToString();
}

Again, I’ve written about receiving a message using the SDK before.

Here, we do the reverse of the send:

        var serialisedToReceive = await ConsumeNextMessage("image-message");
        DeserialiseImageFromBinary(
            serialisedToReceive,
            $@"c:\tmp2\image-{DateTime.Now:yyyy-MM-dd-HH-mm-ss}.jpg"); // output path is illustrative


I’ve used the date so that I could test this multiple times – other than that, we’re receiving the serialised image, and then writing it to disk.


I’ll re-iterate the sentiment that I’m not advocating this as a good or preferable way to transmit images, simply highlighting that it is possible with relatively little work.

Serialising and De-serialising Images

For another project that I'm working on, I needed to transfer an image as text. In fact, that's quite an easy thing to do in .NET: the way to do it is with a BinaryWriter and BinaryReader.

Binary Serialisation Helpers

The first step is to create a helper to serialise or de-serialise anything to or from binary. We’ll start with Serialise:

string SerialiseToBinary<TData>(TData data)
{
    string serialised = JsonSerializer.Serialize(data);

    using var stream = new MemoryStream();
    using var binary = new BinaryWriter(stream);
    binary.Write(serialised);
    binary.Flush();

    stream.Position = 0;
    return Convert.ToBase64String(stream.ToArray());
}

We start by simply calling the JSON serialise method – that way, anything that's passed in is now a string. Then we need a stream to write to – in fact, you could write directly to a file, but here we're just using a memory stream. Finally, we convert to Base64. (Conversion to Base64 isn't strictly necessary for this, but if you want to send the binary data anywhere afterwards then it may be.)
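As a quick illustration of the Base64 step (the byte values here are arbitrary):

```csharp
byte[] raw = { 0x00, 0xFF, 0x10 };
string base64 = Convert.ToBase64String(raw);     // "AP8Q" - safe to embed in text
byte[] back = Convert.FromBase64String(base64);  // the original three bytes again
```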

Next is the Deserialise method:

TData? DeserialiseFromBinary<TData>(string data)
{
    byte[] b = Convert.FromBase64String(data);
    using var stream = new MemoryStream(b);
    using var br = new BinaryReader(stream);

    stream.Seek(0, SeekOrigin.Begin);
    var resultString = br.ReadString();
    var result = JsonSerializer.Deserialize<TData>(resultString);
    return result;
}

Here we’re converting from Base 64 – which obviously is necessary if you’ve encoded with it on the other side. Then we have the same in reverse – using the binary reader to read the memory stream, and then de-serialise the result.

Writing and Reading from Files

Now that we can write and read binary data, we can add a couple of other helpers to do this from a file – for this experiment, I’m simply copying an image:

string SerialiseImageToBinary(string path)
{
    var bytes = File.ReadAllBytes(path);
    var serialised = SerialiseToBinary(bytes);
    return serialised;
}

void DeserialiseImageFromBinary(string data, string outPath)
{
    var deserialised = DeserialiseFromBinary<byte[]>(data);
    using var fs = new FileStream(outPath, FileMode.CreateNew);
    fs.Write(deserialised!, 0, deserialised!.Length);
}

Probably not worth going into too much detail on these methods: they simply read and write from files, calling our other helper methods. I then call them like this:

        var serialised = SerialiseImageToBinary(@"c:\tmp\test.jpg");      // source path is illustrative
        DeserialiseImageFromBinary(serialised, @"c:\tmp\test-copy.jpg");  // as is the destination


I’ll be coming back to these methods in future posts around transmitting this information over Azure Service Bus.

Insert a Stream into SQL

I have written a couple of articles around this, relating to transmitting large files over WCF and enabling FILESTREAM in SQL. This article deals with actually inserting one of those large files into the DB and retrieving it back out again.

The following method does not use FILESTREAM; that requires a slightly different syntax.

The Database

If you have a look at the linked articles, you’ll already have seen how the data that I’m dealing with is arranged; however, here’s a create statement for the table; just in case you want to try this:

CREATE TABLE [dbo].[BinaryDataTest](
	[ROWGUID] [uniqueidentifier] ROWGUIDCOL NOT NULL,
	[DataName] [nchar](10) NOT NULL,
	[Data] [varbinary](max) FILESTREAM NULL,
	[Data2] [varbinary](max) NULL
)

For completeness, my DB is called TestDB.

You’ll notice that `Data` uses FILESTREAM. However, I won’t cover that in this post.

The Service

Here’s an example of how you would write the insert statement in your service (the same method should work whether or not a service is used):

        public void InsertData(Stream value)
        {
            string connectionString = ConfigurationManager.ConnectionStrings["TestDB"].ConnectionString;
            using (SqlConnection cn = new SqlConnection(connectionString))
            using (SqlCommand cmd = cn.CreateCommand())
            {
                cmd.CommandText = "INSERT INTO [dbo].[BinaryDataTest] (" +
                    "[ROWGUID],[DataName],[Data2] ) " +
                    "VALUES (NEWID(), 'test', @DataVarBinary)";
                cmd.CommandType = System.Data.CommandType.Text;

                // Copy the incoming stream into memory so it can be wrapped in SqlBytes
                MemoryStream newStream = new MemoryStream();
                value.CopyTo(newStream);

                SqlParameter sqlParameterBin = new
                    SqlParameter("@DataVarBinary", SqlDbType.VarBinary);
                sqlParameterBin.Value = new SqlBytes(newStream);
                cmd.Parameters.Add(sqlParameterBin);

                cn.Open();
                cmd.ExecuteNonQuery();
            }
        }


As you can see, I have a connection string called “TestDB”; other than that, I think the only remarkable thing (that is: thing worthy of remark – not astounding) is the SqlParameter set-up. Use the VarBinary SQL type, and the ADO.NET SQL function SqlBytes(), and you’re good to go.
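As an aside – and this is an alternative I'm sketching, not what the service above does – if you already hold the data as a byte array, you can bind it directly; a size of -1 denotes varbinary(max):

```csharp
// fileBytes is assumed to hold the file's contents, e.g. from File.ReadAllBytes
cmd.Parameters.Add("@DataVarBinary", SqlDbType.VarBinary, -1).Value = fileBytes;
```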

Next, there’s the data retrieval:

        public Stream GetData(string dataName)
        {
            string connectionString = ConfigurationManager.ConnectionStrings["TestDB"].ConnectionString;
            using (SqlConnection cn = new SqlConnection(connectionString))
            using (SqlCommand cmd = cn.CreateCommand())
            {
                cmd.CommandText = "SELECT [ROWGUID],[DataName],[Data],[Data2]" +
                    " FROM [dbo].[BinaryDataTest]" +
                    " WHERE DataName = @DataName";
                cmd.CommandType = System.Data.CommandType.Text;
                cmd.Parameters.AddWithValue("@DataName", dataName);

                cn.Open();
                using (var rdr = cmd.ExecuteReader())
                {
                    while (rdr.Read())
                    {
                        // Copy into memory so the stream survives the reader being disposed
                        Stream str = rdr.GetStream(rdr.GetOrdinal("Data2"));
                        var result = new MemoryStream();
                        str.CopyTo(result);
                        result.Position = 0;
                        return result;
                    }
                }
            }
            throw new Exception("Invalid data");
        }

A familiar looking idea. You'll see that I'm only returning `Data2`, as stated earlier, and am using the SqlDataReader.GetStream() function.
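For genuinely large rows, it's worth knowing that CommandBehavior.SequentialAccess lets GetStream() stream the column rather than buffering the whole row first. A sketch, reusing the command from above, with an illustrative output path:

```csharp
// SequentialAccess streams large columns instead of loading the full row
using (var rdr = cmd.ExecuteReader(System.Data.CommandBehavior.SequentialAccess))
{
    if (rdr.Read())
    {
        using (var data = rdr.GetStream(rdr.GetOrdinal("Data2")))
        using (var file = File.Create(@"c:\tmp2\out.bin")) // path is illustrative
        {
            data.CopyTo(file);
        }
    }
}
```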

The Client

I’m deliberately missing out the configuration that enables you to send these files, and which is documented here.

Here’s the Main() function of a client console app:

        static void Main(string[] args)
        {
            ServiceReference1.Service1Client svc = new ServiceReference1.Service1Client();
            Stream stream = File.OpenRead(@"c:\tmp\test.bmp");
            svc.InsertData(stream);

            Stream strDest = File.OpenWrite(@"c:\tmp2\testdestination.bmp");
            Stream str2 = svc.GetData("test");
            str2.CopyTo(strDest);
            strDest.Flush();
        }



So, we’re reading a file from c:\tmp into a stream, and sending that, via WCF into the SQL DB. Then, we’re reading that back out of the SQL DB, and sending it back over to the client. The client then writes this out to a file.

I fully intend to cover how this differs in a FILESTREAM column in a later post.

Setting up SQL Server to use the FILESTREAM feature

Whilst playing about with this feature of SQL Server, I encountered the error below. This post should lead you around the error. It does not make any claims as to whether using the FILESTREAM feature is a good, or bad idea.

The error:

Msg 1969, Level 16, State 1, Line 14
Default FILESTREAM filegroup is not available in database ‘TestDB’.

The table create statement that caused this was:

CREATE TABLE [dbo].[BinaryDataTest2](
	[DataName] [nchar](10) NOT NULL,
	[Data] [varbinary](max) FILESTREAM NULL,
	[Data2] [nchar](10) NULL
)

I have to be honest and say that I did initially try to create this through the UI designer; this thread put me straight on that.

So, the next stage was to create a filegroup with the FILESTREAM enabled, but if you do that now, you’ll (likely) get the following error:

Msg 5591, Level 16, State 3, Line 1
FILESTREAM feature is disabled.

This is a property of the SQL Server instance, not the DB:
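If you'd rather do this in T-SQL than through the UI, the equivalent (access level 2 enables both T-SQL and Win32 streaming access) is:

```sql
EXEC sp_configure 'filestream access level', 2;
RECONFIGURE;
```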



Next, run SQL Configuration Manager and enable FILESTREAM here as well.


(found here on MSDN)

Finally, add a file group with FILESTREAM enabled:

    NAME= 'filestream',  
    FILENAME = 'C:\db\fs' 
TO FILEGROUP fs_fg_filestream  

Obviously, replace “C:\db\fs” with an actual location on your hard-drive.

The next error I got was:

A table with FILESTREAM column(s) must have a non-NULL unique ROWGUID column.

Okay, so you need to add a non-NULL, unique ROWGUID column:

CREATE TABLE [dbo].[BinaryDataTest](
	[ROWGUID] [uniqueidentifier] ROWGUIDCOL UNIQUE NOT NULL DEFAULT NEWID(),
	[DataName] [nchar](10) NOT NULL,
	[Data] [varbinary](max) FILESTREAM NULL,
	[Data2] [nchar](10) NULL
)

Finally, insert some data into your table:

INSERT INTO BinaryDataTest(ROWGUID, DataName, Data, Data2)
VALUES (NEWID()
      , 'test'
      , CONVERT(varbinary, 'test')
      , NULL)

If you have a look at your (equivalent of) “c:\db”, you’ll see exactly what the effect of this was:


Basically, you are now storing the data from the DB in the file system. I re-iterate, I make no claims that this is a good or bad thing, just that it is a thing.

Sending Binary Files Over WCF

This is an interesting one – it is possible to load a binary file (such as an exe) in a .NET service and return it to the client via WCF. It’s actually not that complex either; in this example, I’ve created a basic service and console application.


The code for the service is really straightforward:

    public class Service1 : IService1
    {
        public Stream GetData(int value)
        {
            FileStream stream = File.OpenRead(@"c:\tmp\MyFile");
            return stream;
        }
    }

And here is the Interface:

    [ServiceContract]
    public interface IService1
    {
        [OperationContract]
        Stream GetData(int value);
    }

The important part about the above code is that the method must return a Stream, and can only accept a single parameter (you can send large data by reversing this).
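For reference, the "reverse" mentioned above – streaming data up to the service – would look something like this hypothetical contract (the name is mine, not from the project):

```csharp
[ServiceContract]
public interface IFileUpload // hypothetical contract for the upload direction
{
    // A single Stream parameter is what allows the request itself to be streamed
    [OperationContract]
    void SendData(Stream value);
}
```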

The client code is a little more involved. The Main() function of the console app is here:

        static void Main(string[] args)
        {
            Task rd = ReceiveData();
            rd.Wait(); // block so the console app doesn't exit mid-transfer
        }

And ReceiveData() looks like this:

        private static async Task ReceiveData()
        {
            ServiceReference1.Service1Client sc = new ServiceReference1.Service1Client();
            Stream stream = await sc.GetDataAsync(0); // the int parameter is unused by the service

            using (var fs = File.Create(@"c:\tmp2\newfile.exe"))
            {
                int b;
                do
                {
                    b = stream.ReadByte();
                    if (b != -1) fs.WriteByte((byte)b);
                } while (b != -1);
            }
        }

As you can see, although the code is a bit messy, it’s not complex, and the bulk of the code is actually turning the Stream into a file (I’m sure there’s an easier way of doing this).
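There is indeed an easier way: Stream.CopyTo (or CopyToAsync) does the byte shuffling for you:

```csharp
using (var fs = File.Create(@"c:\tmp2\newfile.exe"))
{
    await stream.CopyToAsync(fs);
}
```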


The config files are the key here. The encoding and transfer mode need to be changed.

The service web.config:

        <binding name="BasicStreaming" messageEncoding="Mtom" transferMode="Streamed" />


Create an endpoint for the service and bind it to above:

      <service name="FileService.Service1">
        <endpoint address="Service1.svc" binding="basicHttpBinding"
          bindingConfiguration="BasicStreaming" contract="FileService.IService1" />
      </service>

So, the transfer mode is streamed, and the encoding is ‘Mtom’. Make sure that the endpoint is configured against the binding (otherwise it’ll use the default binding, repeatedly moan that it’s out of memory and mismatched for no apparent reason and you’ll spend ages wondering why).

The client config can be updated using the “Update Service Reference” option; however, double check what it adds.

The App.config changes from the client:

                <binding name="BasicStreaming" closeTimeout="10:00:00"
                    maxReceivedMessageSize="400000000" messageEncoding="Mtom"
                    transferMode="Streamed" />
            <endpoint address="http://localhost:17065/Service1.svc/Service1.svc"
                binding="basicHttpBinding" bindingConfiguration="BasicStreaming"
                contract="ServiceReference1.IService1" name="BasicHttpBinding_IService1" />

The only real thing of note in the client is the `maxReceivedMessageSize`; this has to be big enough to take your largest file. Obviously, the timeout settings matter too, but if a transfer is long enough to hit that closeTimeout, then you've probably got bigger problems.

The End

That’s it – it all works neatly, and looks deceptively easy!