Category Archives: .Net

Beginner’s Guide to Docker – Part 4 – Viewing Docker Logs

In this post (part of a series on docker basics) I’ll be discussing how you can view logs for running docker containers.

As usual with my posts, there’s very little here that you can’t get from the docs – although I’d like to think these posts are slightly more accessible (especially for myself).

Viewing Logs

In fact, the process of viewing logs is remarkably easy; once you’ve launched the container, you’ll need its ID:

docker ps

Once you have the container ID, you can feed it into the docker container logs command:

docker container logs --follow dcd204c97421

The --follow argument keeps a session open and updates it with any new log entries (leave it out for a static view of the logs).
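A couple of other flags from the standard docker CLI are worth knowing about; a quick sketch (re-using the container ID from above):

```shell
# Stream the logs, but start from only the last 50 lines
docker container logs --follow --tail 50 dcd204c97421

# Show only entries from the last ten minutes, with timestamps
docker container logs --since 10m --timestamps dcd204c97421
```

The --tail option is particularly useful for chatty containers, where the full log history can take a while to scroll past.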

The Docker Desktop app offers the same functionality out of the box, along with a nice search and formatting facility.

Writing Logs

Obviously, this varies from language to language, so I’ll stick to .Net here. It’s actually very straightforward:

builder.Services.AddLogging();

You can now log in the same way as you would for any other .Net application; simply inject an ILogger into the constructor:

        private readonly ILogger<MyService> _logger;

        public MyService(
            ILogger<MyService> logger)
        {
            _logger = logger;
        }

And then log as you would normally:

_logger.LogWarning("something happened");
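The key point for docker is that a container’s logs are simply whatever the process writes to stdout and stderr; the default .Net console logger already does this, but a minimal sketch of wiring it up explicitly (MyService here is just a placeholder name) might look something like:

```csharp
var builder = WebApplication.CreateBuilder(args);

// docker container logs reads the container's stdout/stderr,
// so make sure a console logger is registered (the default
// builder does this already; shown here for clarity)
builder.Logging.ClearProviders();
builder.Logging.AddConsole();

builder.Services.AddTransient<MyService>();
```

Anything logged via the injected ILogger will then show up in docker container logs.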

References

Constructor Injection in .Net

Docker Container Logs

Running Hangfire with .Net 6 and Lite Storage

One of the things that I’ve been looking into recently is Hangfire. I’ve used this in the past, but not for a while, so I wanted to get familiar with it again. I imagine there’ll be a couple more posts on the way, but in this one, I’m going to detail how to get Hangfire running using .Net 6 and LiteDB storage. I did initially intend to create this with a SQLite DB; however, when I did, I found that the Hangfire messages were being sent multiple times. After some searching, I came across this GitHub issue, which suggested the Hangfire LiteDB storage.

In this particular article, we’ll look specifically at running Hangfire with an Asp.Net MVC project, starting from the default .Net 6 MVC template.

The first step here is that you’ll need some additional references; here’s the csproj:

<Project Sdk="Microsoft.NET.Sdk.Web">

	<PropertyGroup>
		<TargetFramework>net6.0</TargetFramework>
		<Nullable>enable</Nullable>
		<ImplicitUsings>enable</ImplicitUsings>
	</PropertyGroup>

	<ItemGroup>
		<PackageReference Include="Hangfire.AspNetCore" Version="1.7.27" />
		<PackageReference Include="Hangfire.Core" Version="1.7.27" />
		<PackageReference Include="Hangfire.LiteDB" Version="0.4.1" />
	</ItemGroup>

</Project>

In addition to the Hangfire Core and AspNetCore projects, there’s the LiteDB package.

The next step is to register Hangfire in Program.cs:

using Hangfire;
using Hangfire.LiteDB;

var builder = WebApplication.CreateBuilder(args);

// Add services to the container.
builder.Services.AddControllersWithViews();

// HF
builder.Services.AddHangfire(configuration =>
{
    configuration.UseLiteDbStorage("./hf.db");    
});
builder.Services.AddHangfireServer();
// HF - End

var app = builder.Build();

// Configure the HTTP request pipeline.
if (!app.Environment.IsDevelopment())
{
    app.UseExceptionHandler("/Home/Error");
    // The default HSTS value is 30 days. You may want to change this for production scenarios, see https://aka.ms/aspnetcore-hsts.
    app.UseHsts();
}
. . .
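As an aside, Hangfire also ships with a dashboard UI; it isn’t required for this example, but a quick sketch of mapping it into the pipeline looks like this:

```csharp
// Serves the Hangfire dashboard UI at /hangfire,
// showing queued, scheduled and failed jobs
app.UseHangfireDashboard();
```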

That’s actually it – that’s all you need to add Hangfire! However, it won’t, technically speaking, do anything yet. To test it, edit HomeController.cs:

        public IActionResult Index()
        {
            var id = BackgroundJob.Enqueue(() => DoSomething());
            return View();
        }

        public void DoSomething()
        {
            Console.WriteLine("Test");
            Console.WriteLine($"test2 {Guid.NewGuid()}");
        }
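Beyond fire-and-forget jobs like the one above, Hangfire also supports delayed and recurring jobs; a quick sketch (the delay and the “nightly-job” id here are just illustrative):

```csharp
// Delayed: run once, five minutes from now
var delayedId = BackgroundJob.Schedule(() => DoSomething(), TimeSpan.FromMinutes(5));

// Recurring: run every day at midnight (Cron.Daily() returns the cron expression)
RecurringJob.AddOrUpdate("nightly-job", () => DoSomething(), Cron.Daily());
```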

When you run the project, you should see Hangfire fire up and start printing output to the console. Don’t be too impatient – it can take a few seconds before the test output appears.

References

https://github.com/HangfireIO/Hangfire/issues/1025

https://docs.microsoft.com/en-us/aspnet/core/fundamentals/routing?view=aspnetcore-6.0

https://docs.hangfire.io/en/latest/getting-started/index.html

Manually Parsing a Json String Using System.Text.Json

In this post from 2016 I gave some details as to how you could manually parse a JSON string using Newtonsoft JSON.NET. What I don’t think I covered in that post was why you might wish to do that; nor did I cover how you could do it using System.Text.Json (although, since that library was only introduced in .Net Core 3, circa 2019, that part couldn’t really be helped!)

Why?

Let’s start with why you would want to parse a JSON string manually – I mean, the serialisation functions in, pretty much, any JSON library are really good.

The problem is coupling: by using serialisation, you’re coupling your data to a given shape, and very tightly coupling it to that shape, too. So much so that a common pattern, if you’re passing data between two services and using serialisation, is to share the model class between the services. This sounds quite innocuous at first glance, but let’s consider a few factors (I’m assuming we’re talking exclusively about .Net, but I imagine the points are valid outside of that, too):

1. Why are you serialising and de-serialising the data to begin with?
2. Single Responsibility Principle.
3. Microservices.

Let’s start with (1). Your answer may be different, but typically, if you’re passing data as a string, it’s because you’re trying to remove a dependency to a given complex type. After all, a string can be passed anywhere: an API call, a message broker, even held in a database.

What has this got to do with the SRP (2)? Well, the SRP is typically used to describe the reason that a module has to change (admittedly it is slightly mis-named). Consider the case where the two services interact via a common module containing the shared model.

Both services now depend on a single dependency that is external to the service. If the CustomerModel changes, then both services may also need to change; but they also need to change for alterations to business rules that relate to the service itself: so they now have two reasons to change.

Of course, you don’t have to have a common dependency like this; you could instead give each service its own copy of the CustomerModel.

However, that doesn’t solve your problem – in fact, you arguably make it worse: if you change the CustomerModel referenced by Service 1, you potentially break Service 2, so you now need to change the CustomerModel referenced by Service 2, and Service 2 itself!

Just to clarify what I’m saying here: there may be valid reasons for both of these designs – but if you use them, then be aware that you’re coupling the services to each other, which brings us to point (3): if you’re trying to create a Service Oriented Architecture of any description, then this kind of coupling may hinder your efforts.

The Solution

A quick caveat here: whatever you do in a system, the parts of that system will be coupled to some extent. For example, if you have built a Microservice Architecture where your system is running a nuclear reactor, and then you decide to change one of the services from monitoring the cooling tower to, instead, mining bitcoin, you’re going to find that there is an element of coupling. Some of that will be business coupling (i.e. the cooling tower may overheat), but some will be technical: a downstream service may depend on the monitoring service to assert that something is safe.

Apologies, I know absolutely nothing about nuclear reactors; or, for that matter, about mining bitcoin!

All that said, if you manually parse the JSON that’s received, you remove some dependency on the shape of the data.

The following reads a JSON document, and iterates through an array:

            using var doc = JsonDocument.Parse(json);
            var element = doc.RootElement;

            foreach (var eachElement in element.EnumerateArray())
            {
                string name = eachElement.GetProperty("Name").GetString();
                decimal someFigure = eachElement.GetProperty("SomeFigure").GetDecimal();

                if (someFigure > 500)
                {
                    Console.WriteLine($"{name} has more than 500!");
                }
            }

As you can see, if the property name of SomeFigure changed, the code would break; however, there may be a dozen more fields in each element that could change, and we wouldn’t care.
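If you want to be even more defensive about shape changes, System.Text.Json’s JsonElement also exposes TryGetProperty and TryGetDecimal, so a missing or mis-typed field need not throw; a sketch of the same loop written that way:

```csharp
foreach (var eachElement in element.EnumerateArray())
{
    // TryGetProperty avoids an exception if the field is absent
    if (eachElement.TryGetProperty("SomeFigure", out var figureElement)
        && figureElement.TryGetDecimal(out decimal someFigure)
        && someFigure > 500)
    {
        string name = eachElement.TryGetProperty("Name", out var nameElement)
            ? nameElement.GetString() ?? "(unknown)"
            : "(unknown)";
        Console.WriteLine($"{name} has more than 500!");
    }
}
```

Whether silently skipping a malformed element is the right behaviour depends, of course, on your use case; sometimes a loud failure is exactly what you want.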

Compiling Code Files in C# Using Roslyn

I started investigating this for a unit test. I wanted to compile a separate project and then copy the output into the test directory. As it happens, by the time I’d figured this out in Roslyn, I’d worked out it probably wasn’t right for the test that I was writing; however, given the amount of effort involved in piecing together the tiny amount of documentation available, I definitely thought it was worthy of documenting.

This particular post will focus on reading code files and compiling them into an executable or library. I may play around with this some more in the future: I suspect this might have quite a lot of mileage in the unit test space.

Dependencies

Let’s start with the dependencies:

    <PackageReference Include="Microsoft.CodeAnalysis.CSharp" Version="3.11.0" />
    <PackageReference Include="Microsoft.CodeDom.Providers.DotNetCompilerPlatform" Version="3.6.0" />

It’s worth bearing in mind that there are a few other libraries knocking about from previous incarnations of Roslyn, and similar tools; however, it’s my understanding that these are not in the Microsoft namespace. I probably spent the first couple of hours trying to get this to work with CSharpCodeProvider. If you found your way here searching for this error:

System.PlatformNotSupportedException: ‘Operation is not supported on this platform.’

Then you’re in good company.

Code

The following code will produce a library called qwerty.dll; it requires two variables: sourceFilesLocation, giving the location of the source files, and destinationLocation:

// 1
DirectoryInfo d = new DirectoryInfo(sourceFilesLocation);
string[] sourceFiles = d.EnumerateFiles("*.cs", SearchOption.AllDirectories)
    .Select(a => a.FullName).ToArray();

// 2
List<SyntaxTree> trees = new List<SyntaxTree>();
foreach (string file in sourceFiles)
{
    string code = File.ReadAllText(file);
    SyntaxTree tree = CSharpSyntaxTree.ParseText(code);
    trees.Add(tree);
}

// 3
MetadataReference mscorlib =
    MetadataReference.CreateFromFile(typeof(object).Assembly.Location);
MetadataReference codeAnalysis =
    MetadataReference.CreateFromFile(typeof(SyntaxTree).Assembly.Location);
MetadataReference csharpCodeAnalysis =
    MetadataReference.CreateFromFile(typeof(CSharpSyntaxTree).Assembly.Location);

MetadataReference[] references = { mscorlib, codeAnalysis, csharpCodeAnalysis };

// 4
var compilation = CSharpCompilation.Create("qwerty.dll",
    trees,
    references,
    new CSharpCompilationOptions(OutputKind.DynamicallyLinkedLibrary));
var result = compilation.Emit(Path.Combine(destinationLocation, "qwerty.dll"));

Let’s break the code down (you may notice that there are 4 comments that, to the untrained eye, appear useless).

1. The first part scans the source files directory and returns a list of files.
2. We then build a syntax tree list by reading all the files and calling the ParseText method on each; they are added to a list of parsed files.
3. We now build up a list of references needed to compile the code.
4. Finally, we create the DLL, passing in the trees and references that we built up in 2 & 3. Create produces an in-memory version of the DLL, so we then need to call Emit to write it to disk.
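One thing worth adding to step 4: Emit returns an EmitResult, so you can check whether the compilation actually succeeded and surface any errors, rather than discovering later that the DLL is broken; a sketch:

```csharp
if (!result.Success)
{
    // Report compile errors (warnings are filtered out here)
    foreach (var diagnostic in result.Diagnostics
        .Where(d => d.Severity == DiagnosticSeverity.Error))
    {
        Console.WriteLine(diagnostic.ToString());
    }
}
```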

Summary and caveats

This is a very simplistic example – as I’ve said, I need to spend some more time looking at some of the more advanced cases, but if you have a very simple library, this will compile and produce the DLL for you.

References

https://stackoverflow.com/questions/54364785/c-sharp-compile-c-sharp-code-at-runtime-with-roslyn

https://blog.dangl.me/archive/integration-testing-in-memory-compiled-code-with-roslyn/

https://www.tugberkugurlu.com/archive/compiling-c-sharp-code-into-memory-and-executing-it-with-roslyn

https://docs.microsoft.com/en-us/dotnet/csharp/roslyn-sdk/get-started/syntax-transformation

https://stackoverflow.com/questions/32769630/how-to-compile-a-c-sharp-file-with-roslyn-programmatically