The .NET community has just seen another open-source drama. FluentAssertions, a popular library that provides a natural and easy-to-read syntax for unit tests, has suddenly changed its license. Starting with version 8, you must pay $130 per developer if you use it in commercial projects.

Group of people deciding how to monetize open-source project without making another drama

Reading the related discussions on GitHub and Reddit is painful. Most people only complain about paying for something that has always been free, completely ignoring the effort the authors have put into the library; constructive criticism is rare.

Every open-source project needs significant investment to remain viable long-term. Keeping a project healthy and reliable requires thousands of hours from its maintainers. It is perfectly OK to want to get paid for that work. The key question is how to do it without losing trust.

We saw what happened when the author of Moq published an update that collected the e-mail addresses of developers using the library. I don't think the original intent was to do any harm. Still, the idea of a library that hooks into the build pipeline, collects information from developer machines, and sends it somewhere was far beyond the red line for most users.

With FluentAssertions, we can see a similar story. The NuGet package name is the same, and you may not even notice that version 8 uses a different license when upgrading. Some people reference package versions with wildcards, which upgrades the dependencies without a single change in the code. Yes, you can stay on version 7 forever, but the risk of accidentally upgrading the package is quite high. Don't tell me you have never clicked the "upgrade all NuGet packages in the solution" option.
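For illustration, this is what such a floating (wildcard) version reference looks like in a project file — the exact package reference here is just an example of the pattern:

```xml
<!-- "*" floats to the latest stable version on every restore,
     silently crossing the version 7 → 8 license boundary -->
<PackageReference Include="FluentAssertions" Version="*" />
```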

I think that such a fundamental change deserves publishing the NuGet package under a different name. Also, I believe the change was not announced early enough, and it came as a surprise to most of the community. Whether we like it or not, paying for open-source projects is a sensitive topic, and communicating things clearly and in advance would significantly reduce the frustration.

I still think that the only viable business model for OSS is to have an open-source core component surrounded by commercial add-ons. The core component must be free, and it must be clearly promised that it will stay free forever. No catches, no tricks. This is the only way to avoid losing the trust of the users. We took this strategy with DotVVM 10 years ago, and I have never regretted it. We managed to acquire many loyal customers over the years, and the commercial products helped the project become sustainable.

If you maintain an open-source project, I understand you need to get paid for that. I need it, too. But please be careful when transitioning to a commercial model. Every drama like this with FluentAssertions damages the overall trust in all open-source libraries.

Today, I ran into unusual behavior in an ASP.NET Core application deployed to Azure App Service. It could not find the connection string, even though it was present in the Connection Strings pane of the App Service in the Azure portal.

This is what the screen looked like:

Screenshot of the connection string in Azure portal

The application was accessing the connection string using the standard ASP.NET Core configuration API, as shown in the following code snippet:

    services.AddNpgsqlDataSource(configuration.GetConnectionString("aisistentka-db"));

Naturally, everything worked as expected locally, but when I deployed the app to Azure, it did not start, failing with the exception “Host can’t be null.”

When diagnosing this kind of issue, it is a good idea to start with the Kudu console (located at https://your-site-name.scm.azurewebsites.net). A quick check of the environment variables usually shows what is wrong.

Every connection string should be passed to the application as an environment variable. Normally, ASP.NET Core’s GetConnectionString method looks for the ConnectionStrings:connectionStringName configuration key (which usually lives in the appsettings.json file or in User Secrets). Since environment variable names cannot contain colons, the colons are replaced with double underscores – the .NET configuration system treats both separators as equal.
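A minimal sketch of that mapping rule (this illustrates the behavior; it is not the actual provider source):

```csharp
using System;

class EnvKeyDemo
{
    // The .NET configuration system treats "__" in environment variable
    // names as the ":" section separator used in configuration keys.
    public static string NormalizeKey(string envVarName) =>
        envVarName.Replace("__", ":");

    static void Main()
    {
        // GetConnectionString("aisistentka-db") looks for exactly this key:
        Console.WriteLine(NormalizeKey("ConnectionStrings__aisistentka-db"));
        // → ConnectionStrings:aisistentka-db
    }
}
```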

However, the type field in the Azure portal (you can see it in the picture at the beginning of the article) has a special behavior: it controls how the environment variable names are composed. In the case of PostgreSQL, the resulting variable name is POSTGRESQLCONNSTR_aisistentkadb. As you can see, instead of the ConnectionStrings__ prefix, the prefix is POSTGRESQLCONNSTR_, and the dash from the connection string name is removed.

This was a bit unexpected for me. The GetConnectionString method cannot see the variable, yet when I use the type “SQL Server”, the same approach works (though dashes in connection string names still do not). How is this possible?

I looked into the source code of .NET and found out that there is special treatment for these Azure prefixes, but not all of them are included. Only the SQL Server, SQL Azure, MySQL, and Custom types are supported. All other options produce an environment variable name that the application will not find.

    /// <summary>
    /// Provides configuration key-value pairs that are obtained from environment variables.
    /// </summary>
    public class EnvironmentVariablesConfigurationProvider : ConfigurationProvider
    {
        private const string MySqlServerPrefix = "MYSQLCONNSTR_";
        private const string SqlAzureServerPrefix = "SQLAZURECONNSTR_";
        private const string SqlServerPrefix = "SQLCONNSTR_";
        private const string CustomConnectionStringPrefix = "CUSTOMCONNSTR_";
...

The solution was to use the Custom type and remove the dash from the connection string name.
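To make the behavior concrete, here is a rough sketch of the rewriting the provider performs — an illustration of the rule, not the actual .NET source:

```csharp
using System;

class AzurePrefixDemo
{
    // Only these Azure connection string prefixes are recognized by the
    // environment variables configuration provider; anything else
    // (e.g. POSTGRESQLCONNSTR_) falls through unchanged.
    static readonly string[] KnownPrefixes =
    {
        "SQLCONNSTR_", "SQLAZURECONNSTR_", "MYSQLCONNSTR_", "CUSTOMCONNSTR_"
    };

    public static string ToConfigKey(string envVarName)
    {
        foreach (var prefix in KnownPrefixes)
        {
            if (envVarName.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
                return "ConnectionStrings:" + envVarName.Substring(prefix.Length);
        }
        // Unknown prefixes only get the generic "__" → ":" replacement,
        // so the key never ends up under the ConnectionStrings section.
        return envVarName.Replace("__", ":");
    }

    static void Main()
    {
        Console.WriteLine(ToConfigKey("CUSTOMCONNSTR_aisistentkadb"));
        // → ConnectionStrings:aisistentkadb
        Console.WriteLine(ToConfigKey("POSTGRESQLCONNSTR_aisistentkadb"));
        // → POSTGRESQLCONNSTR_aisistentkadb (not found by GetConnectionString)
    }
}
```

With the Custom type and the dash removed, the variable becomes CUSTOMCONNSTR_aisistentkadb, which maps to the ConnectionStrings:aisistentkadb key that GetConnectionString can find.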

I once heard, “If you fear something, learn about it, disassemble it into the tiniest pieces, and the fear will just go away.”

Well, it didn’t work. I read a book about building an LLM from scratch, which helped me understand the model's architecture and how it works inside. However, I am still concerned about the power of AI models and the risks our world may face in the future. Although we still don't understand many fundamental concepts of how the human brain works, and some scientists say we are not even close to human-level intelligence, I am still a bit worried about the scale and speed at which new technologies emerge. Many scientific discoveries were not achieved by logical thinking or inference but by mistake or trial and error. Spawning millions of model instances and automating them to make “random” experiments to discover something new doesn’t seem that impossible to me.

The book shows how to build the smallest version of GPT-2 in Python and load the model weights published by OpenAI. By the way, GPT-3 has the same architecture; the model is just scaled to a larger number of parameters.

I was curious whether this could be done in C#, and I found the TorchSharp library. It is a wrapper for the native libraries used by PyTorch. The API was intentionally kept as close to Python as possible, so the code does not look like .NET at all. But that makes the library easy to learn and use, since the vast majority of examples are in Python. What surprised me is that the actual LLM implementation in C# has only about 200 lines of code. All the magic is in the model weights. PyTorch/TorchSharp provides a very nice abstraction over the primitives from which deep neural networks are composed.
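To give a flavor of the Python-like API, here is a tiny hand-written sketch of one GPT-2-style feed-forward block in TorchSharp (the dimensions follow the smallest GPT-2; this is my own illustration, not code from the book):

```csharp
using System;
using TorchSharp;
using static TorchSharp.torch;

// One feed-forward block: expand 768 → 3072, apply GELU, project back.
var ffn = nn.Sequential(
    nn.Linear(768, 3072),
    nn.GELU(),
    nn.Linear(3072, 768));

// A batch of one 768-dimensional "token embedding".
var x = randn(1, 768);
var y = ffn.forward(x);

Console.WriteLine(string.Join("x", y.shape)); // shape stays 1x768
```

Apart from casing, this is almost line-for-line the same code you would write in PyTorch, which is exactly what makes the library approachable.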

I was wondering whether it would make sense to do a session about it, for example, at our MeetUpdate. The problem is that I am not an AI scientist, and the topic is hard. First, although I think I understand all the crucial aspects and would be able to explain what is going on, there are still many things I have practically no experience with. Second, understanding the session requires at least some knowledge of how neural networks work and the basics of linear algebra, and I am not sure what the experience of the audience would be. And finally, I would be speaking about something that is not my creation at all - it would merely be a description of things others have invented, and my added value would only be in trying to explain it in a short meetup session.

On the other hand, playing with it was really fun, and maybe it can motivate someone to start learning more about ML and neural networks.

Shall I do it?

Crazy developer explains LLM internals to the students

My new book, “Modernizing .NET Web Applications,” is finally out - available for purchase in both printed and digital versions. If you are interested in getting a copy, see the new book’s website.

A pile of copies of my new book Modernizing .NET Web Applications

It was a challenging journey for me, but I liked every moment of it, and it is definitely not my last book. I just need to bump into another topic and get enthusiastic about it (which seems not to be that hard).

I finished the manuscript on the last day of June, but some things had already changed by the time the book was published. For example, the System.Data.SqlClient package became deprecated. Additionally, all samples in the book use .NET 8, but .NET 9 is just around the corner, bringing many new and interesting features and performance improvements. Although they do not primarily target modernization, they are highly relevant - they constitute one of the strongest arguments for moving away from legacy frameworks. Chapter 2 is dedicated to the benefits of the new versions of .NET, and it is one of the longest chapters of the book. Why else would you modernize if not to use the new platform features?

To ensure you can stay updated, I’ve set up a newsletter where I’ll regularly post all the news concerning the modernization of .NET applications.

The book had an official celebration and announcement at Update Conference Prague, and I’d like to thank all the people around me who helped the book to happen.

I just returned from Microsoft Ignite in Chicago - the largest tech conference focused on Microsoft technology. Unsurprisingly, it was mainly about AI, and it was a terrific experience.

I got the chance to be part of the Expert Meetup zone. For each of the ten main areas of interest, there were several booths where the attendees could ask any questions. I was at the booth focused on Serverless in Azure, so we discussed Azure Functions, Azure Container Apps, and .NET Aspire, my favorite dev tool of recent months. I met many great people from the product groups who stand behind these technologies.

Since Ignite is not primarily a developer conference, most sessions were a bit further from my technical interest. However, being the CEO of RIGANTI, I want to understand how enterprises around the world implement AI, which use-cases are relevant to them, and what challenges they face. These insights greatly help us steer our company strategy and give better advice to our clients about integrating AI into their businesses.

I also attended an interesting session about the modernization of legacy systems, a topic similar to my recently published book. When speaking about this topic, I have always specialized purely in .NET Framework applications, since that is what I encounter most of the time. However, this session went much further into the past - most of it was about mainframes and ancient languages like COBOL or Smalltalk. It showed how to use AI to analyze and reverse-engineer the original intent of ancient code routines, how to extract business rules, how to build missing or incomplete documentation, and how to generate test cases to ensure the new code provides the same behavior and functionality. This was a very different approach from the strictly deterministic, non-AI-based approach presented in my book, and I got plenty of new ideas.

Another very interesting session was about using Azure Kubernetes Service at scale. The team has been adding an impressive number of features, focusing on making Kubernetes easy to use and enhancing security - for example, through container vulnerability scanning and patching. I was amazed to see how many features and improvements they have delivered recently. Apparently, AKS is a very popular service, and the product group has the opportunity to do many interesting things.

And as always, I couldn't miss Mark Russinovich's session on Azure infrastructure. It conflicted with my Expert Meetup sessions, so I had to watch the recording during my flight back, but it is thrilling to see the effort Microsoft puts into its data centers to make them energy efficient.

Tomas Herceg at Microsoft Ignite in front of GitHub logo