Logging in Windows Azure can be done through Windows Azure Diagnostics. This solution collects a large amount of detailed data that can be hard to sift through. I recently needed a close-to-real-time trace of what my Roles were doing. My current project has many instances running many independent services in parallel, which makes tracing with Windows Azure Diagnostics a challenge. Log4Net and Enterprise Library offer amazing tools to accomplish what I’m after, but they produce so much detail and data that we often need to resort to parsing tools and third-party applications to extract meaningful information. I needed something quick, lightweight and cheap to operate.
At first, I was trying to follow what my instances were up to using the Windows Azure Compute Emulator. This wasn’t what I was looking for, because local environments don’t run exactly like the production or staging environments on the cloud. I spent a few minutes thinking about logging and costs related to Windows Azure Storage transactions and came up with the solution described below.
The code from this Post is part of the Brisebois.WindowsAzure NuGet Package
To install Brisebois.WindowsAzure, run the following command in the Package Manager Console
PM> Install-Package Brisebois.WindowsAzure
A sample project containing the log viewer can be found in the GitHub repository "Windows Azure Logger".
The Logger is a static class that accumulates log entries and inserts them into a Windows Azure Table Storage Service in batches. Keeping operational costs to a minimum is achieved by inserting entries in batches of 100 per Table Partition.
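To give an idea of the mechanics, here is a rough sketch of how such a batching table logger could be put together with the Windows Azure Storage client library. This is not the package’s actual implementation; the LogEntity and SketchLogger names are mine, and error handling is omitted.

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure.Storage.Table;

// Hypothetical entity; the real package defines its own entry type.
public class LogEntity : TableEntity
{
    public LogEntity() { }

    public LogEntity(string service, string evt, string details)
    {
        PartitionKey = service;                // service name = table partition
        RowKey = Guid.NewGuid().ToString("N"); // unique row key per entry
        Event = evt;
        Details = details;
        Created = DateTime.UtcNow;
    }

    public string Event { get; set; }
    public string Details { get; set; }
    public DateTime Created { get; set; }
}

public static class SketchLogger
{
    private static readonly ConcurrentQueue<LogEntity> buffer = new ConcurrentQueue<LogEntity>();

    public static void Add(string service, string evt, string details)
    {
        buffer.Enqueue(new LogEntity(service, evt, details));
    }

    public static void Persist(CloudTable table)
    {
        // Drain the in-memory buffer.
        var entries = new List<LogEntity>();
        LogEntity entry;
        while (buffer.TryDequeue(out entry))
            entries.Add(entry);

        // A Table Storage batch may only target a single partition and
        // may contain at most 100 entities, hence the grouping below.
        foreach (var partition in entries.GroupBy(e => e.PartitionKey))
            foreach (var chunk in partition.Select((e, i) => new { e, i })
                                           .GroupBy(x => x.i / 100, x => x.e))
            {
                var batch = new TableBatchOperation();
                foreach (var item in chunk)
                    batch.Insert(item);
                table.ExecuteBatch(batch);
            }
    }
}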
Adding Log Entries
Logger.Add("ServiceName", "EventName", "Details"); Logger.Add("Worker", "Start", DateTime.UtcNow.ToString(CultureInfo.InvariantCulture)); Logger.Add("Worker", "Sleep", "for 1 seconds"); Logger.Add("WebFront", "Error", exception.ToString()); Logger.Add("Worker", "Stop", DateTime.UtcNow.ToString(CultureInfo.InvariantCulture));
Persisting Accumulated Log Entries
Accumulated log entries are persisted in two ways: the logger persists automatically when certain thresholds are reached, or you can force it to persist through code (a minimal sketch of the trigger check follows the list below).
Persistence happens when one of these conditions is satisfied:
- The logger has accumulated 100 messages
- There is a 20-second gap between the current message and the previous message
- The Logger is forced to persist through code
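As a rough illustration of the first two conditions (not the package’s actual code), the trigger check could look something like this, assuming the logger tracks when the previous message was added; the member names are assumptions.

using System;

internal static class PersistenceTrigger
{
    private static DateTime lastAdd = DateTime.UtcNow;

    public static bool ShouldPersist(int bufferedCount)
    {
        var now = DateTime.UtcNow;
        var gap = now - lastAdd;
        lastAdd = now;

        // Persist after 100 buffered messages, or after a 20-second gap
        // between the current message and the previous one.
        return bufferedCount >= 100 || gap >= TimeSpan.FromSeconds(20);
    }
}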
Forcing persistence through code
Logger.Persist(true);
As a best practice, I strongly recommend forcing the log to persist when your Worker Roles and Web Roles stop.
public override void OnStop()
{
    Logger.Add("Worker", "Stop", DateTime.UtcNow.ToString(CultureInfo.InvariantCulture));
    Logger.Persist(true);

    //This is a delay to allow the service to stop gracefully.
    Thread.Sleep(TimeSpan.FromMinutes(3));

    base.OnStop();
}
Log Viewer
The log viewer is an MVC 4-based page that refreshes every few seconds. This is quite practical when you want to follow what’s happening in your environments. This page uses the following code to query the Windows Azure Table Storage Service for the latest entries.
public async Task<ActionResult> Index()
{
    //Add your service names here
    var partitions = new[] { "Worker", "WebFront" };

    var entries = await TableStorageLogger.Logger.Get(10, partitions);

    return View(entries.OrderByDescending(e => e.Created));
}
Be sure to add the service names you use when logging to the array of partitions. These are used when the Logger queries the Windows Azure Table Storage Service for entries. Service names serve as table partitions, which lets you read entries for a subset of the available services and concentrate your efforts when debugging.
Configuration
Be sure to set your Cloud Storage Account connection string in your Web.config or in your Role cloud configuration.
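If it helps, this is roughly how a Role can resolve that connection string at runtime with CloudConfigurationManager. The "StorageConnectionString" setting name below is an assumption; use whatever key the package actually expects.

using Microsoft.WindowsAzure;          // CloudConfigurationManager
using Microsoft.WindowsAzure.Storage;  // CloudStorageAccount

public static class StorageConfig
{
    public static CloudStorageAccount GetAccount()
    {
        // "StorageConnectionString" is an assumed setting name; it must match
        // the key defined in Web.config or in the Role's cloud configuration.
        var connectionString = CloudConfigurationManager.GetSetting("StorageConnectionString");
        return CloudStorageAccount.Parse(connectionString);
    }
}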
Summary
I have been testing this solution for a while now and have been getting great results. It has helped me identify and fix many performance issues. Using this kind of logging has its drawbacks and its benefits. One of the major benefits I’ve identified so far is that it’s easy to read, easy to use and easy to clean up.
This logger may be too simple for production diagnostics, but it’s great for development purposes! It allows me to monitor my services in near real-time from my computer or other browser-enabled devices. I’m able to follow my test runs during my nightly bus rides.
Take some time to test it out, leave your comments and if something is off or missing please submit pull requests on GitHub.
Get the code from https://github.com/brisebois/WindowsAzureLogger