Logging using Serilog and ELK stack

Logging is one of the key concerns of software developers. Logs are a source of information about application health and performance, and from a business point of view they help analyze application usage and understand KPIs.
I'm going to share an example of structured logging in .NET Core applications using Serilog, and of ingesting the log data into the ELK stack (Elasticsearch, Logstash, Kibana) for analysis.

Generating Logs

Let's get started with an application that generates structured log events. This is a straightforward task when using the right tools, such as Serilog. To start, let's create a new project using dotnet new console. Then we'll add a couple of Serilog packages:

  • Serilog
  • Serilog.Sinks.Http
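Assuming the .NET Core CLI is installed, the setup above can be run from a shell (the project name here is arbitrary):

```shell
# create a new console project
dotnet new console -n CustomerLogs
cd CustomerLogs

# add Serilog and its HTTP sink
dotnet add package Serilog
dotnet add package Serilog.Sinks.Http
```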

Serilog is a great logging framework with a simple API. It maintains an internal, sink-agnostic representation of log events and serializes them based on the message template syntax.
The API has a concept of sinks, which implement the transport that gets logs to various receivers, such as the console, files, databases, and services. There are lots of sinks implemented; please check the list of available sinks.

In this example, we are going to use Serilog.Sinks.Http. It buffers log events and sends them out to an endpoint; in our case, that endpoint is the Logstash HTTP input plugin listening for messages on port 8080.

Serilog API configuration looks like this:

var log = new LoggerConfiguration()
    .WriteTo.Console()
    .WriteTo.Http("http://localhost:8080")
    .CreateLogger();

Our application generates some fake customer registrations and logs this data continuously.

while (true)
{
    var customer = Customer.Generate();                                
    log.Information("{@customer} registered", customer);                
    Thread.Sleep(1000);
}
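The Customer class itself isn't shown here; a minimal, hand-rolled sketch is below. The property names mirror the log output that follows, but Generate and its sample data are hypothetical — a library such as Bogus would typically produce richer fake values.

```csharp
public class Customer
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string SSNumber { get; set; }

    private static readonly Random Rng = new Random();
    private static readonly string[] FirstNames = { "Lourdes", "Desmond", "Tressa" };
    private static readonly string[] LastNames = { "Kreiger", "Balistreri", "King" };

    // Produce a fake customer with a random name and SSN-shaped number.
    public static Customer Generate()
    {
        return new Customer
        {
            FirstName = FirstNames[Rng.Next(FirstNames.Length)],
            LastName = LastNames[Rng.Next(LastNames.Length)],
            SSNumber = $"{Rng.Next(100, 999)}-{Rng.Next(10, 99)}-{Rng.Next(1000, 9999)}"
        };
    }
}
```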

This code will produce logs serialized in JSON:

...
[13:56:02 INF] {"FirstName": "Lourdes", "LastName": "Kreiger", "SSNumber": "350-11-7869", "$type": "Customer"} registered
[13:56:03 INF] {"FirstName": "Desmond", "LastName": "Balistreri", "SSNumber": "929-58-1854", "$type": "Customer"} registered
...

So far we've got simple logging of customer data to the console and to the HTTP endpoint running on http://localhost:8080.
Next, we are going to set up ELK and start processing the logs.

Processing logs in ELK

The ELK stack built by Elastic includes:

  • Elasticsearch
  • Logstash
  • Kibana

Elasticsearch is a highly scalable open-source engine capable of storing and analyzing high volumes of text data; its main use cases are full-text search and logging. Logstash is handy for collecting data, transforming it, and feeding it into Elasticsearch. Once the data is there, we can run searches and aggregations to mine any information of interest. Kibana is good for building visual dashboards on top of the data in Elasticsearch. Together, these make ELK well worth considering for log processing.

So let's get the stack up and running. One of the easiest ways to do this is to use the ELK Docker images. I'm assuming the reader is familiar with the concept of containerised applications and with Docker, as we are not going to dive into that here.
We will use a repository on GitHub that provides ELK v6.0 Docker images. With the help of docker-compose, starting ELK locally is just a matter of a single command (check the details at the link above). Indeed, Docker is a great time saver. Let's slightly customize the Logstash Docker image by adding an instruction to install the logstash-input-http plugin. Once ELK is running, we'll add a Logstash pipeline for reading, transforming, and sending the logs to Elasticsearch.
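The Logstash image customization can be as small as a two-line Dockerfile; the base image tag below is an assumption matching the ELK 6.0 setup:

```dockerfile
# extend the stock Logstash image with the HTTP input plugin
FROM docker.elastic.co/logstash/logstash:6.0.0
RUN logstash-plugin install logstash-input-http
```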

The pipeline is as below:

# Http input listening port 8080
input {
	http {	
		#default host 0.0.0.0:8080
		codec => json
	}
}

# Separate the logs
filter {
 	split {
		field => "events"
		target => "e"
		remove_field => "events"
	}
}

# Send the logs to Elasticsearch
output {
	elasticsearch {
		hosts => "elasticsearch:9200"
		index => "customer-%{+xxxx.ww}"
	}
}

There are three sections defined.

  • Input. The HTTP input plugin listens for events on port 8080 and uses the JSON codec to deserialize the data. All currently supported input plugins are listed here
  • Filter. Filter plugins are designed for data transformation. We define just a 'split' filter for the logs, because the Serilog.Sinks.Http sink sends logs in batches.
  • Output. We have defined Elasticsearch as the output. The plugin transmits our logs to the Elasticsearch index 'customer-%{+xxxx.ww}'. The supported output plugins are listed here
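To see why the split filter is needed, note that Serilog.Sinks.Http posts each batch wrapped in an events array, roughly like this (values abridged):

```json
{
  "events": [
    { "Timestamp": "2018-01-25T22:15:17+00:00", "Level": "Information", "MessageTemplate": "{@customer} registered" },
    { "Timestamp": "2018-01-25T22:15:18+00:00", "Level": "Information", "MessageTemplate": "{@customer} registered" }
  ]
}
```

The split filter emits one Logstash event per array element, placing each under the e field and removing the original events wrapper.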

After restarting Logstash, our logs are being processed correctly.

Let's launch the Kibana UI running on http://localhost:5601 and check the logs. When we start the console application with the Kibana Discover window open and auto-refresh enabled, new log events will keep arriving. (Screenshot: the Kibana Discover view, 2018-01-25.)
An example of log event sent by Serilog is below:

...
"e": {
      "MessageTemplate": "{@customer} registered",
      "Level": "Information",
      "RenderedMessage": "Customer { FirstName: \"Tressa\", LastName: \"King\", SSNumber: \"702-51-3003\" } registered",
      "Properties": {
        "customer": {
          "LastName": "King",
          "SSNumber": "702-51-3003",
          "FirstName": "Tressa",
          "_typeTag": "Customer"
        }
      },
      "Timestamp": "2018-01-25T22:15:17.5280131+00:00"
    }
...

Here you can see how Serilog serializes logged entities.
In Kibana we can query the index pattern 'customer-*' and visualize our logs. Kibana has a rich toolset for building charts, which is out of the scope of this article.
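Outside Kibana, the same data can be checked directly against the Elasticsearch REST API; the query below assumes the stack is running locally, and the field path follows the sample document shown earlier:

```shell
# count registrations for a given last name across all weekly indices
curl -s 'http://localhost:9200/customer-*/_count' \
     -H 'Content-Type: application/json' \
     -d '{ "query": { "match": { "e.Properties.customer.LastName": "King" } } }'
```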

Summary

We now have a scalable logging solution for .NET applications built on the ELK stack and structured logging. In my opinion, this is a solid basis for enterprise-level logging. Please try it out yourself on GitHub and let's discuss.

Roman Pavlov

Software Developer building payment solutions

London, UK