
Logging with Elasticsearch & Enterprise Library

25 Feb 2019
In this tutorial, we will see an example of JSON-formatted logging with the Microsoft Enterprise Library: we will send the logs to Elasticsearch with Filebeat and use Kibana to view them.

Introduction

Elasticsearch is one of the best open-source search engines available today. It also has great abilities as a NoSQL document database, which makes it a great tool for application logging.

Today, we will learn how to write our logs to a rolling file, send them to Elasticsearch with Filebeat, and view them in a beautiful way with Kibana.

Prerequisites

  • Logging with the Microsoft Enterprise Library (we will use it for this example, but the same can be done with other libraries, such as NLog)
  • An Elasticsearch cluster installed and ready
  • Filebeat + Kibana installed (optional: the Elasticsearch 'head' Chrome extension)

Agenda

  1. Add a new listener to entlib.config
  2. Create a custom formatter to write JSON logs
  3. Configure Filebeat to send the logs to Elasticsearch
  4. View the logs in Kibana

1. Add New Listener to entlib.config

Create a new listener of type RollingFlatFileTraceListener.

Name it Json TraceListener.

Use the new formatter Json Text Formatter.

Make sure the header and footer are empty.

Example:

<add fileName="D:\Logs\JsonLogs\rolling.log" footer=""
     formatter="Json Text Formatter" header=""
     rollFileExistsBehavior="Increment" rollInterval="Day" rollSizeKB="50000"
     timeStampPattern="yyyy-MM-dd"
     listenerDataType="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.RollingFlatFileTraceListenerData, Microsoft.Practices.EnterpriseLibrary.Logging, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
     traceOutputOptions="None" filter="All"
     type="Microsoft.Practices.EnterpriseLibrary.Logging.TraceListeners.RollingFlatFileTraceListener, Microsoft.Practices.EnterpriseLibrary.Logging, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
     name="Json TraceListener" />
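Once this listener is referenced by a logging category in entlib.config (the category wiring is not shown in this article), application code logs through the Enterprise Library as usual. A minimal, hypothetical call might look like this (the "General" category name is an assumption):

```csharp
using System.Diagnostics;
using Microsoft.Practices.EnterpriseLibrary.Logging;

var entry = new LogEntry
{
    Message = "Payment processed for order 1234",
    Severity = TraceEventType.Information
};
entry.Categories.Add("General"); // assumed category routed to "Json TraceListener"

Logger.Write(entry);
```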

2. Create a Custom Formatter to Write JSON Logs

By default, Filebeat sends logs in JSON format, treating each newline character (\n) as the boundary between log entries.

Therefore, we want to write each log entry on a single line, and to remove all extra whitespace characters when writing each row.

We will do this by creating a new formatter, which includes a new method for cleaning white space:

<add template="{&#34;Timestamp&#34;:&#34;{timestamp(MM/dd/yyyy HH:mm:ss.fff)}&#34;, 
  &#34;Message&#34;:&#34;{formatForJsonValue(message)}&#34;, 
  &#34;Category&#34;:&#34;{category}&#34;, &#34;Machine&#34;:&#34;{machine}&#34;, 
  &#34;Process Id&#34;:&#34;{processId}&#34; {dictionary(, &#34;{key}&#34;: &#34;{value}&#34;)} }"
  type="Infrastructure.Logger.Formatters.CustomTextFormatter, Infrastructure.Logger"
  name="Json Text Formatter" />

CustomTextFormatter inherits from Microsoft.Practices.EnterpriseLibrary.Logging.Formatters.TextFormatter and implements the formatForJsonValue method.
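A minimal sketch of such a formatter might look as follows. This assumes the base TextFormatter passes the unrecognized {formatForJsonValue(message)} token through unchanged, and the escaping rules shown are illustrative; adapt both to your log content:

```csharp
using System.Text;
using Microsoft.Practices.EnterpriseLibrary.Logging;
using Microsoft.Practices.EnterpriseLibrary.Logging.Formatters;

namespace Infrastructure.Logger.Formatters
{
    public class CustomTextFormatter : TextFormatter
    {
        public CustomTextFormatter(string template) : base(template) { }

        public override string Format(LogEntry log)
        {
            // Let the base formatter expand the standard tokens; the custom
            // {formatForJsonValue(message)} token is assumed to pass through.
            string formatted = base.Format(log);

            // Resolve the custom token with a JSON-safe version of the message.
            formatted = formatted.Replace(
                "{formatForJsonValue(message)}", FormatForJsonValue(log.Message));

            // Collapse the entry onto a single line so Filebeat
            // sees exactly one JSON document per line.
            return formatted.Replace("\r", "").Replace("\n", " ");
        }

        // Escapes characters that would break a JSON string value.
        private static string FormatForJsonValue(string value)
        {
            if (string.IsNullOrEmpty(value)) return string.Empty;
            var sb = new StringBuilder(value.Length);
            foreach (char c in value)
            {
                switch (c)
                {
                    case '"':  sb.Append("\\\""); break;
                    case '\\': sb.Append("\\\\"); break;
                    case '\r':
                    case '\n':
                    case '\t': sb.Append(' ');    break;
                    default:   sb.Append(c);      break;
                }
            }
            return sb.ToString();
        }
    }
}
```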

For more instructions about custom formatters, check out this link.

3. Configure Filebeat to Send the Logs to Elasticsearch

In filebeat.yml, add the path and the output as follows:

filebeat.prospectors:
- paths:
   - E:\temp\logs\*.log
  input_type: log
  json.keys_under_root: true
  json.add_error_key: true   

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  index: "bingo-logs-%{[beat.version]}-%{+yyyy.MM.dd}"
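Before starting Filebeat, the configuration can be sanity-checked from the command line (these test subcommands exist in Filebeat 6.0 and later):

```
# Check filebeat.yml for syntax and configuration errors
filebeat test config -c filebeat.yml

# Verify that the configured Elasticsearch output (localhost:9200) is reachable
filebeat test output -c filebeat.yml
```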

After running Filebeat, we will be able to see the logs sent to Elasticsearch.

4. View the Logs in Kibana

Go to Kibana → Management → Index Patterns → Create Index Pattern.

Then add an index pattern matching the index name in your filebeat.yml configuration (for example, bingo-logs-*).

Go to Discover → select your new index.

Create your own custom view, and save it for future use.

History

  • 25th February, 2019: Initial version

License

This article has no explicit license attached to it but may contain usage terms in the article text or the download files themselves. If in doubt please contact the author via the discussion board below.
