Introduction
Elasticsearch is one of the best open source search engines available today, and its strong capabilities as a NoSQL document database make it a great tool for application logging.
In this article, we will learn how to write our logs to a rolling file, ship them to Elasticsearch with Filebeat, and view them nicely in Kibana.
Prerequisites
- Logging with Microsoft Enterprise Library (we will use it for this example, but the same can be done with other libraries such as NLog)
- Elasticsearch cluster installed and ready
- Filebeat + Kibana installed (+optional: Elasticsearch 'head' Chrome extension)
Agenda
- Add new listener to entlib.config
- Create custom formatter to write JSON logs
- Configure Filebeat to send the logs to Elasticsearch
- View the logs in Kibana
1. Add New Listener to entlib.config
Create a new listener of type RollingFlatFileTraceListener and name it Json TraceListener.
Use the new formatter Json Text Formatter.
Make sure the header and footer are empty.
Example:
<add fileName="D:\Logs\JsonLogs\rolling.log" footer=""
     formatter="Json Text Formatter" header=""
     rollFileExistsBehavior="Increment" rollInterval="Day" rollSizeKB="50000"
     timeStampPattern="yyyy-MM-dd"
     listenerDataType="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.RollingFlatFileTraceListenerData, Microsoft.Practices.EnterpriseLibrary.Logging, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
     traceOutputOptions="None" filter="All"
     type="Microsoft.Practices.EnterpriseLibrary.Logging.TraceListeners.RollingFlatFileTraceListener, Microsoft.Practices.EnterpriseLibrary.Logging, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
     name="Json TraceListener" />
2. Create Custom Formatter to Write JSON Logs
Filebeat ships the logs in JSON format by default, treating each line (separated by the newline character \n) as a separate event.
Because of that, we want each log entry written on its own single line, with the white-space characters removed when writing each row.
We will do this by creating a new formatter, which will include a new method for cleaning white-space characters:
<add template='{"Timestamp":"{timestamp(MM/dd/yyyy HH:mm:ss.fff)}",
               "Message":"{formatForJsonValue(message)}",
               "Category":"{category}", "Machine":"{machine}",
               "Process Id":"{processId}" {dictionary(, "{key}": "{value}")} }'
     type="Infrastructure.Logger.Formatters.CustomTextFormatter, Infrastructure.Logger"
     name="Json Text Formatter" />
CustomTextFormatter inherits from Microsoft.Practices.EnterpriseLibrary.Logging.Formatters.TextFormatter and implements the formatForJsonValue method.
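Below is a minimal sketch of what such a formatter might look like, assuming a token-replacement approach; the exact escaping and white-space rules shown here are illustrative, not the original class:

using System.Text.RegularExpressions;
using Microsoft.Practices.EnterpriseLibrary.Logging;
using Microsoft.Practices.EnterpriseLibrary.Logging.Formatters;

namespace Infrastructure.Logger.Formatters
{
    public class CustomTextFormatter : TextFormatter
    {
        public CustomTextFormatter(string template) : base(template) { }

        public override string Format(LogEntry log)
        {
            // Resolve our custom token before the built-in ones
            // ({timestamp}, {category}, {dictionary(...)}, ...) are handled.
            string template = Template.Replace(
                "{formatForJsonValue(message)}", FormatForJsonValue(log.Message));

            string formatted = new TextFormatter(template).Format(log);

            // Keep every entry on a single line so Filebeat treats it as one event.
            return RemoveWhiteSpaces(formatted);
        }

        // Escapes characters that would break the JSON value.
        private static string FormatForJsonValue(string value)
        {
            if (string.IsNullOrEmpty(value)) return string.Empty;

            return value.Replace("\\", "\\\\")
                        .Replace("\"", "\\\"")
                        .Replace("\r", " ")
                        .Replace("\n", " ")
                        .Replace("\t", " ");
        }

        // Collapses new lines and runs of white space into single spaces.
        private static string RemoveWhiteSpaces(string value)
        {
            return Regex.Replace(value, @"\s+", " ").Trim();
        }
    }
}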
For more instructions about custom formatters, check out this link.
3. Configure Filebeat to Send the Logs to Elasticsearch
In filebeat.yml, add the path and the output as follows:
filebeat.prospectors:
- paths:
    - E:\temp\logs\*.log
  input_type: log
  json.keys_under_root: true
  json.add_error_key: true

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  index: "bingo-logs-%{[beat.version]}-%{+yyyy.MM.dd}"
After running Filebeat, we will be able to see the logs being sent to Elasticsearch.
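If you do not have the head extension installed, one quick way to verify that the index was created and is receiving documents is the _cat/indices API, for example (assuming a local cluster on the default port):

curl "http://localhost:9200/_cat/indices/bingo-logs-*?v"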
4. View the Logs in Kibana
Go to Kibana → Management → Index Patterns → Create Index Pattern
Add an index pattern matching the index name from your filebeat.yml configuration (bingo-logs-* in this example).
Go to Discover → select your new index.
Create your own custom view, and save it for future use.
History
- 25th February, 2019: Initial version