Installing Elasticsearch as a service is an easy task:
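On Elasticsearch 5.x for Windows, the distribution ships with a service wrapper script; a minimal sketch, assuming Elasticsearch is unpacked to `C:\elasticsearch` (the path is an assumption for illustration):

```shell
# Register Elasticsearch as a Windows service using the bundled wrapper
# (C:\elasticsearch is a hypothetical install path):
C:\elasticsearch\bin\elasticsearch-service.bat install
```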
This creates a new service, which should be started manually.
However, I faced a problem when installing Kibana 5.5.0 as a Windows service: the first solution I found didn’t work.
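One commonly used approach, sketched here as an assumption rather than as this article’s final answer, is to wrap `kibana.bat` with NSSM (the Non-Sucking Service Manager); the install path below is hypothetical:

```shell
# Register kibana.bat as a Windows service via NSSM
# (C:\kibana is a hypothetical install path):
nssm install kibana "C:\kibana\bin\kibana.bat"
nssm start kibana
```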
In this article, I’m going to share my experience with the Oracle ATG logging system and such great tools as Elasticsearch and Kibana. Imagine that we have a big ATG-based project and our task is to store and list all the users who encountered errors (explicitly or not). To solve this task, we will add a new event listener that filters errors, work around a problem with resolving the user’s profile from the listener, send an error report to an Elasticsearch server, and finally display the data with Kibana.
Although this article describes building an ATG-specific logging feature, it will be quite easy to adapt this tutorial for use with SLF4J or whatever logging framework you prefer.
Elasticsearch and Kibana are part of the so-called ELK stack. The letter “L” stands for Logstash, a great tool that collects data from multiple sources, transforms it, and sends it to a target (e.g. a file or Elasticsearch). However, I decided not to use Logstash here, because Elasticsearch provides a good REST API, which is enough for this task.
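To illustrate why the plain REST API suffices, here is a minimal sketch of building and indexing an error report document; the index name, type, and field names are assumptions for illustration, not taken from the article:

```python
import datetime
import json

# A minimal sketch of the kind of error report the article sends to
# Elasticsearch; "profileId", "message" and "timestamp" are assumed
# field names, not the article's actual schema.
def build_error_report(profile_id, message):
    """Build the JSON document describing one logged error."""
    return {
        "profileId": profile_id,
        "message": message,
        "timestamp": datetime.datetime.utcnow().isoformat(),
    }

doc = build_error_report("user42", "Order pipeline failed")
print(json.dumps(doc))

# Indexing via the plain REST API (Elasticsearch 5.x URL scheme is
# http://host:9200/<index>/<type>) instead of going through Logstash:
#
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:9200/atg-errors/error",
#       data=json.dumps(doc).encode("utf-8"),
#       headers={"Content-Type": "application/json"},
#       method="POST")
#   urllib.request.urlopen(req)
```

Each error becomes one JSON document, so Kibana can later aggregate and filter them by any field.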
So, let’s get started.