A personal page where I describe my work and life moments
In this article, I’m going to share my experience with the Oracle ATG logging system and such great tools as Elasticsearch and Kibana. Imagine that we have a big ATG-based project and our task is to store and list all the users who encountered errors (whether they noticed them or not). To solve this task we will add a new log event listener to filter errors, work around a problem with resolving the user’s profile from the listener, send an error report to an Elasticsearch server, and finally display the data with Kibana.
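To make the first step concrete, here is a minimal sketch of an error-filtering listener. The `LogEvent`, `ErrorLogEvent`, and `LogListener` types below are simplified stand-ins for the `atg.nucleus.logging.*` classes, written so the example compiles without the ATG jars; the filtering idea is the same.

```java
// Simplified stand-ins for atg.nucleus.logging.* (assumption: real ATG
// listeners implement a similar logEvent(LogEvent) callback).
class LogEvent {
    private final String message;
    LogEvent(String message) { this.message = message; }
    String getMessage() { return message; }
}

class ErrorLogEvent extends LogEvent {
    ErrorLogEvent(String message) { super(message); }
}

interface LogListener {
    void logEvent(LogEvent event);
}

// The listener keeps only error events; a later step would hand these
// messages to a reporter that sends them to Elasticsearch.
class ErrorFilteringListener implements LogListener {
    private final java.util.List<String> reported = new java.util.ArrayList<>();

    @Override
    public void logEvent(LogEvent event) {
        if (event instanceof ErrorLogEvent) {   // filter: errors only
            reported.add(event.getMessage());
        }
    }

    java.util.List<String> getReported() { return reported; }
}

class Demo {
    public static void main(String[] args) {
        ErrorFilteringListener listener = new ErrorFilteringListener();
        listener.logEvent(new LogEvent("info: cart updated"));
        listener.logEvent(new ErrorLogEvent("NullPointerException in checkout"));
        System.out.println(listener.getReported());
        // prints: [NullPointerException in checkout]
    }
}
```

In the real project the listener would be registered as a Nucleus component; here the `Demo` class just feeds it two events by hand.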
Although this article describes creating an ATG-specific logging feature, it should be easy to adapt this tutorial for use with Slf4j or any other logging framework.
Elasticsearch and Kibana are part of the so-called ELK stack. The letter “L” stands for Logstash, a great tool that crawls data from multiple sources, transforms it, and sends it to a target (e.g. a file or Elasticsearch). However, I decided not to use Logstash here, because Elasticsearch provides a good HTTP API, which is enough for this task.
So, let’s get started.
I’m currently studying Oracle ATG, a kinda big, monstrous eCommerce platform. Today I had to create a commerce pipeline processor, so I started googling and found this tutorial by Oracle: Creating processors. Well, that’s not bad, but it’s a bit sparse. For example, I’d like to know: what is the Object pParam? What is the PipelineResult pResult? And actually there’s a third (unanswered) question: why do all the parameters start with “p”?
Ok, I’ve found some answers to my questions. Let’s see.
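As a preview of where this is going, here is a minimal processor sketch. The `PipelineResult` class and `PipelineProcessor` interface below are simplified stand-ins for `atg.service.pipeline.*`, written so the example compiles without the ATG jars; the return-code constants and their values are assumptions for illustration.

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for atg.service.pipeline.PipelineResult: a bag of errors
// that processors can add to as the chain runs.
class PipelineResult {
    private final Map<String, Object> errors = new HashMap<>();
    void addError(String key, Object error) { errors.put(key, error); }
    boolean hasErrors() { return !errors.isEmpty(); }
}

// Stand-in for atg.service.pipeline.PipelineProcessor. pParam is the
// shared parameter object handed to every processor in the chain
// (often a Map, or an Order in commerce chains); pResult collects errors.
interface PipelineProcessor {
    int STOP_CHAIN_EXECUTION = 0;   // assumed value: halt the pipeline
    int SUCCESS = 1;                // assumed value: run the next processor

    int runProcess(Object pParam, PipelineResult pResult) throws Exception;
}

// A toy processor: checks that the parameter is the expected shape,
// records an error and stops the chain otherwise.
class ValidateCartProcessor implements PipelineProcessor {
    @Override
    public int runProcess(Object pParam, PipelineResult pResult) {
        if (!(pParam instanceof Map)) {
            pResult.addError("badParam", "expected a Map parameter");
            return STOP_CHAIN_EXECUTION;
        }
        return SUCCESS;
    }
}
```

The returned int is matched against the transitions configured for the processor in the pipeline definition, which is how one processor decides which (if any) processor runs next.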