This article demonstrates monitoring of Liferay Portal logs using the ELKK stack (Elasticsearch, Logstash, Kibana, Kafka).
In the previous article we implemented a centralized logging system for Liferay using Kafka. Now we will use Logstash to pull the Liferay logs from Kafka and push them to Elasticsearch. Kibana will be used as the visualization and monitoring tool to analyze the Liferay Portal logs.
The following is the architecture diagram.
Software and Tools
- Windows 10
- Java 1.8 or higher
- Apache HTTP Server 2.4 (Apache24)
- Liferay 7.4
- Zookeeper 3.7.0
- Kafka 2.8.0
- Logstash 7.13.3
- Kibana 7.13.3
- Elasticsearch 7.13.3
Prerequisite
Implement the Liferay centralized logging system from the article below, which covers the Zookeeper, Kafka, and Liferay installation.
http://www.liferaysavvy.com/2021/07/centralized-logging-for-liferay-portal.html
Now it's time to install and configure the ELK stack:
- Install Elasticsearch Cluster
- Install Logstash, Configure the Input/Output Pipeline, and Start
- Validate Index Creation in Elasticsearch
- Install Kibana and Start
- Define Index Pattern in Kibana and Analyze Liferay Portal Logs
Install Elasticsearch Cluster
Follow the article below to install the Elasticsearch cluster.
http://www.liferaysavvy.com/2021/07/install-elastic-search-cluster.html
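Once the cluster is up, it is worth quickly confirming that all nodes have joined it before moving on. A minimal check, assuming the node addresses used later in this article (localhost:9200/9201/9202) and that curl is available from the command prompt:

curl "http://localhost:9200/_cat/nodes?v"
curl "http://localhost:9200/_cluster/health?pretty"

All three nodes should appear in the node list and the cluster status should be green.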
Install Logstash, Configure the Input/Output Pipeline, and Start
Follow the article below to install Logstash.
http://www.liferaysavvy.com/2021/07/logstash-installation.html
The Logstash installation above uses a dummy input/output pipeline; now it is time to define the actual Logstash pipeline. This is a very important step.
Go to the Logstash config directory and create a "logstash.conf" file.
We already have all the logs in Kafka, so we will define the Logstash pipeline with Kafka as the input and Elasticsearch as the output.
Use the following configuration in the "logstash.conf" file:
input {
  kafka {
    bootstrap_servers => "localhost:9092,localhost:9093,localhost:9094"
    topics => ["liferay-kafka-logs"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200","localhost:9201","localhost:9202"]
    index => "liferay-index"
  }
}
The input must point to the Kafka bootstrap servers and topics. All logs are sent to the "liferay-kafka-logs" topic, the same topic that was used in the Liferay Log4j configuration for the Kafka appender.
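If you want to double-check that the topic really exists on the brokers before wiring Logstash to it, you can describe it with the Kafka command-line tools. A quick check, assuming Kafka is installed in its default Windows layout and the command is run from the Kafka root directory:

bin\windows\kafka-topics.bat --bootstrap-server localhost:9092 --describe --topic liferay-kafka-logs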
The output points to the Elasticsearch cluster instances that we already installed. We also need to provide an index name so that all logs are stored under that index.
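Before starting the pipeline, it can help to let Logstash validate the configuration syntax. A quick sanity check, run from the Logstash root directory:

bin\logstash.bat -f config\logstash.conf --config.test_and_exit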
Open a command prompt, go to the Logstash root directory, and use the following command to start Logstash:
bin\logstash.bat -f config\logstash.conf
Once Logstash has started successfully, all logs are collected from Kafka and pushed to Elasticsearch.
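To confirm that events are actually flowing through the pipeline, you can query the Logstash monitoring API, which listens on port 9600 by default; the event counters should keep increasing while Liferay is writing logs:

curl "http://localhost:9600/_node/stats/pipelines?pretty"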
Install Kibana and Start
Follow the article below to install Kibana.
http://www.liferaysavvy.com/2021/07/kibana-installation.html
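After Kibana starts, a quick way to confirm it is up is to hit its status endpoint. This assumes the default Kibana port 5601 (the port is not changed in the linked article):

curl "http://localhost:5601/api/status"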
Validate Index Creation in Elasticsearch
Make sure the index given in the Logstash configuration (logstash.conf) is present in the Elasticsearch index list.
We can use any one of the Elasticsearch cluster nodes to confirm the Elasticsearch health and index details. Everything should be green in the output.
Use the URL below:
http://localhost:9200/_cat/indices?v
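If "liferay-index" appears in that list, you can also confirm that documents are actually being written to it, for example (again assuming the node on port 9200):

curl "http://localhost:9200/_cat/indices/liferay-*?v"
curl "http://localhost:9200/liferay-index/_count?pretty"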
Make sure the whole stack is started, in the following order. If anything is missed, start/restart it in this order (example start commands are shown after the list):
- Start Zookeeper Cluster
- Start Kafka Cluster
- Start Liferay Portal Cluster
- Start Elastic Cluster
- Start Logstash
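For reference, these are the kinds of commands used to start each component on Windows. The exact paths are assumptions based on the default install layouts; each command is run from the respective product's root directory, and the clustered components are repeated per node:

rem Zookeeper node
bin\zkServer.cmd
rem Kafka broker (use each broker's own server.properties)
bin\windows\kafka-server-start.bat config\server.properties
rem Liferay Portal node (the Tomcat folder name depends on the bundle version)
tomcat\bin\startup.bat
rem Elasticsearch node
bin\elasticsearch.bat
rem Logstash
bin\logstash.bat -f config\logstash.conf
rem Kibana
bin\kibana.bat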
The example screenshot shows all services started on the local machine.
ELKK Important Information
Zookeeper Cluster: localhost:2181, localhost:2182, localhost:2183
Kafka Cluster: localhost:9092, localhost:9093, localhost:9094
Liferay Portal Cluster:
Elastic Cluster: localhost:9200, localhost:9201, localhost:9202
Logstash:
Kibana:
Kafka Topic: liferay-kafka-logs
Elasticsearch Index: liferay-index
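If logs are not showing up in Elasticsearch, a useful way to narrow the problem down is to consume the topic directly and confirm that the Liferay log events are reaching Kafka at all. A quick check, run from the Kafka root directory on Windows:

bin\windows\kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic liferay-kafka-logs --from-beginning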
Define Index Pattern in Kibana
To monitor logs in Kibana we need to create an index pattern in Kibana.
Go to the Kibana home page, open the left-side toggle panel, click on "Stack Management", and add a Kibana index pattern.
Click on Kibana → Index Patterns → Create index pattern.
Provide the index name that was configured in the Logstash file. You can provide the exact index name or use a wildcard pattern (liferay-*).
Select the time field and create the index pattern.
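If you prefer to script this step instead of clicking through the UI, the index pattern can also be created through Kibana's saved objects API. A minimal sketch, assuming Kibana on localhost:5601 and that the events carry the @timestamp field that Logstash adds by default:

curl -X POST "http://localhost:5601/api/saved_objects/index-pattern" -H "kbn-xsrf: true" -H "Content-Type: application/json" -d "{\"attributes\":{\"title\":\"liferay-*\",\"timeFieldName\":\"@timestamp\"}}"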
Go to Analytics → Discover.
We can see the index pattern in the list.
Select the newly created index pattern and all the log data becomes visible on the page. Change the time frame to play with the log data.