
Sunday, July 18, 2021

Liferay Portal Logs Monitoring with ELKK

This article demonstrates monitoring Liferay Portal logs with the ELKK stack:


 

Elastic Search

Logstash

Kibana

Kafka

 

 

In a previous article we already implemented a Liferay centralized logging system using Kafka.


Now we will use Logstash to pull the Liferay logs from Kafka and push them to Elasticsearch.


Kibana will be used as the visualization and monitoring tool to analyze the Liferay Portal logs.


The following is the architecture diagram.

 



Software and Tools


 

Windows 10

Java 1.8 or higher

Apache24

Liferay 7.4

Zookeeper-3.7.0

Kafka-2.8.0

Logstash-7.13.3

Kibana-7.13.3

Elasticsearch-7.13.3

 

 

 

Prerequisite


Implement the Liferay centralized logging system from the article below, which covers the Zookeeper, Kafka, and Liferay installation.

 

http://www.liferaysavvy.com/2021/07/centralized-logging-for-liferay-portal.html

 



It's time to install and configure the ELK stack.

 

  • Install Elastic Search Cluster
  • Install Logstash and configure input/output pipeline and Start
  • Validate Index creation in Elastic Search
  • Install Kibana and Start
  • Create Index Pattern in Kibana and Analyze Liferay portal logs.

 


Install Elastic Search Cluster


Follow the article below to install the Elasticsearch cluster.


http://www.liferaysavvy.com/2021/07/install-elastic-search-cluster.html



Install Logstash and configure input/output pipeline and Start


Follow the article below to install Logstash.


http://www.liferaysavvy.com/2021/07/logstash-installation.html


The Logstash installation above runs with a dummy input/output, so now it's time to define the actual Logstash pipeline. This is a very important step.


Go to the Logstash config directory and create a "logstash.conf" file.







 

We have all the logs in Kafka, so we will define a Logstash pipeline with Kafka as the input and Elasticsearch as the output.


Use the following configuration in the "logstash.conf" file.


 

input {
    # Read Liferay log events from the Kafka cluster brokers on the "liferay-kafka-logs" topic
    kafka {
        bootstrap_servers => "localhost:9092,localhost:9093,localhost:9094"
        topics => ["liferay-kafka-logs"]
    }
}

output {
    # Push events to the Elasticsearch cluster nodes and tag them with the "liferay-index" index
    elasticsearch {
        hosts => ["localhost:9200","localhost:9201","localhost:9202"]
        index => "liferay-index"
    }
}

 

 



 


The input needs the Kafka bootstrap servers and Kafka topics. All logs are sent on the "liferay-kafka-logs" topic, the same topic that was used in the Liferay Log4j configuration for the Kafka appender.


The output is the Elasticsearch cluster instances that we already installed. We also need to provide an index name so that all logs are tagged with the given index.


Open a command prompt, go to the Logstash root directory, and use the following command to start Logstash.


 

bin\logstash.bat -f config\logstash.conf

 

 






Once Logstash has started successfully, all logs are collected from Kafka and pushed to Elasticsearch.
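To confirm that the pipeline is actually running, we can query the Logstash monitoring API on port 9600. This is a minimal check, assuming curl is available on the machine (it ships with recent Windows 10 builds):

curl "http://localhost:9600/_node/stats/pipelines?pretty"

The response should list the pipeline defined in logstash.conf along with its event counters.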



Install Kibana and Start


Follow the article below to install Kibana.

 

http://www.liferaysavvy.com/2021/07/kibana-installation.html


 

Validate Index creation in Elastic Search


Make sure the index given in the Logstash configuration (logstash.conf) is present in the Elasticsearch index list.


We can use any one of the Elasticsearch cluster nodes to confirm the cluster health and index details. Everything should be green in the output.


Use the URL below.

 

http://localhost:9200/_cat/indices?v
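The same check can be done from the command prompt. A small sketch, assuming curl is available; it filters the index list down to the Logstash index and also prints the cluster health:

curl "http://localhost:9200/_cat/indices/liferay-index?v"

curl "http://localhost:9200/_cat/health?v"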

 

 

 



Make sure the whole stack is started in the following order. If anything was missed, start/restart it in this order (a batch script sketch of the order follows the list).

 

 

Start Zookeeper Cluster

Start Kafka Cluster

Start Liferay Portal Cluster

Start Elastic Cluster

Start Logstash
Start Kibana
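The following is a minimal Windows batch sketch of that start-up order. All install paths are placeholders (assumptions), single nodes are shown for brevity, and each service needs a few moments to come up before the next one depends on it:

REM Start-up order sketch -- adjust the placeholder paths to your environment.
REM 1. Zookeeper cluster (repeat per node configuration)
start "zookeeper" cmd /k "C:\elkk\zookeeper-3.7.0\bin\zkServer.cmd"
REM 2. Kafka cluster (repeat per broker server.properties)
start "kafka" cmd /k "C:\elkk\kafka-2.8.0\bin\windows\kafka-server-start.bat C:\elkk\kafka-2.8.0\config\server.properties"
REM 3. Liferay Portal (Tomcat bundle)
start "liferay" cmd /k "C:\elkk\liferay-7.4\tomcat\bin\catalina.bat run"
REM 4. Elasticsearch cluster (repeat for node2/node3 with their own data/log paths)
start "elastic-node1" cmd /k "C:\elkk\elasticsearch-7.13.3\bin\elasticsearch.bat -Ecluster.name=elastic-search-cluster -Enode.name=node1 -Epath.data=data1 -Epath.logs=log1"
REM 5. Logstash with the pipeline defined earlier
start "logstash" cmd /k "C:\elkk\logstash-7.13.3\bin\logstash.bat -f C:\elkk\logstash-7.13.3\config\logstash.conf"
REM 6. Kibana
start "kibana" cmd /k "C:\elkk\kibana-7.13.3\bin\kibana.bat"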

 

 

The example screenshot shows all services started on the local machine.






ELKK Important Information

 


 

Zookeeper Cluster

 

localhost:2181

localhost:2182

localhost:2183

 

 

Kafka Cluster

 

localhost:9092

localhost:9093

localhost:9094

 

 

Liferay Portal Cluster

 

http://localhost/

 

 

Elastic Cluster

 

http://localhost:9200/

http://localhost:9201/

http://localhost:9202/

 

 

Logstash

 

 

http://localhost:9600/

 

 

Kibana

 

http://localhost:5601/

 

 

Kafka Topic

 

liferay-kafka-logs

 

 

Elastic Search Index

 

liferay-index

 

 


Define Index Pattern in Kibana


To monitor logs in Kibana we need to create an index pattern in Kibana.


Go to the Kibana home page, open the left-side toggle panel, click on "Stack Management", and add a Kibana index pattern.


 



 

 

Click on Kibana → Index Pattern → Create Index Pattern

 



 

Provide the index name that we configured in the Logstash file. You can provide the exact index name or use a wildcard pattern (liferay-*).

 



 

 

Select the time field and create the index pattern.
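If you prefer to script this step, the index pattern can also be created through Kibana's saved objects API. This is a sketch, assuming curl is available and using a hypothetical object id "liferay-index-pattern"; @timestamp is the time field that Logstash adds to every event:

curl -X POST "http://localhost:5601/api/saved_objects/index-pattern/liferay-index-pattern" -H "kbn-xsrf: true" -H "Content-Type: application/json" -d "{\"attributes\":{\"title\":\"liferay-*\",\"timeFieldName\":\"@timestamp\"}}"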




 

Go to Analytics → Discover. We can see the index in the list.

 



 

 

 


Select the newly created index pattern and all the log data becomes visible on the page. Change the time frame to explore the log data.

 






Monday, July 12, 2021

Install Elastic Search Cluster

Elasticsearch is an open-source, distributed search and analytics engine based on the Lucene search engine. It is a completely RESTful implementation and easy to use.


Elasticsearch is the core of the Elastic Stack, which includes many other products.


This example demonstrates a 3-node Elasticsearch cluster.

 



 

Software and Tools



 

Windows 10

Java 1.8 or higher

Elasticsearch-7.13.3

 

 

Download and Extract



Go to the Elasticsearch download page and use the links below to download "elasticsearch-7.13.3" to your local machine.


https://www.elastic.co/downloads/elasticsearch

 

Direct download link


https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-7.13.3-windows-x86_64.zip


Extract the downloaded Elasticsearch zip file to a local drive.




 

Elastic Search Cluster



Elastic Search Node1



Open a command prompt, go to the Elasticsearch bin directory, and use the start command below to start Elasticsearch Node1 in the cluster.

 

We need to provide the cluster name, node name, and data and logs paths as parameters.

 

 

elasticsearch.bat -Ecluster.name=elastic-search-cluster -Enode.name=node1  -Epath.data=data1 -Epath.logs=log1

 


 



 




We can see logs in the console that confirm the Elasticsearch startup and its port numbers.

 




Since we just started the first node, it is elected as the master node. It uses port 9300 for discovery in the cluster, and REST services can be accessed on port 9200.



Elastic Search Node2



Open a second command prompt, go to the Elasticsearch bin directory, and use the start command below.

 

 

elasticsearch.bat -Ecluster.name=elastic-search-cluster -Enode.name=node2  -Epath.data=data2 -Epath.logs=log2

 


 



We can see logs in the console that confirm the Elasticsearch startup and its port numbers.

 




Node2 uses port 9301 as the discovery port in the cluster. The master node will identify Node2 and it will join the cluster. REST services can be accessed on port 9201.

 

 

In the Node1 (master node) console we can see the information that Node2 joined the cluster.

 



 

Elastic Search Node3



Open a third command prompt and use the start command below to start Elasticsearch Node3.

 

 

elasticsearch.bat -Ecluster.name=elastic-search-cluster -Enode.name=node3  -Epath.data=data3 -Epath.logs=log3

 

 



The startup logs confirm the Node3 startup.




Node3 uses port 9302 as the discovery port in the cluster. The master node will identify Node3 and it will join the cluster. REST services can be accessed on port 9202.


In the Node1 (master node) console we can see the information that Node3 joined the cluster.




 

Cluster Information



Nodes          | Discovery Port | Rest Access
Node1 (Master) | 127.0.0.1:9300 | 127.0.0.1:9200
Node2          | 127.0.0.1:9301 | 127.0.0.1:9201
Node3          | 127.0.0.1:9302 | 127.0.0.1:9202
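The node and master assignments above can also be verified from the command line. A quick check, assuming curl is available (the elected master is marked with * in the output):

curl "http://localhost:9200/_cat/nodes?v"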

 


Node1 Rest Access


 

 

http://localhost:9200/

 

 

Access the URL above and it will return JSON data containing the node details.

 



Node2 Rest Access




http://localhost:9201/


 

Access the URL above and it will return JSON data containing the node details.





Node3 Rest Access




http://localhost:9202/




Access the URL above and it will return JSON data containing the node details.

 




Check Cluster Health


We can use any one of the URLs below to check the Elasticsearch cluster health.


 

http://localhost:9200/_cat/health?v

 

http://localhost:9201/_cat/health?v

 

http://localhost:9202/_cat/health?v

 

 



Now we have successfully completed the Elasticsearch cluster setup. We can see the node-specific data and logs directories in the root directory of Elasticsearch.

 



 

Note


We can use the REST access of any node to perform Elasticsearch operations such as creating documents and searching. Follow the Elasticsearch quick start guide to learn more about these operations.
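As a small illustration, assuming curl is available, the commands below index a document into a hypothetical "test-index" and then search for it; any of the three REST endpoints can be used:

curl -X POST "http://localhost:9200/test-index/_doc?pretty" -H "Content-Type: application/json" -d "{\"message\":\"hello from the elastic cluster\"}"

curl "http://localhost:9200/test-index/_search?q=message:hello&pretty"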

 

References

 

https://www.elastic.co/guide/en/elasticsearch/reference/current/getting-started.html

 

