
Tuesday, July 20, 2021

Liferay Tomcat Access Logs to Kafka

Tomcat access logs keep a record of every request processed by the applications deployed on the Tomcat server: each request is logged along with its response status. Many reports can be built on top of access logs.


By default, Tomcat writes all access logs to a file once the AccessLogValve is enabled in the server.xml file.



<Valve className="org.apache.catalina.valves.AccessLogValve" directory="logs" -->

               prefix="localhost_access_log" suffix=".txt"

                pattern="%h %l %u %t &quot;%r&quot; %s %b" />




Assume we want to maintain all logs in a centralized location, namely Kafka. Application logs can be sent to Kafka using the log4j Kafka appender, but access logs are handled differently: we will use the Kafka client API to send Tomcat access logs to Kafka.


We will use the Tomcat Valve and access log API to implement a custom valve that contains the logic to send access logs to Kafka.

 

  • Create Kafka Topic
  • Create Custom Tomcat Access Logs Valve
  • Deploy Custom Tomcat Access Logs Valve
  • Configure Custom Access Logs Valve in server.xml
  • Validate Implementation

 




 

 

Prerequisite


Setup Zookeeper Cluster


http://www.liferaysavvy.com/2021/07/setup-zookeeper-cluster.html



Setup Kafka Cluster


http://www.liferaysavvy.com/2021/07/setup-kafka-cluster.html



Install Liferay Cluster


http://www.liferaysavvy.com/2021/07/centralized-logging-for-liferay-portal.html



 

Start Zookeeper Cluster

Start Kafka Cluster


 


Create Kafka Topic


Open a command prompt and navigate to the bin\windows directory of one of the Kafka brokers. Use the following command to create the topic.



 

kafka-topics.bat --create --zookeeper localhost:2181,localhost:2182,localhost:2183 --replication-factor 3 --partitions 3 --topic liferay-tomcat-access-logs

 

 

Pass all Zookeeper cluster nodes in the --zookeeper option.




 

List topics


Make sure the topic was created successfully.


 

kafka-topics.bat --zookeeper localhost:2181,localhost:2182,localhost:2183 --list
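
Optionally, describe the topic to verify its partition and replica assignment (a standard kafka-topics option):

kafka-topics.bat --zookeeper localhost:2181,localhost:2182,localhost:2183 --describe --topic liferay-tomcat-access-logs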

 

 




Create Custom Tomcat Access Logs Valve


Creating a custom access log valve is very simple: we just need to override the log(CharArrayWriter) method from “AbstractAccessLogValve”. We will use the Kafka clients library to send messages to Kafka.


KafkaAccessLogValve.java



package com.liferaysavvy.kafka.accesslog;

import com.liferaysavvy.kafka.accesslog.producer.KafkaMessageSender;
import org.apache.catalina.valves.AbstractAccessLogValve;
import org.apache.juli.logging.Log;
import org.apache.juli.logging.LogFactory;

import java.io.CharArrayWriter;

public class KafkaAccessLogValve extends AbstractAccessLogValve {

    private static final Log log = LogFactory.getLog(KafkaAccessLogValve.class);

    @Override
    public void log(CharArrayWriter message) {
        try {
            // Send the formatted access log entry to Kafka on a separate
            // thread so the request-processing thread is not blocked.
            new Thread(() -> new KafkaMessageSender().sendMessage(message.toString())).start();
        } catch (Exception e) {
            log.error("Access logs are not being sent to Kafka", e);
        }
    }
}


 


KafkaMessageSender.java



package com.liferaysavvy.kafka.accesslog.producer;

import com.liferaysavvy.kafka.accesslog.config.KafkaConfig;
import com.liferaysavvy.kafka.accesslog.constants.KafkaConstants;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaMessageSender {

    public void sendMessage(String message) {
        // Create a producer, send the access log line to the topic,
        // then flush and close the producer.
        final Producer<String, String> kafkaProducer = KafkaConfig.getProducer();
        ProducerRecord<String, String> record = new ProducerRecord<>(KafkaConstants.TOPIC, message);
        kafkaProducer.send(record);
        kafkaProducer.flush();
        kafkaProducer.close();
    }
}
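
Note that this sample creates, flushes, and closes a new producer for every log line, and the valve spawns a thread per request. KafkaProducer is thread-safe, so a single shared producer could be reused instead. A minimal sketch, assuming the same KafkaConfig and KafkaConstants classes (the SharedKafkaProducer class name is ours, not part of the original project):

package com.liferaysavvy.kafka.accesslog.producer;

import com.liferaysavvy.kafka.accesslog.config.KafkaConfig;
import com.liferaysavvy.kafka.accesslog.constants.KafkaConstants;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public final class SharedKafkaProducer {

    // One producer for the lifetime of the JVM; KafkaProducer is thread-safe.
    private static final Producer<String, String> PRODUCER = KafkaConfig.getProducer();

    static {
        // Close the producer cleanly when Tomcat's JVM shuts down.
        Runtime.getRuntime().addShutdownHook(new Thread(PRODUCER::close));
    }

    private SharedKafkaProducer() {}

    public static void send(String message) {
        // send() is asynchronous, so no per-message thread is needed.
        PRODUCER.send(new ProducerRecord<>(KafkaConstants.TOPIC, message));
    }
}

With this in place, the valve's log(...) method could call SharedKafkaProducer.send(message.toString()) directly; since send() is asynchronous, the extra thread is no longer needed.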


 


KafkaConstants.java



package com.liferaysavvy.kafka.accesslog.constants;

public final class KafkaConstants {

    private KafkaConstants() {}

    // Kafka topic for Tomcat access logs
    public static final String TOPIC = "liferay-tomcat-access-logs";

    // Kafka brokers (no spaces in the list)
    public static final String BOOTSTRAP_SERVERS = "localhost:9092,localhost:9093,localhost:9094";
}


 


KafkaConfig.java



package com.liferaysavvy.kafka.accesslog.config;

import com.liferaysavvy.kafka.accesslog.constants.KafkaConstants;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public final class KafkaConfig {

    private KafkaConfig() {}

    public static Producer<String, String> getProducer() {
        // Configure the broker list, a client id, and string
        // serializers for both key and value.
        Properties properties = new Properties();
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, KafkaConstants.BOOTSTRAP_SERVERS);
        properties.put(ProducerConfig.CLIENT_ID_CONFIG, "TomcatKafkaAccessLog");
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        return new KafkaProducer<>(properties);
    }
}
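
For production use, reliability and batching settings could be added to the same Properties object inside getProducer(); for example (values illustrative, standard ProducerConfig keys):

properties.put(ProducerConfig.ACKS_CONFIG, "all");   // wait for all in-sync replicas to acknowledge
properties.put(ProducerConfig.RETRIES_CONFIG, 3);    // retry transient send failures
properties.put(ProducerConfig.LINGER_MS_CONFIG, 5);  // batch sends for a few milliseconds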


 


Deploy Custom Tomcat Access Logs Valve


Get the source code from the link below and build the Maven project. The build generates a JAR artifact; copy the generated JAR file to the tomcat/lib directory. If the project does not bundle its dependencies, the kafka-clients JAR must also be available on Tomcat's classpath.


https://github.com/LiferaySavvy/tomcat-accesslog-kafka-producer


 

mvn clean install
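
For example, on Windows (the artifact file name depends on the project's pom.xml; the name below is illustrative):

copy target\tomcat-accesslog-kafka-producer-1.0.0.jar %CATALINA_HOME%\lib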

 



Deploy the JAR file on every Tomcat node in the cluster.

 

Liferay-Node1




 

Liferay-Node2




 


Configure Custom Access Logs Valve in “server.xml”


Navigate to the Tomcat conf directory and update the server.xml file with the custom valve configuration. Repeat the same for every node in the Liferay cluster.


 

<Valve className="com.liferaysavvy.kafka.accesslog.KafkaAccessLogValve" pattern="%h %l %u %t &quot;%r&quot; %s %b" />


 



 

Validate Implementation


 

Start Liferay Cluster


 

Start Kafka Consumer on “liferay-tomcat-access-logs”


Open a command prompt and navigate to the Kafka bin\windows directory. Use the following command to start a console consumer.


 

kafka-console-consumer.bat --bootstrap-server localhost:9092,localhost:9093,localhost:9094 --topic liferay-tomcat-access-logs --from-beginning

 

 





We can now see the Liferay Tomcat access logs in the Kafka consumer.





Use Kibana to monitor and analyze the logs and to build dashboards for the access logs. The Kafka topic needs to be configured as a Logstash input so that the logs become available to Kibana.


Follow the below article to use Kibana for logs monitoring.

 

http://www.liferaysavvy.com/2021/07/liferay-portal-logs-monitoring-with-elkk.html

 



Sunday, July 18, 2021

Liferay Portal Logs Monitoring with ELKK

This article demonstrates Liferay Portal logs monitoring using the ELKK stack:


 

Elastic Search

Logstash

Kibana

Kafka

 

 

In a previous article we already implemented a Liferay centralized logging system using Kafka.


Now we will use Logstash to pull the Liferay logs from Kafka and push them to Elasticsearch.


Kibana will be used as the visualization and monitoring tool to analyze the Liferay Portal logs.


The following is the architecture diagram.

 



Software and Tools


 

Windows 10

Java 1.8 or higher

Apache24

Liferay 7.4

Zookeeper-3.7.0

Kafka-2.8.0

Logstash-7.13.3

Kibana-7.13.3

Elasticsearch-7.13.3

 

 

 

Prerequisite


Implement the Liferay centralized logging system from the article below, which covers the Zookeeper, Kafka, and Liferay installation.

 

http://www.liferaysavvy.com/2021/07/centralized-logging-for-liferay-portal.html

 



It's time to install and configure the ELK stack:

 

  • Install Elastic Search Cluster
  • Install Logstash and configure input/output pipeline and Start
  • Validate Index creation in Elastic Search
  • Install Kibana and Start
  • Create Index Pattern in Kibana and Analyze Liferay portal logs.

 


Install Elastic Search Cluster


Follow the below article to install Elastic Search Cluster


http://www.liferaysavvy.com/2021/07/install-elastic-search-cluster.html



Install Logstash and configure input/output pipeline and Start


Follow the below article to install Logstash.


http://www.liferaysavvy.com/2021/07/logstash-installation.html


The Logstash installation above uses a dummy input/output pipeline; now it's time to define the actual pipeline. This is a very important step.


Navigate to the Logstash config directory and create a “logstash.conf” file.







 

All logs are now in Kafka, so we will define a Logstash pipeline with Kafka as the input and Elasticsearch as the output.


Use the following configuration in the “logstash.conf” file:


 

input { 

    kafka {

        bootstrap_servers => "localhost:9092,localhost:9093,localhost:9094"

        topics => ["liferay-kafka-logs"]

    }

}

 

output { 

    elasticsearch {

        hosts => ["localhost:9200","localhost:9201","localhost:9202"]

        index => "liferay-index"

    }

}
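
Optionally, the kafka input also accepts a consumer group id (a standard option of the Logstash kafka input plugin; the group name below is illustrative):

input {
    kafka {
        bootstrap_servers => "localhost:9092,localhost:9093,localhost:9094"
        topics => ["liferay-kafka-logs"]
        group_id => "logstash-liferay"
    }
}

Multiple Logstash instances started with the same group_id would share the topic's partitions between them.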

 

 



 


The input must specify the Kafka bootstrap servers and topics. All logs are sent on the “liferay-kafka-logs” topic, the same topic used in the Liferay log4j configuration for the Kafka appender.


The output points to the Elasticsearch cluster instances that we already installed. We also need to provide an index name so that all logs are tagged with the given index.


Open a command prompt, navigate to the Logstash root directory, and use the following command to start Logstash.


 

bin\logstash.bat -f config\logstash.conf
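
Optionally, the pipeline file can be syntax-checked before starting (a standard Logstash flag):

bin\logstash.bat -f config\logstash.conf --config.test_and_exit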

 

 






Once Logstash has started successfully, all logs are collected from Kafka and pushed to Elasticsearch.



Install Kibana and Start


Follow the below article to install Kibana.

 

http://www.liferaysavvy.com/2021/07/kibana-installation.html


 

Validate Index creation in Elastic Search


Make sure the index given in the Logstash configuration (logstash.conf) is present in the Elasticsearch index list.


We can use any one of the Elasticsearch cluster nodes to confirm the cluster health and index details. Everything should be green in the output.


Use below URL

 

http://localhost:9200/_cat/indices?v
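
The response is a plain-text table; its header in Elasticsearch 7.x looks like the line below, and liferay-index should appear among the rows with green health:

health status index uuid pri rep docs.count docs.deleted store.size pri.store.size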

 

 

 



Make sure the whole stack is started, in the following order. If anything was missed, start/restart it in this order.

 

 

Start Zookeeper Cluster

Start Kafka Cluster

Start Liferay Portal Cluster

Start Elastic Cluster

Start Logstash

Start Kibana

 

 

The example screenshot shows all services started on the local machine.






ELKK Important Information

 


 

Zookeeper Cluster

 

localhost:2181

localhost:2182

localhost:2183

 

 

Kafka Cluster

 

localhost:9092

localhost:9093

localhost:9094

 

 

Liferay Portal Cluster

 

http://localhost/

 

 

Elastic Cluster

 

http://localhost:9200/

http://localhost:9201/

http://localhost:9202/

 

 

Logstash

 

 

http://localhost:9600/

 

 

Kibana

 

http://localhost:5601/

 

 

Kafka Topic

 

liferay-kafka-logs

 

 

Elastic Search Index

 

liferay-index

 

 


Define Index Pattern in Kibana


To monitor logs in Kibana we need to create an index pattern in Kibana.


Go to the Kibana home page, open the left-side toggle panel, click “Stack Management”, and add a Kibana index pattern.


 



 

 

Click on Kibana → Index Pattern → Create Index Pattern

 



 

Provide the index name that we configured in the Logstash file. You can provide the exact index name or use a wildcard pattern (liferay-*).

 



 

 

Select the time field and create the index pattern.




 

Go to Analytics → Discover. We can see the index in the list.

 



 

 

 


Select the newly created index and all the log data becomes visible on the page. Change the time frame to explore the log data.

 




