Tuesday, July 20, 2021

Liferay Tomcat Access Logs to Kafka

Tomcat access logs keep a record of all requests processed by the applications deployed on the Tomcat server. Every request is logged along with its response status, and many reports can be built on top of these logs.


By default, Tomcat writes the access logs to a file when the AccessLogValve is enabled in the server.xml file.



<Valve className="org.apache.catalina.valves.AccessLogValve" directory="logs"
       prefix="localhost_access_log" suffix=".txt"
       pattern="%h %l %u %t &quot;%r&quot; %s %b" />




Assume we want to maintain all logs in a centralized location, in this case Kafka. Application logs can be sent to Kafka using the log4j Kafka appender, but access logs are different: we will use the Kafka client API to send the Tomcat access logs to Kafka.


We will use the Tomcat Valve and access log API to implement a custom Valve containing the logic that sends access logs to Kafka. The implementation involves the following steps:

 

  • Create Kafka Topic
  • Create Custom Tomcat Access Logs Valve
  • Deploy Custom Tomcat Access Logs Valve
  • Configure Custom Access Logs Valve in server.xml
  • Validate Implementation

 




 

 

Prerequisites


Setup Zookeeper Cluster


http://www.liferaysavvy.com/2021/07/setup-zookeeper-cluster.html



Setup Kafka Cluster


http://www.liferaysavvy.com/2021/07/setup-kafka-cluster.html



Install Liferay Cluster


http://www.liferaysavvy.com/2021/07/centralized-logging-for-liferay-portal.html



 

Start Zookeeper Cluster

Start Kafka Cluster


 


 Create Kafka Topic


Open a command prompt and navigate to the bin/windows directory of one of the Kafka brokers. Create the topic with the following command.



 

kafka-topics.bat --create --zookeeper localhost:2181,localhost:2182,localhost:2183 --replication-factor 3 --partitions 3 --topic liferay-tomcat-access-logs

 

 

We should pass all ZooKeeper cluster nodes in the --zookeeper option.
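
If the Kafka brokers are version 2.2 or newer, the topic can also be created by pointing kafka-topics.bat at the brokers themselves with the --bootstrap-server option instead of --zookeeper; an equivalent command, assuming the broker ports used throughout this article:

kafka-topics.bat --create --bootstrap-server localhost:9092,localhost:9093,localhost:9094 --replication-factor 3 --partitions 3 --topic liferay-tomcat-access-logs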




 

List topics


Make sure the topic was created successfully.


 

kafka-topics.bat --zookeeper localhost:2181,localhost:2182,localhost:2183 --list
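
Optionally, describe the topic to confirm the partition count and replica assignments (same ZooKeeper connection string as above):

kafka-topics.bat --zookeeper localhost:2181,localhost:2182,localhost:2183 --describe --topic liferay-tomcat-access-logs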

 

 




Create Custom Tomcat Access Logs Valve


Creating a custom access log Valve is very simple: we just need to override the log(...) method of “AbstractAccessLogValve”. We will use the Kafka clients library to send the messages to Kafka.


KafkaAccessLogValve.java



package com.liferaysavvy.kafka.accesslog;

import com.liferaysavvy.kafka.accesslog.producer.KafkaMessageSender;
import org.apache.catalina.valves.AbstractAccessLogValve;
import org.apache.juli.logging.Log;
import org.apache.juli.logging.LogFactory;

import java.io.CharArrayWriter;

public class KafkaAccessLogValve extends AbstractAccessLogValve {

    private static final Log log = LogFactory.getLog(KafkaAccessLogValve.class);

    /**
     * Called by Tomcat with the formatted access log line. The message is
     * handed off to a separate thread so request processing is not blocked.
     */
    @Override
    public void log(CharArrayWriter message) {
        try {
            new Thread(() -> new KafkaMessageSender().sendMessage(message.toString())).start();
        } catch (Exception e) {
            log.error("Access logs are not being sent to Kafka", e);
        }
    }
}


 


KafkaMessageSender.java



package com.liferaysavvy.kafka.accesslog.producer;

import com.liferaysavvy.kafka.accesslog.config.KafkaConfig;
import com.liferaysavvy.kafka.accesslog.constants.KafkaConstants;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaMessageSender {

    /**
     * Sends a single access log line to the Kafka topic. A new producer is
     * created, flushed and closed for every message.
     */
    public void sendMessage(String message) {
        final Producer<String, String> kafkaProducer = KafkaConfig.getProducer();
        ProducerRecord<String, String> record = new ProducerRecord<>(KafkaConstants.TOPIC, message);
        kafkaProducer.send(record);
        kafkaProducer.flush();
        kafkaProducer.close();
    }
}
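
Note that the sender above creates, flushes and closes a new producer for every access log line, which is expensive under load. KafkaProducer is thread safe, so a single shared instance can serve all request threads. A minimal sketch of such a variant is shown below; the class name SharedKafkaMessageSender and the shutdown() hook are hypothetical and not part of the linked project.

package com.liferaysavvy.kafka.accesslog.producer;

import com.liferaysavvy.kafka.accesslog.config.KafkaConfig;
import com.liferaysavvy.kafka.accesslog.constants.KafkaConstants;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SharedKafkaMessageSender {

    // One producer for the lifetime of the JVM; KafkaProducer is thread safe.
    private static final Producer<String, String> PRODUCER = KafkaConfig.getProducer();

    public void sendMessage(String message) {
        // Asynchronous send; batching and delivery are handled by the producer's I/O thread.
        PRODUCER.send(new ProducerRecord<>(KafkaConstants.TOPIC, message));
    }

    public static void shutdown() {
        // Call once when Tomcat stops (for example from the valve's stopInternal()) to flush pending records.
        PRODUCER.close();
    }
}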


 


KafkaConstants.java



package com.liferaysavvy.kafka.accesslog.constants;

public final class KafkaConstants {

    private KafkaConstants() {}

    // Kafka topic that receives the Tomcat access log lines
    public static final String TOPIC = "liferay-tomcat-access-logs";

    // Kafka brokers (bootstrap servers)
    public static final String BOOTSTRAP_SERVERS = "localhost:9092,localhost:9093,localhost:9094";
}


 


 KafkaConfig.java



package com.liferaysavvy.kafka.accesslog.config;

import com.liferaysavvy.kafka.accesslog.constants.KafkaConstants;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public final class KafkaConfig {

    private KafkaConfig() {}

    /**
     * Builds a Kafka producer configured with the broker list and
     * String key/value serializers.
     */
    public static Producer<String, String> getProducer() {
        Properties properties = new Properties();
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, KafkaConstants.BOOTSTRAP_SERVERS);
        properties.put(ProducerConfig.CLIENT_ID_CONFIG, "TomcatKafkaAccessLog");
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        return new KafkaProducer<>(properties);
    }
}
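
The producer above relies on the default reliability and batching settings. If delivery behavior needs tuning, standard producer properties can be added inside getProducer() before the producer is constructed; the values below are illustrative only and not taken from the original project.

// Optional tuning (illustrative values):
properties.put(ProducerConfig.ACKS_CONFIG, "1");              // wait for the partition leader only
properties.put(ProducerConfig.RETRIES_CONFIG, 3);             // retry transient send failures
properties.put(ProducerConfig.LINGER_MS_CONFIG, 5);           // wait briefly so records can batch
properties.put(ProducerConfig.BATCH_SIZE_CONFIG, 32 * 1024);  // maximum batch size in bytes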


 


Deploy Custom Tomcat Access Logs Valve


Get the source code from the link below and build the Maven project; it generates a JAR artifact. Copy the generated JAR file to the tomcat/lib directory.


https://github.com/LiferaySavvy/tomcat-accesslog-kafka-producer


 

mvn clean install

 



Deploy the JAR file to every Tomcat node in the cluster.

 

Liferay-Node1




 

Liferay-Node2




 


Configure Custom Access Logs Valve in “server.xml”


Navigate to the Tomcat conf directory and update the server.xml file with the custom Valve configuration. Repeat this for every node in the Liferay cluster.


 

<Valve className="com.liferaysavvy.kafka.accesslog.KafkaAccessLogValve" pattern="%h %l %u %t &quot;%r&quot; %s %b" />
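
The Valve element belongs inside the <Host> element of server.xml, in the same place where the default AccessLogValve is normally configured. A sketch of the surrounding fragment is shown below; the Host attributes are standard Tomcat defaults and may differ in your bundle.

<Host name="localhost" appBase="webapps" unpackWARs="true" autoDeploy="true">

    <!-- Custom Valve that ships each access log line to Kafka -->
    <Valve className="com.liferaysavvy.kafka.accesslog.KafkaAccessLogValve"
           pattern="%h %l %u %t &quot;%r&quot; %s %b" />

</Host>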


 



 

Validate Implementation


 

Start Liferay Cluster


 

Start a Kafka consumer on the “liferay-tomcat-access-logs” topic


Open a command prompt, navigate to the Kafka bin/windows directory, and use the following command to start a console consumer.


 

kafka-console-consumer.bat --bootstrap-server localhost:9092,localhost:9093,localhost:9094 --topic liferay-tomcat-access-logs --from-beginning
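
Each consumed message is one access log line in the configured pattern (%h %l %u %t "%r" %s %b). An illustrative example of what a consumed line might look like:

127.0.0.1 - - [20/Jul/2021:10:15:30 +0530] "GET /web/guest/home HTTP/1.1" 200 12345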

 

 





We can see the Liferay Tomcat access logs arriving in the Kafka consumer.





Use Kibana to monitor and analyze the logs and to build dashboards for the access logs. The Kafka topic needs to be configured as a Logstash input so that the logs are available to Kibana.


Follow the article below to use Kibana for log monitoring.

 

http://www.liferaysavvy.com/2021/07/liferay-portal-logs-monitoring-with-elkk.html

 

