
Tuesday, July 20, 2021

Liferay Tomcat Access Logs to Kafka

Tomcat access logs keep a record of all requests processed by the applications deployed on the Tomcat server. Every request is logged along with its response status, and many reports can be built on top of these logs.


By default, Tomcat writes all access logs to a file once the AccessLogValve is enabled in the server.xml file.



<Valve className="org.apache.catalina.valves.AccessLogValve" directory="logs" -->

               prefix="localhost_access_log" suffix=".txt"

                pattern="%h %l %u %t &quot;%r&quot; %s %b" />




Now assume we want to maintain all logs in a centralized location, namely Kafka. Application logs can be sent to Kafka using the log4j Kafka appender, but access logs are different: we will use the Kafka client API to send the Tomcat access logs to Kafka.


We will use the Tomcat Valve and access log API to implement a custom valve containing the logic that sends access logs to Kafka. The steps are:

 

  • Create Kafka Topic
  • Create Custom Tomcat Access Logs Valve
  • Deploy Custom Tomcat Access Logs Valve
  • Configure Custom Access Logs Valve in server.xml
  • Validate Implementation

 




 

 

Prerequisites


Setup Zookeeper Cluster


http://www.liferaysavvy.com/2021/07/setup-zookeeper-cluster.html



Setup Kafka Cluster


http://www.liferaysavvy.com/2021/07/setup-kafka-cluster.html



Install Liferay Cluster


http://www.liferaysavvy.com/2021/07/centralized-logging-for-liferay-portal.html



 

Start Zookeeper Cluster

Start Kafka Cluster


 


 Create Kafka Topic


Open a command prompt and navigate to the bin\windows directory of one of the Kafka brokers, then run the following command to create the topic.



 

kafka-topics.bat --create --zookeeper localhost:2181,localhost:2182,localhost:2183 --replication-factor 3 --partitions 3 --topic liferay-tomcat-access-logs

 

 

All ZooKeeper cluster nodes should be passed in the --zookeeper option.
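
Note: if the Kafka version in use is 2.2 or newer, kafka-topics also accepts a --bootstrap-server option that points directly at the brokers, and the --zookeeper option was removed entirely in Kafka 3.x. A sketch using the broker ports from this setup:

kafka-topics.bat --create --bootstrap-server localhost:9092,localhost:9093,localhost:9094 --replication-factor 3 --partitions 3 --topic liferay-tomcat-access-logs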




 

List topics


Make sure the topic was created successfully by listing the topics.


 

kafka-topics.bat --zookeeper localhost:2181,localhost:2182,localhost:2183 --list

 

 




Create Custom Tomcat Access Logs Valve


Creating a custom access log valve is simple: we just need to override the log(CharArrayWriter) method from “AbstractAccessLogValve”. We will use the Kafka clients library to send each message to Kafka.


KafkaAccessLogValve.java



package com.liferaysavvy.kafka.accesslog;

import java.io.CharArrayWriter;

import org.apache.catalina.valves.AbstractAccessLogValve;
import org.apache.juli.logging.Log;
import org.apache.juli.logging.LogFactory;

import com.liferaysavvy.kafka.accesslog.producer.KafkaMessageSender;

public class KafkaAccessLogValve extends AbstractAccessLogValve {

    private static final Log log = LogFactory.getLog(KafkaAccessLogValve.class);

    @Override
    public void log(CharArrayWriter message) {
        try {
            // Send the formatted access log line to Kafka on a separate thread
            // so the request thread is not blocked by the producer call.
            new Thread(() -> new KafkaMessageSender().sendMessage(message.toString())).start();
        } catch (Exception e) {
            log.error("Access logs are not being sent to Kafka", e);
        }
    }
}
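
A note on the design: starting a brand-new Thread for every request creates an unbounded number of short-lived threads under load. A minimal sketch of an alternative, reusing the KafkaMessageSender from above, hands the work to a small shared executor instead (the executor field and its size are assumptions, not part of the published source):

package com.liferaysavvy.kafka.accesslog;

import java.io.CharArrayWriter;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.apache.catalina.valves.AbstractAccessLogValve;

import com.liferaysavvy.kafka.accesslog.producer.KafkaMessageSender;

public class KafkaAccessLogValve extends AbstractAccessLogValve {

    // Single background worker: request threads never block on Kafka and
    // thread creation stays bounded (pool size is an assumption, tune as needed).
    private static final ExecutorService EXECUTOR = Executors.newSingleThreadExecutor();

    @Override
    public void log(CharArrayWriter message) {
        // Convert to String on the request thread before handing the work off.
        final String line = message.toString();
        EXECUTOR.submit(() -> new KafkaMessageSender().sendMessage(line));
    }
}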


 


KafkaMessageSender.java



package com.liferaysavvy.kafka.accesslog.producer;

import com.liferaysavvy.kafka.accesslog.config.KafkaConfig;
import com.liferaysavvy.kafka.accesslog.constants.KafkaConstants;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaMessageSender {

    public void sendMessage(String message) {
        final Producer<String, String> kafkaProducer = KafkaConfig.getProducer();
        ProducerRecord<String, String> record = new ProducerRecord<>(KafkaConstants.TOPIC, message);
        kafkaProducer.send(record);
        kafkaProducer.flush();
        kafkaProducer.close();
    }
}
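
A note on the design: sendMessage() builds, flushes, and closes a brand-new producer for every access log line, which is expensive because each call opens fresh broker connections. A sketch of a variant that keeps one producer for the lifetime of the JVM (the static holder and shutdown hook are assumptions, not part of the published source; KafkaProducer instances are thread-safe, so sharing one is fine):

package com.liferaysavvy.kafka.accesslog.producer;

import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

import com.liferaysavvy.kafka.accesslog.config.KafkaConfig;
import com.liferaysavvy.kafka.accesslog.constants.KafkaConstants;

public class KafkaMessageSender {

    // One producer shared by all requests.
    private static final Producer<String, String> PRODUCER = KafkaConfig.getProducer();

    static {
        // Close the producer when Tomcat (the JVM) shuts down.
        Runtime.getRuntime().addShutdownHook(new Thread(PRODUCER::close));
    }

    public void sendMessage(String message) {
        // send() is asynchronous; batching and flushing are left to the producer.
        PRODUCER.send(new ProducerRecord<>(KafkaConstants.TOPIC, message));
    }
}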


 


KafkaConstants.java



package com.liferaysavvy.kafka.accesslog.constants;

public final class KafkaConstants {

    private KafkaConstants() {}

    public static final String TOPIC = "liferay-tomcat-access-logs";

    // Kafka brokers
    public static final String BOOTSTRAP_SERVERS = "localhost:9092,localhost:9093,localhost:9094";
}


 


 KafkaConfig.java



package com.liferaysavvy.kafka.accesslog.config;

import com.liferaysavvy.kafka.accesslog.constants.KafkaConstants;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public final class KafkaConfig {

    private KafkaConfig() {}

    public static Producer<String, String> getProducer() {
        Properties properties = new Properties();
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, KafkaConstants.BOOTSTRAP_SERVERS);
        properties.put(ProducerConfig.CLIENT_ID_CONFIG, "TomcatKafkaAccessLog");
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        return new KafkaProducer<>(properties);
    }
}
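
If delivery guarantees matter for the access logs, a few standard producer settings could also be set inside getProducer() before the KafkaProducer is created; the values below are illustrative defaults, not taken from the original project:

// Optional reliability and batching settings (illustrative values, tune for your cluster):
properties.put(ProducerConfig.ACKS_CONFIG, "all");   // wait for all in-sync replicas to acknowledge
properties.put(ProducerConfig.RETRIES_CONFIG, 3);    // retry transient broker errors
properties.put(ProducerConfig.LINGER_MS_CONFIG, 5);  // wait a few ms so records can be batched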


 


Deploy Custom Tomcat Access Logs Valve


Get the source code from the link below and build the Maven project. The build generates a JAR artifact; copy the generated JAR file to the tomcat/lib directory.


https://github.com/LiferaySavvy/tomcat-accesslog-kafka-producer


 

mvn clean install

 



Deploy the JAR file to every Tomcat node in the cluster.

 

Liferay-Node1




 

Liferay-Node2




 


Configure Custom Access Logs Valve in “server.xml”


Navigate to the Tomcat conf directory and update the server.xml file with the custom valve configuration, adding the valve inside the <Host> element (where the default AccessLogValve is configured). Repeat the same for every node in the Liferay cluster.


 

<Valve className="com.liferaysavvy.kafka.accesslog.KafkaAccessLogValve" pattern="%h %l %u %t &quot;%r&quot; %s %b" />


 



 

Validate Implementation


 

Start Liferay Cluster


 

Start a Kafka consumer on the “liferay-tomcat-access-logs” topic


Open a command prompt, navigate to the Kafka bin\windows directory, and start a console consumer with the following command.


 

kafka-console-consumer.bat --bootstrap-server localhost:9092,localhost:9093,localhost:9094 --topic liferay-tomcat-access-logs --from-beginning

 

 





We can now see the Liferay Tomcat access logs in the Kafka consumer.





Use Kibana to monitor and analyze the access logs and to build dashboards. The Kafka topic needs to be configured as a Logstash input so that the logs become available to Kibana.


Follow the article below to use Kibana for log monitoring.

 

http://www.liferaysavvy.com/2021/07/liferay-portal-logs-monitoring-with-elkk.html

 



Sunday, March 22, 2020

Google Custom Search Liferay 7/DXP Module


Google provides the Custom Search API, which we can leverage to provide site search capabilities on websites.

I was playing around with the Google Custom Search API and created a sample Liferay module. I am using the Custom Search API capability on liferaysavvy.com.

Google already provides many search elements in the form of HTML elements and JavaScript. We can simply use these search elements to provide website search.

The following are the basic steps to create a Google custom search:

  1. Create Google Custom Search Engine Context
  2. Use Google Custom Search API

Create Google Custom Search Engine Context

You must have a Google account; access the link below to create a search context for your website.



Click the Add button, provide the site URL and site name, then click the Create button. A new site context will be created.



Now you can see the following screen, and from there you can get code snippets or go to the control panel.


Click on the control panel to find the search context details. The Search engine ID is an important value which we will use in the API.



Now click on the “Look and Feel” tab, select the “Full Width” layout, and save. Google Custom Search offers different layouts for displaying search results.




We have created the search engine context for the site, and we will use it with the Custom Search API.

Use Google Custom Search API

Google has a developer API and code snippets that can be used to develop custom search for websites.

The Google developer API page has more details.


I have created the following JavaScript, which renders the results into an HTML div.

Example of Standalone HTML


<!DOCTYPE html>
<html>
<head>
<script async src="https://cse.google.com/cse.js?cx=016080267653024587377:dlk531h1uoe"></script>
</head>
<body>
<button type="button" onclick="searchSite();">Search</button>
<div id="google-search-results"></div>
<script>
function searchSite() {
    // Clear previous results, render the results-only element, and run the query.
    document.getElementById("google-search-results").innerHTML = "";
    google.search.cse.element.render({
        div: "google-search-results",
        tag: 'searchresults-only',
        gname: 'google-results-gname'
    });
    var element = google.search.cse.element.getElement('google-results-gname');
    var query = "liferay";
    element.execute(query);
}
</script>
</body>
</html>


Make sure the cx value in the script is the “Search engine ID” of the context we created earlier.

Liferay Google Custom Search Widget

We can use the same Custom Search API in Liferay widgets. The following are the steps to implement Google custom search in a Liferay OSGi module:
  1. Create Liferay MVC OSGi Module
  2. Use Custom Search API in JSPs
  3. Build and Deploy Module
  4. Add Widget to page and validate changes

Module GitHub Source

Get project source code from following location.

Gradle


Maven


Create Liferay MVC OSGi Module

There are different ways to create Liferay OSGi modules; here we need an mvc-portlet module. While creating the Liferay OSGi module, select the mvc-portlet project template; it will create a basic MVC portlet module, as sketched below.
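
For reference, the portlet class generated by the mvc-portlet template looks roughly like the sketch below; the package, class name, and display name are illustrative, and your generated module will use the names you chose:

package com.liferaysavvy.search.portlet;

import com.liferay.portal.kernel.portlet.bridges.mvc.MVCPortlet;

import javax.portlet.Portlet;

import org.osgi.service.component.annotations.Component;

@Component(
    immediate = true,
    property = {
        // Category under which the widget appears when adding it to a page
        "com.liferay.portlet.display-category=category.sample",
        "com.liferay.portlet.instanceable=true",
        "javax.portlet.display-name=Google Custom Search",
        "javax.portlet.init-param.template-path=/",
        // JSP rendered by default (the view.jsp shown in the next section)
        "javax.portlet.init-param.view-template=/view.jsp",
        "javax.portlet.resource-bundle=content.Language"
    },
    service = Portlet.class
)
public class GoogleCustomSearchPortlet extends MVCPortlet {
}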


Use Custom Search API in JSPs

Use the following code snippet in the JSP page and change the Search engine ID in the script tag to your site context ID.


<%@ include file="/init.jsp" %>
<div class="input-group search-bar-scope">
       <div class="input-group-item search-bar-keywords-input-wrapper">
          <input class="form-control input-group-inset input-group-inset-after search-bar-keywords-input" id="search-input"  placeholder="search" title="search" type="text" value="LiferaySavvy" />
             <div class="input-group-inset-item input-group-inset-item-after search-bar-search-button-wrapper">
                    <clay:button ariaLabel="Search" elementClasses="search-bar-search-button" icon="search" style="unstyled" id="lsSearch" type="submit"/>
             </div>
       </div>
</div>
<div id="google-search-results"></div>
<portlet:renderURL var="homeURL">
</portlet:renderURL>
<br/>
<br/>
<clay:link
       href="<%= homeURL %>"
       label="Home"
/>
<script async src="https://cse.google.com/cse.js?cx=016080267653024587377:dlk531h1uoe"></script>
<script>
document.getElementById("lsSearch").addEventListener("click", function(event){
  event.preventDefault();
  searchSite();
});

function searchSite() {

     document.getElementById("google-search-results").innerHTML = "";
    
          google.search.cse.element.render({
         div: "google-search-results",
         tag: 'searchresults-only',
         gname: 'google-results-gname'
     });
          var element = google.search.cse.element.getElement('google-results-gname');
          var query = document.getElementById("search-input").value;
     element.execute(query);          
   }
</script>


Build and Deploy Module

Navigate to the project directory and use the required commands to build and deploy the module.

Module Build

Maven Build


mvn clean install

OR

mvnw clean install



Gradle Build


gradlew build

OR

gradle build


Module Deployment

If the bundle support plugin is already added to the project build file, the module can be deployed to the Liferay module framework with the following commands.

Gradle deploy

Add the following bundle support plugin configuration to the module’s “build.gradle” file:


liferay {
    liferayHome = "C:/Liferay/Liferay7.2/liferay-workspace/bundles"
    deployDir = file("${liferayHome}/osgi/modules")
}


Deploy module command


gradlew deploy

OR

gradle deploy


Maven Deployment

Add the bundle support plugin to the module’s “pom.xml” file and update the Liferay home path.


<plugin>
    <groupId>com.liferay</groupId>
    <artifactId>com.liferay.portal.tools.bundle.support</artifactId>
    <version>3.5.4</version>
    <executions>
        <execution>
            <id>deploy</id>
            <goals>
                <goal>deploy</goal>
            </goals>
            <phase>pre-integration-test</phase>
        </execution>
    </executions>
    <configuration>
        <liferayHome>C:/Liferay/Liferay7.2/liferay-workspace/bundles</liferayHome>
    </configuration>
</plugin>


Run one of the following Maven goals to deploy the module:


mvn bundle-support:deploy

OR

mvnw bundle-support:deploy


Make sure the module JAR is in the osgi/modules directory and is in the Active state in the Liferay module framework. Use Gogo shell commands to check the module status.

Module status


Module is available in “osgi/modules”



Add Widget to page and validate changes

The widget will be available in the Sample category; add it to a page and validate the changes.
Home Screen


Custom Search Screen


Google Default Search Screen


