Sunday, July 11, 2021

Centralized Logging for Liferay Portal

Kafka is a distributed streaming platform based on the publish-subscribe model. We can use Kafka as a centralized logging system for applications.


This article demonstrates the implementation of a Kafka centralized logging system for Liferay Portal. In Liferay cluster environments, this lets us view the logs of all Liferay nodes in one place.

 

Prerequisites


Kafka Cluster with 3 Nodes

Liferay Cluster with 2 Nodes

 

Software and Tools



 

Windows 10

Kafka 2.8

Java 1.8 or higher

Zookeeper 3.5.9 (Embedded in Kafka)

Liferay 7.4 CE

 

 

 

Steps to implement Centralized Logging for Liferay


Setup a Kafka Cluster with 3 Brokers

Setup a Liferay Cluster with 2 Nodes

Configure Kafka Appender

Add required JAR’s

Add Liferay Node system property

Start Kafka Cluster

Create Kafka Topic

Start Kafka consumer

Start Liferay Cluster

View Liferay Cluster logs in Kafka Consumer

 

Architecture Diagram




 

Setup a Kafka Cluster with 3 Brokers


This demonstration requires a Kafka cluster. Follow the article below to set up a Kafka cluster on Windows.


http://www.liferaysavvy.com/2021/07/setup-kafka-cluster.html


 

Setup a Liferay Cluster with 2 Nodes


We are setting up a centralized logging system for Liferay Portal, so we need a Liferay cluster up and running.


Follow the article below to set up a Liferay cluster.


http://www.liferaysavvy.com/2021/07/liferay-portal-apache-webserver.html

 


Configure Kafka Appender



The Kafka appender needs to be configured in the portal Log4j configuration. Any customization to the Liferay portal Log4j configuration should be made in “portal-log4j-ext.xml”.


Create a META-INF directory in each Liferay Portal instance's Tomcat lib directory.




 

Create a “portal-log4j-ext.xml” file in the META-INF directory.





Add the Kafka appender configuration in addition to the existing Liferay Portal Log4j configuration.


The following is the Kafka appender configuration:

 


<Kafka name="Kafka" topic="liferay-kafka-logs">
    <PatternLayout pattern="${sys:liferay.node} %d{yyyy-MM-dd HH:mm:ss.SSS} %-5p [%t][%c{1}:%L] %m%n"/>
    <Property name="bootstrap.servers">localhost:9092,localhost:9093,localhost:9094</Property>
</Kafka>

<Async name="KafkaAsync">
    <AppenderRef ref="Kafka"/>
</Async>

<Loggers>
    <Root level="INFO">
        <AppenderRef ref="KafkaAsync"/>
    </Root>
</Loggers>


 

“bootstrap.servers” lists the Kafka cluster hosts and their ports, separated by commas.


The topic attribute is the name of the Kafka topic to which all Liferay logs will be sent.


All Liferay Portal cluster logs flow to Kafka on the same topic, so the logs must be differentiated by Liferay node name. The “${sys:liferay.node}” placeholder resolves to the Liferay node name, which is configured as a system property; we include this node system property as part of the log pattern.
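To illustrate, here is a minimal Python sketch of a consumer-side parser for records produced with this pattern. The field names and the sample line are hypothetical, chosen only for illustration; they are not part of Liferay or Kafka.

```python
import re

# Matches one record produced by the PatternLayout:
# "${sys:liferay.node} %d{yyyy-MM-dd HH:mm:ss.SSS} %-5p [%t][%c{1}:%L] %m%n"
LOG_LINE = re.compile(
    r"^(?P<node>\S+) "                                              # ${sys:liferay.node}
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}) "   # %d{yyyy-MM-dd HH:mm:ss.SSS}
    r"(?P<level>\S+)\s+"                                            # %-5p (right-padded)
    r"\[(?P<thread>[^\]]*)\]"                                       # [%t]
    r"\[(?P<location>[^\]]*)\] "                                    # [%c{1}:%L]
    r"(?P<message>.*)$"                                             # %m
)

def parse_log_line(line):
    """Return a dict of fields from one Kafka log record, or None if unmatched."""
    match = LOG_LINE.match(line)
    return match.groupdict() if match else None
```

The node name is the first token of every record, which is what makes the per-node separation work downstream.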

 

Use the configuration below in “portal-log4j-ext.xml”, which includes both the Liferay Log4j and the Kafka appender configuration.

 


 

<?xml version="1.0"?>

<Configuration strict="true">
    <Appenders>
        <Appender name="CONSOLE" type="Console">
            <Layout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p [%t][%c{1}:%L] %m%n" type="PatternLayout" />
        </Appender>

        <Appender filePattern="@liferay.home@/logs/liferay.%d{yyyy-MM-dd}.log" ignoreExceptions="false" name="TEXT_FILE" type="RollingFile">
            <Layout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p [%t][%c{1}:%L] %m%n" type="PatternLayout" />
            <TimeBasedTriggeringPolicy />
            <DirectWriteRolloverStrategy />
        </Appender>

        <Appender filePattern="@liferay.home@/logs/liferay.%d{yyyy-MM-dd}.xml" ignoreExceptions="false" name="XML_FILE" type="RollingFile">
            <Log4j1XmlLayout locationInfo="true" />
            <TimeBasedTriggeringPolicy />
            <DirectWriteRolloverStrategy />
        </Appender>

        <Kafka name="Kafka" topic="liferay-kafka-logs">
            <PatternLayout pattern="${sys:liferay.node} %d{yyyy-MM-dd HH:mm:ss.SSS} %-5p [%t][%c{1}:%L] %m%n"/>
            <Property name="bootstrap.servers">localhost:9092,localhost:9093,localhost:9094</Property>
        </Kafka>

        <Async name="KafkaAsync">
            <AppenderRef ref="Kafka"/>
        </Async>

        <Console name="stdout" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} %-5p [%-7t] %F:%L - %m%n"/>
        </Console>
    </Appenders>

    <Loggers>
        <Root level="INFO">
            <AppenderRef ref="KafkaAsync"/>
            <AppenderRef ref="CONSOLE" />
            <AppenderRef ref="TEXT_FILE" />
            <AppenderRef ref="XML_FILE" />
        </Root>
    </Loggers>
</Configuration>

 

 


The original configuration is available in “portal-impl.jar” (tomcat/webapps/ROOT/WEB-INF/lib) under “META-INF/portal-log4j.xml”. We took the required configuration and updated it in “portal-log4j-ext.xml”.


https://github.com/liferay/liferay-portal/blob/7.4.x/portal-impl/src/META-INF/portal-log4j.xml

 


Add required JARs



The implementation requires a few JAR files, which should be added to the Tomcat global classpath.


The following JARs are required:


 

kafka-clients-2.8.0.jar

slf4j-api.jar

kafka-log4j-appender-2.8.0.jar

 

 

kafka-clients-2.8.0.jar contains the Kafka client APIs, such as producers and consumers, used to interact with the Kafka brokers.


kafka-clients-2.8.0.jar is available in the Kafka server's lib directory.


 



Copy this JAR from the Kafka server lib directory to the Tomcat lib directory.




 

kafka-log4j-appender-2.8.0.jar contains the Log4j appender implementation, which internally uses “kafka-clients”. It uses a Kafka producer to send logs to the Kafka topic.

 

Download “kafka-log4j-appender” from the Maven Central repository and add it to the Tomcat lib directory.


https://repo1.maven.org/maven2/org/apache/kafka/kafka-log4j-appender/2.8.0/kafka-log4j-appender-2.8.0.jar





 

 

slf4j-api.jar is required by the Kafka appender, so it should be available in the portal Tomcat lib directory.


slf4j-api.jar is available in the Liferay Portal ROOT/WEB-INF/lib directory. Copy or move the slf4j-api.jar file to the Tomcat global lib directory.




 

Note:


All of the above configuration should be applied to each Liferay node in the cluster.

 


Add Liferay Node system property


 

The Liferay node system property needs to be added in each Liferay Tomcat setenv.bat file.



Liferay Node1


Go to the Liferay Node1 Tomcat bin directory, open the setenv.bat file in an editor, and add the new Liferay node system property to the existing list:


 

-Dliferay.node=Liferay-Node1

 

 

 

set "CATALINA_OPTS=%CATALINA_OPTS% -Dfile.encoding=UTF-8 -Djava.locale.providers=JRE,COMPAT,CLDR -Djava.net.preferIPv4Stack=true -Duser.timezone=GMT -Xms2560m -Xmx2560m -XX:MaxNewSize=1536m -XX:MaxMetaspaceSize=768m -XX:MetaspaceSize=768m -XX:NewSize=1536m -XX:SurvivorRatio=7 -Dliferay.node=Liferay-Node1"

 

 






Liferay Node2


Go to the Liferay Node2 Tomcat bin directory, open the setenv.bat file in an editor, and add the new Liferay node system property to the existing list:

 

 

-Dliferay.node=Liferay-Node2

 

 

 

set "CATALINA_OPTS=%CATALINA_OPTS% -Dfile.encoding=UTF-8 -Djava.locale.providers=JRE,COMPAT,CLDR -Djava.net.preferIPv4Stack=true -Duser.timezone=GMT -Xms2560m -Xmx2560m -XX:MaxNewSize=1536m -XX:MaxMetaspaceSize=768m -XX:MetaspaceSize=768m -XX:NewSize=1536m -XX:SurvivorRatio=7 -Dliferay.node=Liferay-Node2"

 

 



 

Start Kafka Cluster


We already set up the Kafka cluster; follow the same article to start it.


http://www.liferaysavvy.com/2021/07/setup-kafka-cluster.html

 


Create Kafka Topic



Open a command prompt and go to the bin\windows directory of one of the Kafka brokers. Use the following command to create the topic.


Topic: liferay-kafka-logs


This is the same topic we configured in the Kafka appender Log4j configuration.


 

kafka-topics.bat --create --zookeeper localhost:2181,localhost:2182,localhost:2183 --replication-factor 3 --partitions 3 --topic liferay-kafka-logs

 

 

All ZooKeeper cluster nodes should be passed in the --zookeeper option.


Make sure the topic was created by using the list command.


 

kafka-topics.bat --zookeeper localhost:2181,localhost:2182,localhost:2183 --list
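As a side note, kafka-topics in Kafka 2.2 and later (including 2.8) can also connect to the brokers directly instead of ZooKeeper. An equivalent list command, assuming the broker ports used above, would be:

```shell
kafka-topics.bat --bootstrap-server localhost:9092,localhost:9093,localhost:9094 --list
```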

 

 

 



 

Start Kafka consumer



Open a command prompt and go to the Kafka bin\windows directory. Use the following command to start a consumer for the “liferay-kafka-logs” topic.

 

 

kafka-console-consumer.bat --bootstrap-server localhost:9092,localhost:9093,localhost:9094 --topic liferay-kafka-logs --from-beginning

 

 





Start Liferay Cluster



Now start each Liferay node in the cluster. We already set up the Liferay cluster; follow the same article to start the Liferay nodes.


http://www.liferaysavvy.com/2021/07/liferay-portal-apache-webserver.html

 


View Liferay Cluster logs in Kafka Consumer


Observe the Kafka consumer console window, which shows logs flowing from both Liferay nodes.


These logs are differentiated by the Liferay node names in the cluster, which we configured as a system property and used as part of the Kafka appender configuration.
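As a sketch, consumer output could be tallied per node with a few lines of Python, relying only on the node name being the first token of each record. The helper name and sample lines are illustrative assumptions, not part of any Kafka tool.

```python
from collections import Counter

def count_by_node(lines):
    """Count log records per Liferay node name (the first token of each line)."""
    counts = Counter()
    for line in lines:
        node, _, _ = line.partition(" ")
        if node:  # skip empty lines
            counts[node] += 1
    return dict(counts)
```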

 






This confirms that we have successfully implemented a centralized logging system with the Kafka Log4j appender.



Advantages



This implementation avoids storing logs on each server, so log management becomes much easier with all logs available in a central location.


This also resolves storage issues on the servers.


A Splunk or Kibana web UI can be connected to Kafka to monitor applications more efficiently.



Notes


For demonstration purposes, we implemented all the clusters on a single machine. A real production setup has multiple servers for the Kafka and Liferay clusters.


Because everything runs on a single machine, we had to change port numbers for the Liferay and Kafka clusters; in a production environment that is not necessary.


For demonstration purposes, we used the Kafka console consumer to show the logs, but in a real-world environment we would use web UI tools such as Splunk or Kibana to monitor them.


We have not changed the existing Liferay portal appenders, so logs are still written to files in each node's logs directory. To avoid storing logs twice, the other appenders can be removed so that logs are no longer written to the local logs directory.
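For example, assuming logs should go only to Kafka, the Loggers section of “portal-log4j-ext.xml” could reference just the async Kafka appender and drop the console and file appender references:

```xml
<Loggers>
    <Root level="INFO">
        <AppenderRef ref="KafkaAsync"/>
    </Root>
</Loggers>
```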


References


https://logging.apache.org/log4j/2.x/manual/appenders.html

 

 

