While developing an application, we always incorporate a feature in our code that can tell us what went wrong if the application fails to run normally. This feature is nothing but logging. The more effort you put into logging, the less effort you spend fixing application issues. Typically, we keep all logging information in a text file, called a log file. This file captures details such as server startup, user activity, class and method names, timestamps, exceptions with stack traces, etc. Moreover, it is up to us what information we keep in the log file.
Sometimes these log files grow large, and finding the exact issue manually becomes tedious. Here the ELK Stack helps us analyze our log files at runtime. Hence, we will talk about "How to monitor Spring Boot Microservices using ELK Stack?".
The term "ELK Stack" is becoming more popular day by day. ELK is an acronym for a combination of three tools: Elasticsearch, Logstash, and Kibana. Generally, we use all of them together to monitor our application. However, each of them has a different purpose, which we will discuss in the sections below. The ELK Stack and Splunk are among the world's most popular log management platforms. Here, we will discuss the ELK Stack. Let's start discussing our topic "How to monitor Spring Boot Microservices using ELK Stack?" and its related concepts.
Why is Monitoring of an Application Becoming More Important?
No organization can afford a single second of downtime or slow performance of its applications. Moreover, performance issues can harm a brand name and, in some cases, translate into revenue loss. Hence, in order to ensure apps are accessible 24/7, efficient, and secure at all times, developers utilize the different types of data produced by their applications and the infrastructure supporting them. This data, generally in the form of logs, becomes important for monitoring these applications and for identifying and resolving any issues that occur. Organized logging plays an important role in fixing production issues.
Before going through the topic "How to monitor Spring Boot Microservices using ELK Stack?", let's understand the basic details of the ELK Stack.
What is ELK Stack?
ELK Stack is a log management platform. The word "ELK" is an acronym for three open source projects: Elasticsearch, Logstash, and Kibana, all developed, managed, and maintained by Elastic. Elasticsearch is a search and analytics engine based on the Apache Lucene search engine. Logstash is a server-side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a "stash" like Elasticsearch. Kibana is a visualization layer that works on top of Elasticsearch, providing users with the ability to analyze and visualize the data.
Why ELK Stack?
In today's competitive world, application architecture has shifted toward microservices, containers, and orchestration infrastructure deployed in the cloud, across clouds, or in hybrid environments. Moreover, the sheer volume of data produced by these environments is constantly increasing, so manual analysis of data, such as log analysis, is becoming a challenge in itself. This is where centralized log management and analytics solutions such as the ELK Stack come into the picture. They give developers the visibility they need to ensure apps are available and responsive at all times.
What is ELK Stack used for?
The most common uses of ELK (the three components together) are monitoring, troubleshooting, and securing IT environments, though there are many more use cases such as business intelligence and web analytics. Logstash takes care of data collection and processing, Elasticsearch indexes and stores the data, and Kibana provides a user interface for querying and visualizing it.
How to download and install ELK Stack (Elasticsearch, Logstash and Kibana)?
In order to use the ELK Stack, we have to download all three pieces of software, i.e. Elasticsearch, Logstash, and Kibana. Below are the steps to download and install them on your system.
1) Elasticsearch
1) Go to https://www.elastic.co/downloads/elasticsearch
2) Select the download link for your OS
3) Extract the ZIP file to a location on your system
4) To start it, go to the bin folder and run the below command. It will start on port 9200
> elasticsearch.bat
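Once started, you can verify it by opening http://localhost:9200 in a browser; Elasticsearch responds with a small JSON document showing the node name, cluster name, and version.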
2) Kibana
1) Go to https://www.elastic.co/downloads/kibana
2) Select the download link for your OS
3) Extract the ZIP file to a location on your system
4) Link Kibana with Elasticsearch: open the config/kibana.yml file and uncomment the below line
elasticsearch.hosts: ["http://localhost:9200"]
5) To start it, go to the bin folder and run the below command. It will start on port 5601
> kibana.bat
3) Logstash
1) Go to https://www.elastic.co/downloads/logstash
2) Select the download link for your OS
3) Extract the ZIP file to a location on your system
4) Go to the bin folder and create a file "logstash.conf" with some configuration. Some examples of this file are given at the below link.
https://www.elastic.co/guide/en/logstash/current/config-examples.html
5) To start it, go to the bin folder and run the below command
> logstash -f logstash.conf
How to monitor Spring Boot Microservices using ELK Stack?
Now, it's time to create a Spring Boot application and integrate it with the ELK Stack. However, it doesn't matter whether you are working on a microservices-based application or a simple Spring Boot application; our focus here is to create log files whose content will be captured by Logstash. We could even create a simple Java application that writes a log file. Either way, the integration process is generally the same. Let's create a Spring Boot application and integrate it with the ELK Stack step by step.
Step#1: Create a new Spring Boot Starter Project using STS
Let's create a Spring Boot Starter project using STS. While creating the Starter Project, select "Spring Web" and "Spring Boot DevTools" as the starter project dependencies (the equivalent pom.xml entries are shown below). If you don't know how to create a Spring Boot Starter Project, kindly visit our Internal Link.
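For reference, here is a sketch of what those two starters look like in pom.xml; the exact versions are inherited from the Spring Boot parent, so none are pinned here:

<dependencies>
    <!-- Spring Web: Spring MVC plus an embedded Tomcat and SLF4J/Logback logging -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <!-- DevTools: automatic restart on code changes during development -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-devtools</artifactId>
        <scope>runtime</scope>
        <optional>true</optional>
    </dependency>
</dependencies>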
Step#2: Create a RestController
Create a RestController named InvoiceController and write some methods that generate an ample amount of log messages in the log file, as below.
import java.io.PrintWriter;
import java.io.StringWriter;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/invoice")
public class InvoiceController {

   private static final Logger logger = LoggerFactory.getLogger(InvoiceController.class);

   @GetMapping("/get")
   public String getInvoice() {
      logger.info("Entering into method getInvoice()");
      try {
         logger.info("finding Invoices");
         throw new RuntimeException("Invoice not available");
      } catch (Exception e) {
         logger.error("Unable to find invoice: " + e.getMessage());
         // Capture the full stack trace as a String so it lands in the log file
         StringWriter sw = new StringWriter();
         PrintWriter pw = new PrintWriter(sw);
         e.printStackTrace(pw);
         logger.error("Exception is: " + sw.toString());
      }
      return "INVOICE";
   }
}
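Once the application is up on the default port 8080, each request to http://localhost:8080/invoice/get will write the INFO and ERROR messages above, including the captured stack trace, to the log file configured in the next step.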
Step#3: Update application.properties
Update application.properties and provide the location of the log file, as below.
logging.file.name=D:/ELK_Stack/elktest.log
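Note that with logging.file.name set, Spring Boot writes to this file using its default log pattern, in which every line begins with a timestamp such as 2021-07-20 10:15:30.123. The Logstash configuration in the next step relies on that leading timestamp to group lines into events, so if you customize logging.pattern.file, keep the date at the start of the line.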
Step#4: Create logstash.conf file
In this step, we will create a new logstash.conf file in the bin folder of your Logstash installation. For example, in our case, the location is "D:\ELK_Stack\logstash-7.13.3\bin". We have created a sample file for Java logs, as below.
It generally contains three parts: input, filter, and output.
1) input: indicates where to read from
2) filter: indicates how to filter and process the events
3) output: indicates where to send the output
input {
  file {
    type => "java"
    path => "D:/ELK_Stack/elktest.log"
    codec => multiline {
      pattern => "^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}.*"
      negate => "true"
      what => "previous"
    }
  }
}

filter {
  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
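A quick note on what this file does: the multiline codec treats any line that does not begin with a timestamp as a continuation of the previous event, so a stack trace stays attached to the log statement that produced it. The grok filter then tags events containing a line starting with "\tat" (a Java stack frame) with "stacktrace", making them easy to filter later in Kibana. Finally, events are sent both to the console (the rubydebug codec, useful for debugging) and to Elasticsearch.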
Step#5: Run your application & ELK Stack
1) Run your Spring Boot Application
2) Run Elasticsearch: go to its bin folder and use the below command
> elasticsearch.bat
3) Run Kibana: go to its bin folder and use the below command
> kibana.bat
4) Run Logstash: go to its bin folder and use the below command
> logstash -f logstash.conf
Once you start Logstash, it will begin parsing the log file and print traces like the ones shown below.
How to test in Kibana Dashboard?
1) Go to the Kibana UI: open a browser and hit http://localhost:5601
2) Click on Dashboard, and then click on "Create new Dashboard". Please refer to the screenshots attached below.
3) Click on "Create Index Pattern" to provide a search index pattern
4) Enter a pattern in the Index pattern name field, such as "logstash-*", and click the "Next step" button
5) In the Time field, select "@timestamp" and then click on "Create index pattern"
6) Now click on the left bar and select "Discover"; you will see the data populated in the dashboard.
The below screenshot shows where to find "Dashboard" and "Discover".
Once you click on "Discover", the below results will appear in the Kibana UI.
We can also set the time duration as per our requirement, as below.
How to search data in Kibana Dashboard?
Let's discuss some of the queries that we need while searching for results in the Kibana UI. As aforementioned, Elasticsearch is a search and analytics engine based on the Apache Lucene search engine. It is completely open source and built with Java. In fact, Elasticsearch is classified as a NoSQL database, which means it stores data in an unstructured way, so you cannot query the data using SQL. (The newer Elasticsearch SQL project does allow using SQL statements to interact with the data.) Being familiar with the query syntax and its variety of operators will help you query in the Kibana UI.
We have two different ways of querying data in Kibana: the traditional Lucene query syntax or the more recent KQL (Kibana Query Language). If you are using Kibana 7.0 or later, the Kibana Query Language is the default. We will discuss the basics of both approaches, including examples. One language may suit your requirement better than the other; it depends on the nature of the search and your individual experience. However, KQL has some limitations, such as not supporting fuzzy or regex searches, and we can expect the Elastic team to concentrate on expanding KQL in future releases.
Search By Field (Lucene)
Querying with field names is the most popular way of filtering data in Elasticsearch. If you are looking for a specific field that contains a specific term, you can do it like below:
name:"Specific term"
Example: message: ERROR
The query above indicates that you are searching for the term "ERROR" in the message field. It will return the results that have ERROR in the message field.
Free Text (Lucene)
The simplest form of querying data, just like a Google search.
Invoice → returns results that include "Invoice" in any field
"Invoice not Found" → returns results that include "Invoice not Found" in any field
Boolean Operators (Lucene): AND, OR, NOT
Like other query languages, Elasticsearch supports the AND, OR, and NOT operators, and their meaning is the same as in any programming language. Note that these operators must be capitalized.
- Invoice AND Found → returns results that contain both the terms Invoice and Found
- Error NOT Warning → returns results that contain Error but not Warning
- Exception OR Error → returns results that contain Exception or Error, or both
Ranges (Lucene): [ ], { }, :>, :>=, :<, :<=
Lucene supports multiple types of range searches:
- price:[2 TO 24] → returns results with a price from 2 through 24, including 2 and 24
- price:{2 TO 12} → returns results with a price between 2 and 12, excluding 2 and 12
- price:>2 → returns results with any price greater than 2
Wildcards (Lucene): *, ?
- pr* → returns results that include values starting with "pr", such as price and protocol
- pr*e → returns results that include values that start with "pr" and end in "e", such as price and prime
- stat?s → returns results that include values that start with "stat", end in "s", and have exactly one character in between, such as status
Regex (Lucene): / [ ] /, / < > /
- /pri[mc]e/ → returns results that include either prime or price
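Incidentally, these Lucene expressions are not limited to the Kibana search bar: Elasticsearch exposes the same syntax through the q parameter of its _search endpoint. Below is a minimal sketch, assuming Elasticsearch is running on localhost:9200 with the logstash-* indices created earlier, that runs such a query from plain Java (11 or later) with no extra dependencies. The class name LogSearch is just illustrative.

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class LogSearch {
   public static void main(String[] args) throws Exception {
      // Any Lucene query string from this section works here,
      // e.g. "message:ERROR AND tags:stacktrace"
      String query = URLEncoder.encode("message:ERROR", StandardCharsets.UTF_8);

      // Hit the _search endpoint of the logstash-* indices with the q parameter
      HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:9200/logstash-*/_search?q=" + query))
            .GET()
            .build();

      HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());

      // Raw JSON response; the matching log events are under hits.hits
      System.out.println(response.body());
   }
}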
Kibana Query Language (KQL) was first introduced in version 6.3 and became available as the default starting with version 7.0. This newer language was built to provide scripted field support and to simplify the syntax compared to the Lucene syntax discussed above.
Boolean Operators (KQL): AND, OR, AND NOT
Unlike the Lucene syntax, KQL Boolean operators are not case-sensitive: we can use "and", "or", and "and not" in place of "AND", "OR", and "AND NOT" respectively. Also, "NOT" is replaced by "AND NOT".
Exception AND NOT Error → returns results that include Exception, but not those that include both Exception and Error
Exception and not Error → returns the same results, since the operators are case-insensitive
By default, "and" has higher precedence than "or". Parentheses can be used to override this default.
Exception and (Error or Warning) → returns results that include Exception and either Error or Warning
If we use "not" before a search term, it negates its meaning.
not status:"on Hold" → returns results that do not have "on Hold" listed as their status
We can also negate entire groups by using parentheses.
not (name:Michael or location:"Washington DC") → returns results that do not have Michael as the name or Washington DC as the location
Search By Field (KQL)
message: Error → returns results that have Error in the message field
message: "Invoice Unavailable" → returns results that have "Invoice Unavailable" in the message field. Here, the value is in quotes so that the search matches the words Invoice and Unavailable in that order. Without the quotes, the results would also include Unavailable Invoice.
Searching a single field for multiple values is also possible in KQL, as below.
message: ("Found" or "Not Found") → returns results that have either Found or Not Found listed as the message
location: (Chicago and "New York" and London) → returns results that have all three of Chicago, New York, and London listed as locations
Free Text (KQL)
The simplest form of querying data, just like a Google search and the Lucene syntax.
Invoice → returns results that include "Invoice" in any field
"Invoice not Found" → returns results that include "Invoice not Found" in any field